WorldWideScience

Sample records for project simulation toolbox

  1. Object Oriented Toolbox for Modelling and Simulation of Dynamical Systems

    DEFF Research Database (Denmark)

    Poulsen, Mikael Zebbelin; Wagner, Falko Jens; Thomsen, Per Grove

    1998-01-01

    This paper presents the results of an ongoing project, dealing with design and implementation of a simulation toolbox based on object oriented modelling techniques. The paper describes an experimental implementation of parts of such a toolbox in C++, and discusses the experiences drawn from that process. Essential to the work is the focus on simulation of complex dynamical systems, from modelling the single components/subsystems to building complete systems ...

  2. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain by incorrect focusing and excessive stimulation which might result in seizure. Therefore there is ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately Boundary element method (BEM) and Finite element method (FEM) have been used to simulate the electromagnetic phenomenon of TMS. However, there is a lack of general tools to generate the models of TMS due to some difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  3. 6 DOF Nonlinear AUV Simulation Toolbox

    Science.gov (United States)

    1997-01-01

    An advantage of the monitor is that the shared memory variables can be changed at any time, so it can be used for “hardware in the loop” simulation. An editable monitor ... for tuning the controller. “Hardware in the loop” simulation will be tested soon. In the coming year, we will improve this simulation software. We ...

  4. An Open-Source Toolbox for PEM Fuel Cell Simulation

    Directory of Open Access Journals (Sweden)

    Jean-Paul Kone

    2018-05-01

    In this paper, an open-source toolbox that can be used to accurately predict the distribution of the major physical quantities that are transported within a proton exchange membrane (PEM) fuel cell is presented. The toolbox has been developed using the Open Source Field Operation and Manipulation (OpenFOAM) platform, which is an open-source computational fluid dynamics (CFD) code. The base case results for the distribution of velocity, pressure, chemical species, Nernst potential, current density, and temperature are as expected. The plotted polarization curve was compared to the results from a numerical model and experimental data taken from the literature. The conducted simulations have generated a significant amount of data and information about the transport processes that are involved in the operation of a PEM fuel cell. The key role played by the concentration constant in shaping the cell polarization curve has been explored. The development of the present toolbox is in line with the objectives outlined in the International Energy Agency (IEA, Paris, France) Advanced Fuel Cell Annex 37, which is devoted to developing open-source computational tools to facilitate fuel cell technologies. The work therefore serves as a basis for devising additional features that are not always feasible with a commercial code.
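
    The concentration constant mentioned above enters through the mass-transport loss term of the polarization curve. As a quick illustration, here is a minimal sketch using a standard semi-empirical polarization model, not the toolbox's own equations; all parameter values are assumed:

```python
# Semi-empirical PEM polarization curve: V(j) = E_nernst - activation - ohmic - concentration.
# Illustrative only; parameter values are assumed, not taken from the paper.
import numpy as np

def cell_voltage(j, E_nernst=1.2, A=0.06, j0=1e-4, r=0.1, B=0.05, j_lim=1.4):
    """Cell voltage [V] at current density j [A/cm^2].

    A     : Tafel slope [V];  j0 : exchange current density [A/cm^2]
    r     : area-specific ohmic resistance [ohm*cm^2]
    B     : concentration-loss constant [V], the parameter whose role in
            shaping the curve the paper explores;  j_lim : limiting current
    """
    activation = A * np.log(j / j0)
    ohmic = r * j
    concentration = -B * np.log(1.0 - j / j_lim)
    return E_nernst - activation - ohmic - concentration

j = np.linspace(0.01, 1.35, 100)
curves = {B: cell_voltage(j, B=B) for B in (0.02, 0.05, 0.10)}
# Larger B makes the voltage drop near j_lim steeper, bending the tail of the curve.
```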

  5. SELANSI: a toolbox for simulation of stochastic gene regulatory networks.

    Science.gov (United States)

    Pájaro, Manuel; Otero-Muras, Irene; Vázquez, Carlos; Alonso, Antonio A

    2018-03-01

    Gene regulation is inherently stochastic. In many applications in systems and synthetic biology, such as the reverse engineering and the de novo design of genetic circuits, stochastic effects, though potentially crucial, are often neglected due to the high computational cost of stochastic simulations. With advances in these fields there is an increasing need for tools providing accurate approximations of the stochastic dynamics of gene regulatory networks (GRNs) with reduced computational effort. This work presents SELANSI (SEmi-LAgrangian SImulation of GRNs), a software toolbox for the simulation of stochastic multidimensional gene regulatory networks. SELANSI exploits intrinsic structural properties of gene regulatory networks to accurately approximate the corresponding Chemical Master Equation with a partial integro-differential equation that is solved by a semi-Lagrangian method with high efficiency. Networks under consideration might involve multiple genes with self and cross regulations, in which genes can be regulated by different transcription factors. Moreover, the validity of the method is not restricted to a particular type of kinetics. The tool offers total flexibility regarding network topology, kinetics and parameterization, as well as simulation options. SELANSI runs under the MATLAB environment, and is available under GPLv3 license at https://sites.google.com/view/selansi. antonio@iim.csic.es. © The Author(s) 2017. Published by Oxford University Press.
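
    To illustrate the semi-Lagrangian idea on which SELANSI's PIDE solver rests, here is a minimal sketch for the simplest possible case, 1D linear advection (the toolbox itself treats a far richer integro-differential problem; the grid, velocity field and time step are assumed for illustration):

```python
# Semi-Lagrangian step for u_t + a(x) u_x = 0: trace each grid point's
# characteristic back over dt and interpolate the old solution there.
import numpy as np

def semi_lagrangian_step(u, x, a, dt):
    x_dep = x - a(x) * dt            # departure points of the characteristics
    return np.interp(x_dep, x, u)    # interpolate old u at the departure points

x = np.linspace(0.0, 10.0, 401)
u = np.exp(-(x - 3.0) ** 2)          # initial profile
for _ in range(100):                 # stable even for large dt (no CFL limit)
    u = semi_lagrangian_step(u, x, lambda xx: 0.5 + 0.0 * xx, dt=0.05)
```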

  6. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents a toolbox for space environment modelling in SIMULINK that facilitates development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, Earth gravity field, Earth magnetic field and eclipse. The structure and facilities within the toolbox are described and exemplified using a student satellite case (AAUSAT-II). The validity of the developed models is confirmed by comparing the simulation results with the realistic data obtained from the Danish ...

  7. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    Science.gov (United States)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer simulation of ladar models to predict the performance of a ladar system. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, and surveys computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar simulation has so far been limited in scale and non-unified in design, mostly achieving simple function simulation based on ranging equations for ladar systems. A laser imaging radar simulation with an open and modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated inputs and outputs of the functions and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was made with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection. Its open structure enables the toolbox to be modified for specialized requirements, and the modularization gives simulations flexibility.
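
    As an indication of what the ranging-equation building block of such a toolbox computes, here is a minimal sketch of a generic monostatic ladar range equation for an extended Lambertian target (all parameter values are assumed; this is not code from the toolbox):

```python
# Received power for a monostatic ladar viewing an extended Lambertian target.
import numpy as np

def received_power(P_t, R, rho=0.3, D_rx=0.1, eta=0.5, alpha=1e-4):
    """P_t   : transmitted peak power [W]
    R     : range [m]
    rho   : target reflectivity (Lambertian)
    D_rx  : receiver aperture diameter [m]
    eta   : combined optics/detection efficiency
    alpha : atmospheric extinction coefficient [1/m]
    """
    A_rx = np.pi * (D_rx / 2.0) ** 2          # receiver aperture area
    T_atm = np.exp(-2.0 * alpha * R)          # two-way atmospheric transmission
    return P_t * rho / np.pi * (A_rx / R**2) * eta * T_atm

print(received_power(P_t=1e3, R=500.0))       # falls off as 1/R^2 for extended targets
```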

  8. OSSIM wave-optics toolbox and its use to simulate AEOS

    Science.gov (United States)

    Smith, Carey A.; Forgham, James L.; Jones, Bruce W.; Jones, Kenneth D.

    2001-12-01

    OSSim (Optical System Simulation) is a simulation toolbox of optical and processing components. By using full wave-optics in the time domain, OSSim simulates diffractive effects and control loop interactions missed by simpler analyses. OSSim also models the atmosphere, with user-customizable turbulence strength, wind, and slew. This paper first presents two introductory examples: a simple 2-lens imaging system and a simple tilt-control system. It then presents a simulation of the 3.67-meter AEOS (Advanced Electro-Optics System) telescope on Maui. The OSSim simulation agrees well with the AEOS experimental results.

  9. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    Science.gov (United States)

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management solution including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data from different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, the corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates data integration of research data as well as metadata by performing the necessary procedures automatically. Toolbox modules also allow the integration of device data. Moreover, separation of personally identifiable information and medical data, by using only pseudonyms for storing medical data, ensures compliance with data protection regulations. The pseudonymized data can then be exported in SPSS format to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and installed automatically thanks to the use of Docker technology.
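
    The pseudonymization scheme described above can be illustrated with a minimal sketch (a generic keyed-hash approach, not the Toolbox's actual implementation; identifiers and field names are hypothetical):

```python
# Medical records carry only a keyed-hash pseudonym; the secret key and the
# identity list live with a trusted party, separate from the medical database.
import hmac
import hashlib

SECRET_KEY = b"kept-by-the-trusted-third-party"   # never stored with medical data

def pseudonym(patient_id: str) -> str:
    """Deterministic pseudonym: same patient always maps to the same token."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

# hypothetical burn-registry record: no directly identifying data stored
record = {"pid": pseudonym("DE-1976-04-22-MUELLER"), "tbsa_percent": 18.5}
```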

  10. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB

    KAUST Repository

    Klingbeil, G.

    2011-02-25

    Motivation: The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. Results: The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. © The Author 2011. Published by Oxford University Press. All rights reserved.
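
    For reference, the Gillespie direct method that STOCHSIMGPU parallelizes looks as follows in a minimal single-trajectory sketch (Python stand-in for illustration; the reaction system is a toy example, not from the paper):

```python
# Gillespie direct method: sample the time to the next reaction from an
# exponential with rate a0, then pick which reaction fires with probability a_r/a0.
import numpy as np

def gillespie_direct(x0, stoich, propensities, t_end, rng=None):
    rng = rng or np.random.default_rng()
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_end:
        a = np.array([f(x) for f in propensities])   # propensities at current state
        a0 = a.sum()
        if a0 <= 0.0:
            break                                    # no reaction can fire
        t += rng.exponential(1.0 / a0)               # time to next reaction
        r = rng.choice(len(a), p=a / a0)             # which reaction fires
        x += stoich[r]
        times.append(t)
        states.append(x.copy())
    return times, states

# Toy system: reversible isomerization A <-> B with rates k1, k2
k1, k2 = 1.0, 0.5
times, states = gillespie_direct(
    x0=[100, 0],
    stoich=np.array([[-1, 1], [1, -1]]),
    propensities=[lambda x: k1 * x[0], lambda x: k2 * x[1]],
    t_end=10.0)
```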

  11. Fast modal simulation of paraxial optical systems: the MIST open source toolbox

    International Nuclear Information System (INIS)

    Vajente, Gabriele

    2013-01-01

    This paper presents a new approach to the simulation of optical laser systems in the paraxial approximation, with particular applications to interferometric gravitational wave detectors. The method presented here is based on a standard decomposition of the laser field in terms of Hermite–Gauss transverse modes. The innovative feature consists of a symbolic manipulation of the equations describing the field propagation. This approach allows a huge reduction in the computational time, especially when a large number of higher order modes is needed to properly simulate the system. The new algorithm has been implemented in an open source toolbox, called the MIST, based on the MATLAB® environment. The MIST has been developed and is being used in the framework of the design of advanced gravitational wave detectors. Examples from this field of application will be discussed to illustrate the capabilities and performance of the simulation tool. (paper)
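
    A minimal sketch of the Hermite-Gauss machinery underlying such modal simulations: evaluating an HG mode at the beam waist and projecting a field onto it to obtain a modal amplitude (the grid, waist and misalignment values are assumed for illustration; this is not MIST code):

```python
# Project a paraxial field onto Hermite-Gauss modes at the waist (z = 0).
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def hg_mode(n, x, w0):
    """1D Hermite-Gauss mode u_n(x) at the waist, orthonormal over x."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0                                   # select H_n
    norm = (2.0 / pi) ** 0.25 / sqrt(2.0**n * factorial(n) * w0)
    return norm * hermval(np.sqrt(2.0) * x / w0, coeffs) * np.exp(-(x / w0) ** 2)

x = np.linspace(-3e-3, 3e-3, 256)
X, Y = np.meshgrid(x, x)
w0 = 1e-3
field = np.exp(-((X - 2e-4) ** 2 + Y**2) / w0**2)     # slightly misaligned beam

# amplitude of mode (n, m) = overlap integral <HG_nm | field>; misalignment
# shows up as a nonzero (1, 0) amplitude
dx = x[1] - x[0]
c_10 = np.sum(np.conj(hg_mode(1, X, w0) * hg_mode(0, Y, w0)) * field) * dx * dx
```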

  12. ARTS, the Atmospheric Radiative Transfer Simulator - version 2.2, the planetary toolbox edition

    Science.gov (United States)

    Buehler, Stefan A.; Mendrok, Jana; Eriksson, Patrick; Perrin, Agnès; Larsson, Richard; Lemke, Oliver

    2018-04-01

    This article describes the latest stable release (version 2.2) of the Atmospheric Radiative Transfer Simulator (ARTS), a public domain software for radiative transfer simulations in the thermal spectral range (microwave to infrared). The main feature of this release is a planetary toolbox that allows simulations for the planets Venus, Mars, and Jupiter, in addition to Earth. This required considerable model adaptations, most notably in the area of gaseous absorption calculations. Other new features are also described, notably radio link budgets (including the effect of Faraday rotation that changes the polarization state) and the treatment of Zeeman splitting for oxygen spectral lines. The latter is relevant, for example, for the various operational microwave satellite temperature sensors of the Advanced Microwave Sounding Unit (AMSU) family.

  13. morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python

    Directory of Open Access Journals (Sweden)

    Michael James Hull

    2014-01-01

    The broad structure of a modelling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem, such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualisation and analysis of results, can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterisable components to allow both specific and stochastic parameter variations. Other features of the toolbox include: the automatic generation of human-readable documentation (e.g. PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g. MODL files) and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development.

  14. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    Science.gov (United States)

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.

  15. Toolbox for Urban Mobility Simulation: High Resolution Population Dynamics for Global Cities

    Science.gov (United States)

    Bhaduri, B. L.; Lu, W.; Liu, C.; Thakur, G.; Karthik, R.

    2015-12-01

    In this rapidly urbanizing world, the unprecedented rate of population growth is not only mirrored by increasing demand for energy, food, water, and other natural resources, but also has detrimental impacts on environmental and human security. Transportation simulations are frequently used for mobility assessment in urban planning, traffic operation, and emergency management. Previous research, ranging from purely analytical techniques to simulations capturing behavior, has investigated questions and scenarios regarding the relationships among energy, emissions, air quality, and transportation. The primary limitations of past attempts have been the availability of input data, of useful "energy and behavior focused" models, of validation data, and of adequate computational capability for understanding the interdependencies of our transportation system. With the increasing availability and quality of traditional and crowdsourced data, we have utilized the OpenStreetMap roads network and have integrated high resolution population data with traffic simulation to create a Toolbox for Urban Mobility Simulations (TUMS) at global scale. TUMS consists of three major components: data processing, traffic simulation models, and Internet-based visualizations. It integrates OpenStreetMap, LandScanTM population, and other open data (Census Transportation Planning Products, National Household Travel Survey, etc.) to generate both normal traffic operation and emergency evacuation scenarios. TUMS integrates TRANSIMS and MITSIM as traffic simulation engines, which are open-source and widely accepted for scalable traffic simulations. The consistent data and simulation platform allows quick adaptation to various geographic areas, which has been demonstrated for multiple cities across the world. We are combining the strengths of geospatial data sciences, high performance simulations, transportation planning, and emissions, vehicle and energy technology development to design and develop a simulation ...

  16. Aerospace Toolbox---a flight vehicle design, analysis, simulation, and software development environment: I. An introduction and tutorial

    Science.gov (United States)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the MathWorks Real Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment ...

  17. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB.

    Science.gov (United States)

    Klingbeil, Guido; Erban, Radek; Giles, Mike; Maini, Philip K

    2011-04-15

    The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. The software is open source under the GPL v3 and available at http://www.maths.ox.ac.uk/cmb/STOCHSIMGPU. The web site also contains supplementary information. klingbeil@maths.ox.ac.uk Supplementary data are available at Bioinformatics online.

  18. The Self-Powered Detector Simulation `MATiSSe' Toolbox applied to SPNDs for severe accident monitoring in PWRs

    Science.gov (United States)

    Barbot, Loïc; Villard, Jean-François; Fourrez, Stéphane; Pichon, Laurent; Makil, Hamid

    2018-01-01

    In the framework of the French National Research Agency program on nuclear safety and radioprotection, the `DIstributed Sensing for COrium Monitoring and Safety' project aims at developing innovative instrumentation for corium monitoring in case of a severe accident in a Pressurized Water nuclear Reactor. Among others, a new under-vessel instrumentation based on Self-Powered Neutron Detectors is being developed using a numerical simulation toolbox named `MATiSSe'. The CEA Instrumentation Sensors and Dosimetry Lab has developed MATiSSe since 2010 for Self-Powered Neutron Detector material selection and geometry design, as well as for their respective partial neutron and gamma sensitivity calculations. MATiSSe is based on a comprehensive model of the neutron and gamma interactions which take place in Self-Powered Neutron Detector components, using the MCNP6 Monte Carlo code. As a member of the project consortium, the THERMOCOAX SAS Company is currently manufacturing instrumented pole prototypes to be tested in 2017. The full severe accident monitoring equipment, including the standalone low-current acquisition system, will be tested during a joint CEA-THERMOCOAX experimental campaign under realistic irradiation conditions in the Slovenian TRIGA Mark II research reactor.

  19. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    This paper presents a toolbox for space environment modelling in SIMULINK that facilitates development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, Earth ...

  20. C++ Toolbox for Object-Oriented Modeling and Dynamic Simulation of Physical Systems

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1999-01-01

    This paper presents the efforts made in an ongoing project that exploits the advantages of using object-oriented methodologies for describing and simulating dynamical systems. The background for this work is a search for new and better ways to simulate physical systems.

  1. Cementitious Barriers Partnership (CBP): Using the CBP Software Toolbox to Simulate Sulfate Attack and Carbonation of Concrete Structures - 13481

    Energy Technology Data Exchange (ETDEWEB)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S. [Vanderbilt University, School of Engineering, CRESP, Nashville, TN 37235 (United States); Flach, G.; Langton, C.; Smith, F.G.III; Burns, H. [Savannah River National Laboratory, Aiken, SC 29808 (United States); Van der Sloot, H. [Hans Van der Sloot Consultancy, Dorpsstraat 216, 1721BV Langedijk (Netherlands); Meeussen, J.C.L. [Nuclear Research and Consultancy Group, Westerduinweg 3, Petten (Netherlands); Seignette, P.F.A.B. [Energy Research Center of The Netherlands, Petten (Netherlands); Samson, E. [SIMCO Technologies, Inc., Quebec (Canada); Mallick, P.; Suttora, L. [U.S. Department of Energy, Washington, DC (United States); Esh, D.; Fuhrmann, M.; Philip, J. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)

    2013-07-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy Office of Tank Waste Management. The CBP project has developed a set of integrated modeling tools and leaching test methods to help improve understanding and prediction of the long-term hydraulic and chemical performance of cementitious materials used in nuclear applications. State-of-the-art modeling tools, including LeachXS™/ORCHESTRA and STADIUM®, were selected for their demonstrated abilities to simulate reactive transport and degradation in cementitious materials. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF), now adopted as part of the SW-846 RCRA methods, have been used to help make the link between modeling and experiment. Although each of the CBP tools has demonstrated utility as a standalone product, coupling the models over relevant spatial and temporal solution domains can provide more accurate predictions of cementitious materials behavior over relevant periods of performance. The LeachXS™/ORCHESTRA and STADIUM® models were first linked to the GoldSim Monte Carlo simulator to characterize model uncertainties better and more easily, and as a means of coupling the models, allowing linking to broader performance assessment evaluations that use CBP results for a source term. Two important degradation scenarios were selected for initial demonstration: sulfate ingress/attack and carbonation of cementitious materials. When sufficient sulfate is present in the pore solution external to a concrete barrier, sulfate can diffuse into the concrete, react with the concrete solid phases, and cause cracking that significantly changes the transport and structural properties of the concrete. The penetration of gaseous carbon dioxide within partially saturated concrete usually initiates a series of carbonation ...
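
    The sulfate ingress scenario rests on the diffusion-reaction picture described above. A minimal sketch follows (1D explicit finite differences with a first-order sink for sulfate consumed by reaction; all material parameters are assumed, and the CBP codes themselves solve far more detailed chemistry):

```python
# 1D Fickian diffusion of sulfate into a concrete slab with a first-order sink.
import numpy as np

D = 1e-12          # effective diffusivity [m^2/s]          (assumed)
k = 1e-9           # first-order reaction rate [1/s]         (assumed)
c_surface = 10.0   # external sulfate concentration [mol/m^3] (assumed)
L, nx = 0.1, 101   # 10 cm slab, 1 mm grid
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D               # within the explicit stability limit

c = np.zeros(nx)
for _ in range(2000):              # roughly 25 years of exposure at this dt
    c[0] = c_surface               # exposed face held at external concentration
    lap = (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2
    c[1:-1] += dt * (D * lap[1:-1] - k * c[1:-1])   # interior update only
# c now holds the sulfate penetration profile into the slab
```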

  2. Cementitious Barriers Partnership (CBP): Using the CBP Software Toolbox to Simulate Sulfate Attack and Carbonation of Concrete Structures - 13481

    International Nuclear Information System (INIS)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S.; Flach, G.; Langton, C.; Smith, F.G.III; Burns, H.; Van der Sloot, H.; Meeussen, J.C.L.; Seignette, P.F.A.B.; Samson, E.; Mallick, P.; Suttora, L.; Esh, D.; Fuhrmann, M.; Philip, J.

    2013-01-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy Office of Tank Waste Management. The CBP project has developed a set of integrated modeling tools and leaching test methods to help improve understanding and prediction of the long-term hydraulic and chemical performance of cementitious materials used in nuclear applications. State-of-the-art modeling tools, including LeachXS™/ORCHESTRA and STADIUM®, were selected for their demonstrated abilities to simulate reactive transport and degradation in cementitious materials. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF), now adopted as part of the SW-846 RCRA methods, have been used to help make the link between modeling and experiment. Although each of the CBP tools has demonstrated utility as a standalone product, coupling the models over relevant spatial and temporal solution domains can provide more accurate predictions of cementitious materials behavior over relevant periods of performance. The LeachXS™/ORCHESTRA and STADIUM® models were first linked to the GoldSim Monte Carlo simulator to characterize model uncertainties better and more easily, and as a means of coupling the models, allowing linking to broader performance assessment evaluations that use CBP results for a source term. Two important degradation scenarios were selected for initial demonstration: sulfate ingress/attack and carbonation of cementitious materials. When sufficient sulfate is present in the pore solution external to a concrete barrier, sulfate can diffuse into the concrete, react with the concrete solid phases, and cause cracking that significantly changes the transport and structural properties of the concrete. The penetration of gaseous carbon dioxide within partially saturated concrete usually initiates a series of carbonation reactions with ...

  3. Orfeo Toolbox: A Free And Open Source Solution For Research And Operational Remote Sensing Projects

    Science.gov (United States)

    Savinaud, Mickael; OTB-CS Team

    2013-12-01

    The free and open source solution, Orfeo ToolBox (OTB), offers the possibility to deal with large data processing. This library, designed by CNES in the framework of the ORFEO accompaniment program to promote the use of Pleiades data and other VHR data, now offers a large number of applications designed for end users. Due to its modular design, OTB is now used in different contexts, from R&D studies to operational chains.

  4. Towards a binaural modelling toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Culling, John F.; Dau, Torsten

    2011-01-01

    The Auditory Modelling Toolbox (AMToolbox) is a new Matlab / Octave toolbox for developing and applying auditory perceptual models and in particular binaural models. The philosophy behind the project is that the models should be implemented in a consistent manner, well documented and user ...

  5. GPELab, a Matlab toolbox to solve Gross-Pitaevskii equations II: Dynamics and stochastic simulations

    Science.gov (United States)

    Antoine, Xavier; Duboscq, Romain

    2015-08-01

    GPELab is a free Matlab toolbox for modeling and numerically solving large classes of systems of Gross-Pitaevskii equations that arise in the physics of Bose-Einstein condensates. The aim of this second paper, which follows (Antoine and Duboscq, 2014), is to first present the various pseudospectral schemes available in GPELab for computing the deterministic and stochastic nonlinear dynamics of Gross-Pitaevskii equations (Antoine, et al., 2013). Next, the corresponding GPELab functions are explained in detail. Finally, some numerical examples are provided to show how the code works for the complex dynamics of BEC problems.
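
    A minimal sketch of the split-step (pseudospectral) idea on which such solvers are based, for the 1D Gross-Pitaevskii equation with a harmonic trap (the grid, time step and nonlinearity strength are assumed; GPELab's schemes are more general):

```python
# Strang-split pseudospectral step for  i u_t = -0.5 u_xx + V(x) u + beta |u|^2 u.
import numpy as np

n, L, dt, beta = 512, 16.0, 1e-3, 100.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)     # spectral wavenumbers
V = 0.5 * x**2                                   # harmonic trap

u = np.exp(-x**2).astype(complex)
u /= np.sqrt(np.sum(np.abs(u) ** 2) * (L / n))   # normalize the wavefunction

for _ in range(1000):
    u *= np.exp(-0.5j * dt * (V + beta * np.abs(u) ** 2))         # half nonlinear/potential
    u = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(u))    # full kinetic step
    u *= np.exp(-0.5j * dt * (V + beta * np.abs(u) ** 2))         # half nonlinear/potential
```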

  6. Project Schedule Simulation

    DEFF Research Database (Denmark)

    Mizouni, Rabeb; Lazarova-Molnar, Sanja

    2015-01-01

    ... overrun both their budget and time. To improve the quality of initial project plans, we show in this paper the importance of (1) reflecting features' priorities/risk in task schedules and (2) considering uncertainties related to human factors in plan schedules. To make simulation tasks reflect features' priority as well as multimodal team allocation, enhanced project schedules (EPS), where remedial action scenarios (RAS) are added, were introduced. They reflect potential schedule modifications in case of uncertainties and promote a dynamic sequencing of involved tasks rather than the static conventional ...

  7. Reprocessing Close Range Terrestrial and UAV Photogrammetric Projects with the DBAT Toolbox for Independent Verification and Quality Control

    Science.gov (United States)

    Murtiyoso, A.; Grussenmeyer, P.; Börlin, N.

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore, with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the ease of acquiring aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT) developed for Matlab was used to reprocess several photogrammetric projects that had been processed using the commercial software Agisoft Photoscan. Several scenarios were run in order to assess the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects under several configurations of self-calibration settings. Results show that DBAT managed to reprocess the Photoscan projects and generate metrics which can be useful for project verification.

  8. REPROCESSING CLOSE RANGE TERRESTRIAL AND UAV PHOTOGRAMMETRIC PROJECTS WITH THE DBAT TOOLBOX FOR INDEPENDENT VERIFICATION AND QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    A. Murtiyoso

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore, with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the ease of acquiring aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT) developed for Matlab was used to reprocess several photogrammetric projects that had been processed using the commercial software Agisoft Photoscan. Several scenarios were run in order to assess the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects under several configurations of self-calibration settings. Results show that DBAT managed to reprocess the Photoscan projects and generate metrics which can be useful for project verification.

  9. D and D Toolbox Project - Technology Demonstration of Fixatives Applied to Hot Cell Facilities via Remote Sprayer Platforms

    International Nuclear Information System (INIS)

    Lagos, L.; Shoffner, P.; Espinosa, E.; Pena, G.; Kirk, P.; Conley, T.

    2009-01-01

    The objective of the US Department of Energy Office of Environmental Management's (DOE-EM's) D and D Toolbox Project is to use an integrated systems approach to develop a suite of decontamination and decommissioning (D and D) technologies, a D and D toolbox, that can be readily used across the DOE complex to improve safety, reduce technical risks, and limit uncertainty within D and D operations. Florida International University's Applied Research Center (FIU-ARC) is supporting this initiative by identifying technologies suitable to meet specific facility D and D requirements, assessing the readiness of those technologies for field deployment, and conducting technology demonstrations of selected technologies at FIU-ARC facilities in Miami, Florida. To meet the technology gap for remotely applying strippable/fixative coatings, FIU-ARC identified and demonstrated a remote fixative sprayer platform. During this process, FIU-ARC worked closely with the Oak Ridge National Laboratory in the selection of typical fixatives and in the design of a hot cell mockup facility for demonstrations at FIU-ARC. For this demonstration and for future demonstrations, FIU-ARC built a hot cell mockup facility at the FIU-ARC Technology Demonstration/Evaluation site in Miami, Florida. FIU-ARC selected the International Climbing Machines' (ICM's) Robotic Climber to perform this technology demonstration. The selected technology was demonstrated at the hot cell mockup facility at FIU-ARC during the week of November 10, 2008. Fixative products typically used inside hot cells were investigated and selected for this remote application. The fixatives tested included Sherwin Williams' Promar 200 and DTM paints and Bartlett's Polymeric Barrier System (PBS). The technology evaluation documented the ability of the remote system to spray fixative products on horizontal and vertical concrete surfaces. The technology performance, cost, and health and safety issues were evaluated.

  10. Broadview Radar Altimetry Toolbox

    Science.gov (United States)

    Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, now incorporates the capability to read the upcoming Sentinel3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a strong introduction to altimetry, shows its applications in different fields such as oceanography, the cryosphere, geodesy and hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific ...

  11. Toolbox for uncertainty; Introduction of adaptive heuristics as strategies for project decision making

    DEFF Research Database (Denmark)

    Stingl, Verena; Geraldi, Joana

    2017-01-01

    This article presents adaptive heuristics as an alternative approach to navigating uncertainty in project decision-making. Adaptive heuristics are a class of simple decision strategies that have received only scant attention in project studies. Yet, they can thrive in contexts of high uncertainty and limited information, which are the typical project decision context. This article develops a conceptual model that supports a systematic connection between adaptive heuristics and project decisions. Individual adaptive heuristics succeed only in specific decision environments, in which they are 'ecologically rational'. The model builds on the individual definitions of ecological rationality and organizes them according to two types of uncertainty ('knowable' and 'unknowable'). Decision problems and heuristics are furthermore grouped by decision task (choice and judgement). The article discusses ...

  12. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to the code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to interfacing the standalone CBP partner codes, which are used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  13. A Simulation Platform To Model, Optimize And Design Wind Turbines. The Matlab/Simulink Toolbox

    Directory of Open Access Journals (Sweden)

    Anca Daniela HANSEN

    2002-12-01

    In recent years Matlab / Simulink® has become the most widely used software for the modeling and simulation of dynamic systems. Wind energy conversion systems are an example of such systems, containing subsystems with different ranges of time constants: wind, turbine, generator, power electronics, transformer and grid. The electrical generator and the power converter need the smallest simulation step, and these blocks therefore determine the simulation speed. This paper presents a new and integrated simulation platform for modeling, optimizing and designing wind turbines. The platform contains different simulation tools: Matlab / Simulink, used as the basic modeling tool, HAWC, DIgSilent and Saber.

  14. System design through Matlab, control toolbox and Simulink

    CERN Document Server

    Singh, Krishna K

    2001-01-01

    MATLAB, a software package developed by MathWorks, Inc., is powerful, versatile and interactive software for scientific and technical computations including simulations. Specialised toolboxes provided with several built-in functions are a special feature of MATLAB. This book, titled System Design through MATLAB, Control Toolbox and SIMULINK, aims at getting the reader started with computations and simulations in system engineering quickly and easily, and then proceeds to build concepts for advanced computations and simulations that include the control and compensation of systems. Simulation through SIMULINK has also been described to allow the reader to get the feel of the real-world situation. This book is appropriate for undergraduate students in the final semester of their project work, postgraduate students who have MATLAB integrated in their course or wish to take up a simulation problem in the area of system engineering for their dissertation work, and research scholars for whom MATLAB ...

  15. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB

    KAUST Repository

    Klingbeil, G.; Erban, R.; Giles, M.; Maini, P. K.

    2011-01-01

    Motivation: The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new ...

  16. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation

    NARCIS (Netherlands)

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Background: Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust ...

  17. Virtual toolbox

    Science.gov (United States)

    Jacobus, Charles J.; Jacobus, Heidi N.; Mitchell, Brian T.; Riggs, A. J.; Taylor, Mark J.

    1993-04-01

    At least three of the five senses must be fully addressed in a successful virtual reality (VR) system. Sight, sound, and touch are the most critical elements for the creation of the illusion of presence. Since humans depend so much on sight to collect information about their environment, this area has been the focus of much of the prior art in virtual reality. However, it is also crucial that we provide facilities for force, torque, and touch reflection, and for sound replay and 3-D localization. In this paper we present a sampling of hardware and software in the virtual environment maker's `toolbox' which can support rapidly building up customized VR systems. We provide demonstrative examples of how some of the tools work and we speculate about VR applications and future technology needs.

  18. Projective simulation for artificial intelligence

    Science.gov (United States)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
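
    A minimal sketch of a two-layer projective-simulation agent in the spirit of this model: hop probabilities proportional to edge strengths h, the reward added to the traversed edge, and damping toward h = 1 as forgetting (the toy task and parameter values are assumptions for illustration):

```python
# Two-layer projective simulation: percept clips -> action clips over an
# episodic memory whose edge strengths h drive a random walk.
import numpy as np

class PSAgent:
    def __init__(self, n_percepts, n_actions, gamma=0.01, rng=None):
        self.h = np.ones((n_percepts, n_actions))    # clip-network edge strengths
        self.gamma = gamma                           # damping / forgetting rate
        self.rng = rng or np.random.default_rng()

    def act(self, percept):
        p = self.h[percept] / self.h[percept].sum()  # random walk: percept -> action
        return self.rng.choice(len(p), p=p)

    def learn(self, percept, action, reward):
        self.h -= self.gamma * (self.h - 1.0)        # all edges decay toward 1
        self.h[percept, action] += reward            # reinforce the traversed edge

# toy loop: the agent is rewarded when its action matches the percept
agent = PSAgent(n_percepts=2, n_actions=2)
for step in range(1000):
    s = step % 2
    a = agent.act(s)
    agent.learn(s, a, reward=1.0 if a == s else 0.0)
```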

  19. Visual NNet: An Educational ANN's Simulation Environment Reusing Matlab Neural Networks Toolbox

    Science.gov (United States)

    Garcia-Roselló, Emilio; González-Dacosta, Jacinto; Lado, Maria J.; Méndez, Arturo J.; Garcia Pérez-Schofield, Baltasar; Ferrer, Fátima

    2011-01-01

    Artificial Neural Networks (ANN's) are nowadays a common subject in different curricula of graduate and postgraduate studies. Due to the complex algorithms involved and the dynamic nature of ANN's, simulation software has been commonly used to teach this subject. This software has usually been developed specifically for learning purposes, because…

  20. Regensim – Matlab toolbox for renewable energy sources modelling and simulation

    Directory of Open Access Journals (Sweden)

    Cristian Dragoş Dumitru

    2011-12-01

    This paper deals with the implementation and development of a Matlab Simulink library named RegenSim designed for the modeling, simulation and analysis of real hybrid solar-wind-hydro systems connected to local grids. Blocks like wind generators, hydro generators, solar photovoltaic modules and accumulators are implemented. The main objective is the study of the hybrid power system behavior, which allows employing renewable, time-varying energy sources while providing a continuous supply.

  1. Fusion Simulation Project Workshop Report

    Science.gov (United States)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  2. Bohunice Simulator Data Collection Project

    International Nuclear Information System (INIS)

    Cillik, Ivan; Prochaska, Jan

    2002-01-01

    The paper describes the way and results of human reliability data analysis collected as a part of the Bohunice Simulator Data Collection Project (BSDCP), which was performed by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for the Jaslovske Bohunice nuclear power plant, consisting of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project, training of V2 control room crews was performed at the VUJE Trnava simulator. The simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect suitable data on human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden errors of operators, to be able to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part also gives an overview of the collected data scope, human error taxonomy, and state classifications for SBEOP instruction coding. The second part describes analytical work undertaken to describe the time distribution necessary for execution of various kinds of instructions performed by operators according to the classification for coding of SBEOP instructions. It also presents the methods used for determination of probability distributions for different operator errors. Results from the data evaluation are presented in the last part of the paper. An overview of ...

  3. Bohunice Simulator Data Collection Project

    International Nuclear Information System (INIS)

    Cillik, I.; Prochaska, J.

    2002-01-01

    The paper describes the way and results of human reliability data analysis collected as a part of the Bohunice Simulator Data Collection Project (BSDCP), which was performed by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for the Jaslovske Bohunice nuclear power plant, consisting of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project, training of V2 control room crews was performed at the VUJE Trnava simulator. The simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect suitable data on human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden errors of operators, to be able to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part also gives an overview of the collected data scope, human error taxonomy, and state classifications for SBEOP instruction coding. The second part describes analytical work undertaken to describe the time distribution necessary for execution of various kinds of instructions performed by operators according to the classification for coding of SBEOP instructions. It also presents the methods used for determination of probability distributions for various operator errors. Results from the data evaluation are presented in the last part of the paper. An overview of ...

  4. User's manual for Ecolego Toolbox and the Discretization Block

    International Nuclear Information System (INIS)

    Broed, Robert; Shulan Xu

    2008-03-01

    The CLIMB modelling team (Catchment LInked Models of radiological effects in the Biosphere) was instituted in 2004 to provide SSI with an independent modelling capability when reviewing SKB's assessment of long-term safety for a geological repository. Modelling in CLIMB covers all aspects of performance assessment (PA) from near-field releases to radiological consequences in the surface environment. Software used to implement assessment models has been developed within the project. The software comprises a toolbox based on the commercial packages Matlab and Simulink, used to solve compartment-based differential equation systems, but with an added user-friendly graphical interface. This report documents the new simulation toolbox and a newly developed Discretisation Block, which is a powerful tool for solving problems involving a network of compartments in two dimensions.
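    The compartment systems such toolboxes solve reduce to linear first-order ODE systems. As a rough illustration of the underlying mathematics only (not Ecolego's actual implementation), the following Python sketch integrates a hypothetical three-compartment transfer chain with SciPy; the compartment names and all rate constants are invented for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Rate matrix K for dn/dt = K n.  Entry K[i, j] is the first-order transfer
# rate (1/yr) from compartment j to compartment i; diagonals hold total losses.
# Compartments 0: near-field, 1: geosphere, 2: biosphere -- values invented.
K = np.array([
    [-0.05,  0.00,  0.000],
    [ 0.05, -0.02,  0.000],
    [ 0.00,  0.02, -0.001],
])

def rhs(t, n):
    return K @ n   # linear compartment system

sol = solve_ivp(rhs, (0.0, 500.0), y0=[1.0, 0.0, 0.0])
print(sol.y[:, -1])   # compartment inventories at t = 500 yr
```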

  5. The simulation of a non-linear unstable system of motion, utilising the Solid-Works and Matlab/Simulink extended libraries/toolboxes

    Directory of Open Access Journals (Sweden)

    Zátopek Jiří

    2016-01-01

    Full Text Available This text discusses the use and integration of various support software tools for the purpose of designing the motion control law governing mechanical structures with strongly non-linear behaviour. The detailed mathematical model is derived using Lagrange equations of the second kind. The physical model was designed using SolidWorks 3D CAD software and the SimMechanics library, which extends Simulink with modelling tools for the simulation of mechanical “multi-domain” physical systems. The visualization of Simulink outputs is performed using the 3D Animation toolbox. The control law, designed on the basis of the mathematical model, is tested on both models (i.e. the mathematical and the physical one), and the results of the regulatory processes are compared.

  6. Channel Access Client Toolbox for Matlab

    International Nuclear Information System (INIS)

    2002-01-01

    This paper reports on the MATLAB Channel Access (MCA) Toolbox, a MATLAB [1] interface to the EPICS Channel Access (CA) client library. We are developing the toolbox for SPEAR3 accelerator controls, but it is of general use for accelerator and experimental physics applications programming. It is packaged as a MATLAB toolbox to allow easy development of complex CA client applications entirely in MATLAB. The benefits include: the ability to calculate and display parameters that use EPICS process variables as inputs, availability of MATLAB graphics tools for user interface design, and integration with the MATLAB-based accelerator modeling software - Accelerator Toolbox [2-4]. Another purpose of this paper is to propose a feasible path to a synergy between accelerator control systems and accelerator simulation codes, the idea known as the on-line accelerator model.
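    For readers working outside MATLAB, the same Channel Access client pattern is available in Python through the pyepics package. The sketch below is not part of the MCA Toolbox; it assumes a reachable EPICS IOC, and the process variable names are purely illustrative.

```python
from epics import caget, caput, PV

# One-shot read and write of process variables (PV names are hypothetical)
beam_current = caget("SR:BEAM:CURRENT")          # read a PV value
caput("SR:CORRECTOR:HC01:SETPOINT", 0.5)         # write a setpoint

# Monitored PV: the callback fires on every value change, which is how
# display and feedback applications typically consume CA data.
def on_change(pvname=None, value=None, **kw):
    print(pvname, "->", value)

pv = PV("SR:BEAM:CURRENT", callback=on_change)
```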

  7. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    Science.gov (United States)

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab-toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows portation of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  8. PREVIEW behavior modification intervention toolbox (PREMIT: a study protocol for a psychological element of a multicenter project

    Directory of Open Access Journals (Sweden)

    Daniela Kahlert

    2016-08-01

    Full Text Available Background: Losing excess body weight and preventing weight regain by changing lifestyle is a challenging but promising task to prevent the incidence of type-2 diabetes. To be successful, it is necessary to use evidence-based and theory-driven interventions, which also contribute to the science of behavior modification by providing a deeper understanding of successful intervention components. Objective: To develop a physical activity and dietary behavior modification intervention toolbox (PREMIT) that fulfills current requirements of being theory-driven and evidence-based, comprehensively described and feasible to evaluate. PREMIT is part of an intervention trial, which aims to prevent the onset of type-2 diabetes in pre-diabetics in eight clinical centers across the world by guiding them in changing their physical activity and dietary behavior through a group counselling approach. Methods: The program development took five progressive steps, in line with the Public Health Action Cycle: (1) summing-up the intervention goal(s), target group and the setting, (2) uncovering the generative psychological mechanisms, (3) identifying behavior change techniques and tools, (4) preparing for evaluation and (5) implementing the intervention and assuring quality. Results: PREMIT is based on a trans-theoretical approach referring to valid behavior modification theories, models and approaches. A major ‘product’ of PREMIT is a matrix, constructed for use by onsite-instructors. The matrix includes objectives, tasks and activities ordered by periods. PREMIT is constructed to help instructors guide participants’ behavior change. To ensure high fidelity and adherence of program-implementation across the eight intervention centers, standardized operational procedures were defined and train-the-trainer workshops were held. In summary, PREMIT is a theory-driven, evidence-based program carefully developed to change physical activity and dietary behaviors in pre-diabetic people.

  9. PREVIEW Behavior Modification Intervention Toolbox (PREMIT): A Study Protocol for a Psychological Element of a Multicenter Project.

    Science.gov (United States)

    Kahlert, Daniela; Unyi-Reicherz, Annelie; Stratton, Gareth; Meinert Larsen, Thomas; Fogelholm, Mikael; Raben, Anne; Schlicht, Wolfgang

    2016-01-01

    Losing excess body weight and preventing weight regain by changing lifestyle is a challenging but promising task to prevent the incidence of type-2 diabetes. To be successful, it is necessary to use evidence-based and theory-driven interventions, which also contribute to the science of behavior modification by providing a deeper understanding of successful intervention components. To develop a physical activity and dietary behavior modification intervention toolbox (PREMIT) that fulfills current requirements of being theory-driven and evidence-based, comprehensively described and feasible to evaluate. PREMIT is part of an intervention trial, which aims to prevent the onset of type-2 diabetes in pre-diabetics in eight clinical centers across the world by guiding them in changing their physical activity and dietary behavior through a group counseling approach. The program development took five progressive steps, in line with the Public Health Action Cycle: (1) Summing-up the intervention goal(s), target group and the setting, (2) uncovering the generative psychological mechanisms, (3) identifying behavior change techniques and tools, (4) preparing for evaluation and (5) implementing the intervention and assuring quality. PREMIT is based on a trans-theoretical approach referring to valid behavior modification theories, models and approaches. A major "product" of PREMIT is a matrix, constructed for use by onsite-instructors. The matrix includes objectives, tasks and activities ordered by periods. PREMIT is constructed to help instructors guide participants' behavior change. To ensure high fidelity and adherence of program-implementation across the eight intervention centers standardized operational procedures were defined and "train-the-trainer" workshops were held. In summary PREMIT is a theory-driven, evidence-based program carefully developed to change physical activity and dietary behaviors in pre-diabetic people.

  10. The ABRAVIBE toolbox for teaching vibration analysis and structural dynamics

    DEFF Research Database (Denmark)

    Brandt, A.

    2013-01-01

    A MATLAB toolbox (the ABRAVIBE toolbox) has been developed as an accompanying toolbox for the recent book "Noise and Vibration Analysis" by the author. This free, open software, published under the GNU Public License, can be used with GNU Octave, if an entirely free software platform is wanted, with a few functional limitations. The toolbox includes functionality for simulation of mechanical models as well as advanced analysis such as time series analysis, spectral analysis, frequency response and correlation function estimation, modal parameter extraction, and rotating machinery analysis (order tracking)...

  11. Development of a Twin-Spool Turbofan Engine Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS)

    Science.gov (United States)

    Zinnecker, Alicia M.; Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a tool that has been developed to allow a user to build custom models of systems governed by thermodynamic principles using a template to model each basic process. Validation of this tool in an engine model application was performed through reconstruction of the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) (v2) using the building blocks from the T-MATS (v1) library. In order to match the two engine models, it was necessary to address differences in several assumptions made in the two modeling approaches. After these modifications were made, validation of the engine model continued by integrating both a steady-state and dynamic iterative solver with the engine plant and comparing results from steady-state and transient simulation of the T-MATS and C-MAPSS models. The results show that the T-MATS engine model was accurate within 3% of the C-MAPSS model, with inaccuracy attributed to the increased dimension of the iterative solver solution space required by the engine model constructed using the T-MATS library. This demonstrates that, given an understanding of the modeling assumptions made in T-MATS and a baseline model, the T-MATS tool provides a viable option for constructing a computational model of a twin-spool turbofan engine that may be used in simulation studies.
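    The iterative solvers referred to above drive a set of component matching errors (e.g., flow continuity and work balance) to zero by adjusting the solver's independents. As a minimal illustration of that idea only (not T-MATS code, whose balances are far more detailed), the Python sketch below applies Newton's method with a finite-difference Jacobian to a made-up two-equation balance.

```python
import numpy as np

def residuals(u):
    """Toy component-matching errors as a function of the solver
    independents u = [shaft_speed, pressure_ratio]; illustrative only,
    not a real engine balance."""
    n, pr = u
    r1 = 0.8 * n**2 - pr        # fictitious work balance
    r2 = pr - (1.0 + 0.5 * n)   # fictitious flow-continuity constraint
    return np.array([r1, r2])

def newton(f, u0, tol=1e-10, max_iter=50, h=1e-7):
    """Newton iteration with a finite-difference Jacobian."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        r = f(u)
        if np.linalg.norm(r) < tol:
            break
        # Jacobian column j = d(residuals)/d(u_j), estimated numerically
        J = np.column_stack([(f(u + h * e) - r) / h for e in np.eye(u.size)])
        u = u - np.linalg.solve(J, r)
    return u

print(newton(residuals, [1.0, 1.0]))   # converged steady-state operating point
```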

  12. Development of a Twin-spool Turbofan Engine Simulation Using the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    Science.gov (United States)

    Zinnecker, Alicia M.; Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2014-01-01

    The Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS) is a tool that has been developed to allow a user to build custom models of systems governed by thermodynamic principles using a template to model each basic process. Validation of this tool in an engine model application was performed through reconstruction of the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) (v2) using the building blocks from the T-MATS (v1) library. In order to match the two engine models, it was necessary to address differences in several assumptions made in the two modeling approaches. After these modifications were made, validation of the engine model continued by integrating both a steady-state and dynamic iterative solver with the engine plant and comparing results from steady-state and transient simulation of the T-MATS and C-MAPSS models. The results show that the T-MATS engine model was accurate within 3% of the C-MAPSS model, with inaccuracy attributed to the increased dimension of the iterative solver solution space required by the engine model constructed using the T-MATS library. This demonstrates that, given an understanding of the modeling assumptions made in T-MATS and a baseline model, the T-MATS tool provides a viable option for constructing a computational model of a twin-spool turbofan engine that may be used in simulation studies.

  13. Sustainable knowledge development across cultural boundaries: Experiences from the EU-project SILMAS (Toolbox for conflict solving instruments in Alpine Lake Management)

    Science.gov (United States)

    Fegerl, Michael; Wieden, Wilfried

    2013-04-01

    Increasingly, people have to communicate knowledge across cultural and language boundaries. Even though recent technologies offer powerful communication facilities, people often feel confronted with barriers which clearly reduce their chances of making their interaction a success. Concrete evidence concerning such problems derives from a number of projects, where generated knowledge often results in dead-end products. In the Alpine Space project SILMAS (Sustainable Instruments for Lake Management in Alpine Space), in which both authors were involved, a special approach (syneris®) was taken to avoid this problem and to manage project knowledge in sustainable form. Under this approach, knowledge input and output are handled interactively: relevant knowledge can be developed continuously and users can always access the latest state of expertise. Resort to the respective tools and procedures can also assist in closing knowledge gaps and in developing innovative responses to familiar or novel problems. This contribution describes ways and means which have been found to increase the chances of success of knowledge communication across cultural boundaries. The process of trans-cultural discussion among experts to find a standardized solution is highlighted, as well as the problem of disseminating expert knowledge to various stakeholders. Finally, lessons learned are made accessible; a main task lay in the creation of a toolbox of conflict-solving instruments, as a demonstrable result of the project and for the time thereafter. The interactive web-based toolbox enables lake managers to access best-practice instruments in standardized, explicit and cross-linguistic form.

  14. Key performance indicators for successful simulation projects

    OpenAIRE

    Jahangirian, M; Taylor, SJE; Young, T; Robinson, S

    2016-01-01

    There are many factors that may contribute to the successful delivery of a simulation project. To provide a structured approach to assessing the impact various factors have on project success, we propose a top-down framework whereby 15 Key Performance Indicators (KPI) are developed that represent the degree of success of simulation projects from various perspectives. They are linked to a set of Critical Success Factors (CSF) as reported in the simulation literature. A single measure cal...

  15. Fessenheim simulator for OECD Halden Reactor Project

    International Nuclear Information System (INIS)

    Oudot, G.; Bonnissent, B.

    1998-01-01

    A full scope NPP simulator is presently under manufacture by THOMSON TRAINING & SIMULATION (TT&S) in Cergy (France) for the OECD HALDEN REACTOR PROJECT. The reference plant of this simulator is the Fessenheim CP0 PWR power plant operated by the French utility EDF, for which TT&S delivered a full scope training simulator in mid-1997. The simulator for the HALDEN Reactor Project is based on a software duplication of the Fessenheim simulator delivered to EDF, ported to the most recent computers and operating systems available. This paper outlines the main features of this new simulator generation, which reaps the benefits of the advanced technologies of the SIPA design simulator introduced inside a full scope simulator. This kind of simulator is in fact a synthesis of training and design simulators and therefore offers added technical capabilities well suited to HALDEN's needs. (author)

  16. Air Sensor Toolbox

    Science.gov (United States)

    Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.

  17. Project managing your simulator DCS upgrade

    International Nuclear Information System (INIS)

    Carr, S.

    2006-01-01

    The intention of this paper is to provide helpful information and tips for the purchaser with regard to the project management of a DCS upgrade for an existing nuclear power station operator-training simulator. This paper was written shortly after STS Powergen completed two nuclear power station simulator DCS projects in the USA. Areas covered by this paper are: contractual documents and arrangements; licence and escrow agreements; liquidated damages; project management; project schedules and resources; monitoring progress; defect reporting; DCS automation code; access to proprietary information; tips for project meetings; testing; cultural issues; and training.

  18. A conceptual toolbox for designing CSCW applications

    DEFF Research Database (Denmark)

    Bødker, Susanne; Christiansen, Ellen

    1995-01-01

    This paper presents a conceptual toolbox, developed to support the design of CSCW applications in a large Esprit project, EuroCODE. Here, several groups of designers work to investigate computer support for cooperative work in large use organizations, at the same time as they work to develop an open development platform for CSCW applications. The conceptual toolbox has been developed to support communication in and among these design groups, between designers and users, and in future use of the open development platform. Rejecting the idea that one may design from a framework describing CSCW, the toolbox aims to support design by doing and to help bridge between work with users, technical design, and insights gained from theoretical and empirical CSCW research.

  19. DACE - A Matlab Kriging Toolbox

    DEFF Research Database (Denmark)

    2002-01-01

    DACE, Design and Analysis of Computer Experiments, is a Matlab toolbox for working with kriging approximations to computer models.
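    To make the core idea concrete: kriging treats the simulator response as a realization of a correlated random field and predicts at untried sites through a correlation-weighted combination of observed runs. The Python sketch below is a deliberately simplified zero-mean version of that predictor (DACE itself adds a regression part and fits the correlation parameters); the test function and the theta value are arbitrary.

```python
import numpy as np

def kriging_predict(X, y, Xstar, theta=1.0, nugget=1e-10):
    """Simple kriging (zero-mean GP) with a Gaussian correlation model."""
    def corr(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-theta * d2)
    R = corr(X, X) + nugget * np.eye(len(X))   # correlation among design sites
    r = corr(Xstar, X)                          # correlation to new sites
    return r @ np.linalg.solve(R, y)            # BLUP-style predictor

X = np.linspace(0, 1, 6)[:, None]               # computer-experiment design sites
y = np.sin(2 * np.pi * X[:, 0])                 # simulator responses
Xs = np.linspace(0, 1, 101)[:, None]            # prediction sites
yhat = kriging_predict(X, y, Xs, theta=20.0)    # interpolates the 6 runs
```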

  20. Simulation of investment returns of toll projects.

    Science.gov (United States)

    2013-08-01

    This research develops a methodological framework that illustrates the key stages in simulating the investment returns of toll projects, serving as an example process to help agencies conduct numerical risk analysis by taking certain uncertain...

  1. Particle Swarm Optimization Toolbox

    Science.gov (United States)

    Grant, Michael J.

    2010-01-01

    The Particle Swarm Optimization Toolbox is a library of evolutionary optimization tools developed in the MATLAB environment. The algorithms contained in the library include a genetic algorithm (GA), a single-objective particle swarm optimizer (SOPSO), and a multi-objective particle swarm optimizer (MOPSO). Development focused on both the SOPSO and MOPSO. A GA was included mainly for comparison purposes, and the particle swarm optimizers appeared to perform better for a wide variety of optimization problems. All algorithms are capable of performing unconstrained and constrained optimization. The particle swarm optimizers are capable of performing single- and multi-objective optimization. The SOPSO and MOPSO algorithms are based on swarming theory and bird-flocking patterns to search the trade space for the optimal solution or the optimal trade between competing objectives. The MOPSO generates Pareto fronts for objectives that are in competition. A GA, based on Darwinian evolutionary theory, is also included in the library. The GA consists of individuals that form a population in the design space. The population mates to form offspring at new locations in the design space. These offspring contain traits from both of the parents. The algorithm is based on this combination of traits from the parents, intended to provide a solution improved over either of the original parents. As the algorithm progresses, individuals that hold these optimal traits will emerge as the optimal solutions. Due to the generic design of all optimization algorithms, each algorithm interfaces with a user-supplied objective function. This function serves as a "black box" to the optimizers; its only purpose is to evaluate the solutions provided by the optimizers. Hence, the user-supplied function can be a numerical simulation, an analytical function, etc., since the specific detail of this function is of no concern to the optimizer. These algorithms were originally developed to support entry
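    As a compact illustration of the single-objective variant described above (a generic textbook PSO, not the toolbox's MATLAB code), the Python sketch below treats the objective purely as a black box; the inertia and acceleration coefficients and the sphere test function are illustrative choices.

```python
import numpy as np

def pso(objective, lower, upper, n_particles=30, n_iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal single-objective PSO (minimization) over box bounds."""
    rng = np.random.default_rng(seed)
    dim = lower.size
    x = rng.uniform(lower, upper, size=(n_particles, dim))  # positions
    v = np.zeros_like(x)                                    # velocities
    pbest = x.copy()                                        # personal bests
    pbest_f = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()                  # swarm best
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lower, upper)                    # keep inside bounds
        f = np.apply_along_axis(objective, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# The objective is a pure black box to the optimizer, as described above.
sphere = lambda z: float(np.sum(z ** 2))
best_x, best_f = pso(sphere, np.full(3, -5.0), np.full(3, 5.0))
```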

  2. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
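    The gist of the Bayesian approach can be shown in a few lines: given each strategy's trial-by-trial choice predictions, Bayes' rule yields a posterior over the toolbox. The Python sketch below uses fabricated strategy predictions and choice data purely for illustration; the authors' actual models are hierarchical and estimated with far more sophisticated machinery.

```python
import numpy as np

# Toy "toolbox": each strategy assigns a probability to choosing option A
# on each of five trials.  Predictions and data are made up for illustration.
pred_A = {
    "take_the_best": np.array([0.9, 0.9, 0.1, 0.9, 0.1]),
    "weighted_sum":  np.array([0.6, 0.8, 0.4, 0.7, 0.3]),
    "guessing":      np.full(5, 0.5),
}
choices = np.array([1, 1, 0, 1, 1])   # 1 = chose A, 0 = chose B

prior = {s: 1.0 / len(pred_A) for s in pred_A}   # uniform prior over strategies
log_lik = {s: np.sum(choices * np.log(p) + (1 - choices) * np.log(1 - p))
           for s, p in pred_A.items()}            # Bernoulli log-likelihoods
unnorm = {s: prior[s] * np.exp(log_lik[s]) for s in pred_A}
z = sum(unnorm.values())
posterior = {s: u / z for s, u in unnorm.items()} # P(strategy | data)
print(posterior)
```

    A hierarchical version replaces the fixed uniform prior with group-level distributions shared across participants, which is what lets the approach generalize from the individual to the group level.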

  3. ARC Code TI: X-Plane Communications Toolbox (XPC)

    Data.gov (United States)

    National Aeronautics and Space Administration — The X-Plane Communications Toolbox (XPC) is an open source research tool used to interact with the commercial flight simulator software X-Plane. XPC allows users to...

  4. Overcoming challenges: The Koeberg simulator project

    International Nuclear Information System (INIS)

    Marin, J.; Bruchental, S.; Lagerwall, J.R.

    2006-01-01

    As the nuclear power simulation training industry matures, the needs of the individual training facilities are becoming increasingly diverse. To respond to this trend, simulation vendors must offer a greater level of customization, and as such, are facing increasingly challenging projects. In 2002, CAE embarked on such a project for South Africa's Koeberg nuclear power station. This simulator upgrade project, carried out in two phases, served as a unique exercise in product customization. In the first phase, CAE replaced the simulation servers, the software environment, and re-hosted the existing plant models. In the second phase, CAE replaced several of the existing thermal-hydraulics models to improve simulator fidelity. Throughout this project, CAE overcame a series of challenges stemming from project-specific requirements. In fact, the retention of the legacy software maintenance tools, the preservation of the instructor station package, and the interfacing with the existing hardware panel elements presented a number of opportunities for innovation. In the end, CAE overcame the challenges and acquired invaluable experience in doing so. (author)

  5. Presentation of the International Building Physics Toolbox for Simulink

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Sasic Kalagasidis, Angela; Nielsen, Toke Rammer

    2003-01-01

    The international building physics toolbox (IBPT) is a software library specially constructed for HAM system analysis in building physics. The toolbox is constructed as a modular structure of the standard building elements using the graphical programming language Simulink. Two research groups have participated in this project. In order to enable the development of the toolbox, a common modelling platform was defined: a set of unique communication signals, a material database, and a documentation protocol. The IBPT is open source and publicly available on the Internet. Any researcher and student can use

  6. Projective Simulation compared to reinforcement learning

    OpenAIRE

    Bjerland, Øystein Førsund

    2015-01-01

    This thesis explores the model of projective simulation (PS), a novel approach for an artificial intelligence (AI) agent. The model of PS learns by interacting with the environment it is situated in, and allows for simulating actions before real action is taken. The action selection is based on a random walk through the episodic & compositional memory (ECM), which is a network of clips that represent previously experienced percepts. The network takes percepts as inpu...

  7. Sentinel-3 SAR Altimetry Toolbox

    Science.gov (United States)

    Benveniste, Jerome; Lucas, Bruno; DInardo, Salvatore

    2015-04-01

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on the proven heritage of ERS-2, Envisat, and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300m in SAR mode along track. Sentinel-3 will provide precise measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the two Sentinels is expected to be launched in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, but it does not have the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; the BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth

  8. ICT: isotope correction toolbox.

    Science.gov (United States)

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique to analyze and study the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences, in particular for complex isotopic systems, is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present isotope correction toolbox, a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. Isotope correction toolbox is written in the multi-platform programming language Perl and can therefore be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from: https://github.com/jungreuc/isotope_correction_toolbox/ {christian.jungreuthmayer@boku.ac.at,juergen.zanghellini@boku.ac.at} Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
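    The essence of such a correction can be written as a small linear-algebra problem: the measured isotopologue intensities are the true labeling pattern convolved with natural-abundance contributions, so correction amounts to inverting a correction matrix. The Python sketch below illustrates this for a toy three-peak, non-tandem case with invented numbers; ICT itself handles the considerably harder tandem MS bookkeeping in Perl.

```python
import numpy as np

# Columns: how a purely M+j labeled species smears into the measured
# masses M+0..M+2 through natural isotope abundance.  These numbers are
# illustrative only, not real abundance data.
C = np.array([
    [0.90, 0.00, 0.00],
    [0.08, 0.90, 0.00],
    [0.02, 0.08, 0.90],
])
measured = np.array([450.0, 620.0, 130.0])    # raw mass isotopomer intensities

# Correct by solving measured = C @ true in the least-squares sense,
# clipping small negative artifacts to zero.
true, *_ = np.linalg.lstsq(C, measured, rcond=None)
true = np.clip(true, 0.0, None)
fractions = true / true.sum()                 # corrected isotopomer distribution
print(fractions)
```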

  9. Participative simulation in the PSIM project

    NARCIS (Netherlands)

    Eijnatten, F.M. van; Vink, P.

    2002-01-01

    This chapter describes the background and objectives of the IST project Participative Simulation environment for Integral Manufacturing enterprise renewal (PSIM). In the short run, PSIM aims to address the key issues that have to be resolved before a manufacturing renewal can be implemented. PSIM is

  10. ESA Atmospheric Toolbox

    Science.gov (United States)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission, it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, the Copernicus Atmosphere Monitoring Service (CAMS), ground-based data, etc. The toolbox consists of three main components, called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground-based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools, one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, the same data format/structure, and the same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netCDF/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data, such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and

  11. User's manual for Ecolego Toolbox and the Discretization Block

    Energy Technology Data Exchange (ETDEWEB)

    Broed, Robert (Facilia consulting AB (Sweden)); Shulan Xu (Swedish Radiation Protection Authority, Stockholm (Sweden))

    2008-03-15

    The CLIMB modelling team (Catchment LInked Models of radiological effects in the Biosphere) was instituted in 2004 to provide SSI with an independent modelling capability when reviewing SKB's assessment of long-term safety for a geological repository. Modelling in CLIMB covers all aspects of performance assessment (PA) from near-field releases to radiological consequences in the surface environment. Software used to implement assessment models has been developed within the project. The software comprises a toolbox based on the commercial packages Matlab and Simulink, used to solve compartment-based differential equation systems, but with an added user-friendly graphical interface. This report documents the new simulation toolbox and a newly developed Discretisation Block, which is a powerful tool for solving problems involving a network of compartments in two dimensions.

  12. Phasor Simulator for Operator Training Project

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, Jim [Electric Power Group, Llc, Pasadena, CA (United States)

    2016-09-14

    Synchrophasor systems are being deployed in power systems throughout the North American Power Grid and there are plans to integrate this technology and its associated tools into Independent System Operator (ISO)/utility control room operations. A pre-requisite to using synchrophasor technologies in control rooms is for operators to obtain training and understand how to use this technology in real-time situations. The Phasor Simulator for Operator Training (PSOT) project objective was to develop, deploy and demonstrate a pre-commercial training simulator for operators on the use of this technology and to promote acceptance of the technology in utility and ISO/Regional Transmission Owner (RTO) control centers.

  13. Ubuntu Linux toolbox

    CERN Document Server

    Negus, Christopher

    2012-01-01

    This bestseller from Linux guru Chris Negus is packed with an array of new and revised material. As a longstanding bestseller, Ubuntu Linux Toolbox has taught you how to get the most out of Ubuntu, the world's most popular Linux distribution. With this eagerly anticipated new edition, Christopher Negus returns with a host of new and expanded coverage on tools for managing file systems, ways to connect to networks, techniques for securing Ubuntu systems, and a look at the latest Long Term Support (LTS) release of Ubuntu, all aimed at getting you up and running with Ubuntu Linux quickly.

  14. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

    This paper introduces the Accelerator Toolbox (AT), a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB, a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks.

  15. Agent-Based Simulations for Project Management

    Science.gov (United States)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
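    The inversion described above, with duration as an output of resources and productivity rather than an input, is easy to demonstrate with a toy Monte Carlo model. The Python sketch below is a generic illustration of a resource-based task simulation, not the ViaSim tool; every parameter (crowding penalty, productivity noise, the 30-day plan) is an invented assumption.

```python
import numpy as np

def simulate_task(work_units, n_workers, base_rate=1.0, seed=0,
                  add_worker_when_behind=0):
    """Toy resource-based task simulation: duration is an *output*.

    Per-worker productivity degrades with crowding, daily output is noisy,
    and management can add workers when progress lags a nominal 30-day plan
    (a corrective action).  All parameters are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    done, day = 0.0, 0
    while done < work_units:
        day += 1
        crowding = 1.0 / np.sqrt(n_workers)               # diminishing returns
        done += n_workers * base_rate * crowding * rng.uniform(0.7, 1.3)
        behind = done < day * work_units / 30             # vs. the 30-day plan
        if add_worker_when_behind and behind:
            n_workers += add_worker_when_behind           # management intervenes
    return day                                            # duration emerges

durations = [simulate_task(100, 4, seed=s) for s in range(500)]
print(np.mean(durations), np.percentile(durations, 90))
```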

  16. Outline of the earth simulator project

    International Nuclear Information System (INIS)

    Tani, Keiji

    2000-01-01

    The Science and Technology Agency of Japan has proposed a project to promote studies for global change prediction by an integrated three-in-one research and development approach: earth observation, basic research, and computer simulation. As part of the project, we are developing an ultra-fast computer, the 'Earth Simulator', with a sustained speed of more than 5 TFLOPS for an atmospheric circulation code. The 'Earth Simulator' is a MIMD-type distributed-memory parallel system in which 640 processor nodes are connected via a fast single-stage crossbar network. Each node consists of 8 vector-type arithmetic processors which are tightly connected via shared memory. The peak performance of the total system is 40 TFLOPS. As part of the development of the basic software system, we are developing an operation-supporting software system, what is called a 'center routine'. We are going to use an archival system as the main storage for user files. Therefore, the most important function of the center routine is the optimal scheduling of not only submitted batch jobs but also the user files necessary for them. All the design and R&D work for both the hardware and basic software systems was completed during the last three fiscal years, FY97, 98 and 99. The manufacture of the hardware system and the development of the center routine are underway. Facilities necessary for the Earth Simulator, including buildings, are also under construction. The total system will be completed in the spring of 2002. (author)

  17. ESA's Multi-mission Sentinel-1 Toolbox

    Science.gov (United States)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is a new open source software for scientific learning, research and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to the science and application users in support of ESA's operational SAR mission as well as by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing and C-S. The SNAP architecture is ideal for Earth Observation processing and analysis due to the following technological innovations: Extensibility, Portability, Modular Rich Client Platform, Generic EO Data Abstraction, Tiled Memory Management, and a Graph Processing Framework. The project has developed new tools for working with Sentinel-1 data, in particular for working with the new Interferometric TOPSAR mode. TOPSAR Complex Coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the Spectral Diversity [4] method has been developed, as well as special azimuth handling in the coherence, interferogram and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembling, TOPSAR deburst and sub-swath merging, terrain flattening radiometric normalization, and visualization for L2 OCN products. The Toolbox also provides several new tools for
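    One of the interferometric building blocks mentioned above, the coherence estimate, is compact enough to sketch. The standard windowed estimator below is written in Python with synthetic complex data; it stands in for the idea only, not for the toolbox's own operators, and the window size and noise level are arbitrary.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, win=5):
    """Windowed interferometric coherence of two coregistered complex
    SAR images (standard magnitude-of-averaged-cross-product estimator)."""
    cross = s1 * np.conj(s2)
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win) *
                  uniform_filter(np.abs(s2) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)

rng = np.random.default_rng(0)
master = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
noise = 0.3 * (rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64)))
gamma = coherence(master, master + noise)   # values near 1 for this noise level
```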

  18. A Module for Graphical Display of Model Results with the CBP Toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-21

    This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts which will provide a basis for further development of the CBP Toolbox.

  19. Background simulation for the GENIUS project

    International Nuclear Information System (INIS)

    Ponkratenko, O.A.; Tretyak, V.I.; Zdesenko, Yu.G.

    1999-01-01

    The background simulations for the GENIUS experiment were performed with the help of the GEANT 3.21 package and the event generator DECAY4. Contributions from the cosmogenic activity produced in the Ge detectors and from their radioactive impurities, as well as from contamination of the liquid nitrogen and other materials, were calculated. External gamma and neutron backgrounds were taken into consideration as well. The results of the calculations clearly show the feasibility of the GENIUS project, which can substantially promote the development of modern astroparticle physics.

  20. Smoke Ready Toolbox for Wildfires

    Science.gov (United States)

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  1. Online Simulation of Radiation Track Structure Project

    Science.gov (United States)

    Plante, Ianik

    2015-01-01

    Space radiation comprises protons, helium, and high charge and energy (HZE) particles. High-energy particles are a concern for human space flight, because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and the positions of the radiolytic species, called the radiation track structure, is highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regard to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.

  2. Integrated Budget Office Toolbox

    Science.gov (United States)

    Rushing, Douglas A.; Blakeley, Chris; Chapman, Gerry; Robertson, Bill; Horton, Allison; Besser, Thomas; McCarthy, Debbie

    2010-01-01

    The Integrated Budget Office Toolbox (IBOT) combines budgeting, resource allocation, organizational funding, and reporting features in an automated, integrated tool that provides data from a single source for Johnson Space Center (JSC) personnel. Using a common interface, concurrent users can utilize the data without compromising its integrity. IBOT tracks planning changes and updates throughout the year using both phasing and POP-related (program-operating-plan-related) budget information for the current year, and up to six years out. Separating lump-sum funds received from HQ (Headquarters) into separate labor, travel, procurement, Center G&A (general & administrative), and servicepool categories, IBOT creates a script that significantly reduces manual input time. IBOT also manages the movement of travel and procurement funds down to the organizational level and, using its integrated funds management feature, helps better track funding at lower levels. Third-party software is used to create integrated reports in IBOT that can be generated for plans, actuals, funds received, and other combinations of data that are currently maintained in the centralized format. Based on Microsoft SQL, IBOT incorporates generic budget processes, is transportable, and is economical to deploy and support.

  3. The GEM Detector projective alignment simulation system

    International Nuclear Information System (INIS)

    Wuest, C.R.; Belser, F.C.; Holdener, F.R.; Roeben, M.D.; Paradiso, J.A.; Mitselmakher, G.; Ostapchuk, A.; Pier-Amory, J.

    1993-01-01

    Precision position knowledge (< 25 microns RMS) of the GEM Detector muon system at the Superconducting Super Collider Laboratory (SSCL) is an important physics requirement necessary to minimize sagitta error in detecting and tracking high-energy muons that are deflected by the magnetic field within the GEM Detector. To validate the concept of the sagitta correction function determined by projective alignment of the muon detectors (Cathode Strip Chambers or CSCs), the basis of the proposed GEM alignment scheme, a facility called the "Alignment Test Stand" (ATS) is being constructed. This system simulates the environment that the CSCs and chamber alignment systems are expected to experience in the GEM Detector, albeit without the 0.8 T magnetic field and radiation environment. The ATS experimental program will allow systematic study and characterization of the projective alignment approach, as well as general mechanical engineering of muon chamber mounting concepts, positioning systems, and study of the mechanical behavior of the proposed 6-layer CSCs. The ATS will consist of a stable local coordinate system in which mock-ups of muon chambers (i.e., non-working mechanical analogs, representing the three superlayers of a selected barrel and endcap alignment tower) are implemented, together with a sufficient number of alignment monitors to overdetermine the sagitta correction function, providing a self-consistency check. This paper describes the approach to be used for the alignment of the GEM muon system, the design of the ATS, and the experiments to be conducted using the ATS.
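    Since the physics requirement above is stated in terms of the sagitta, it helps to recall what is being measured: the deviation of the middle-superlayer hit from the straight chord joining the inner and outer hits. The Python sketch below computes this three-point sagitta for illustrative coordinates; it is the textbook geometric formula, not the ATS analysis code.

```python
def sagitta(p_inner, p_middle, p_outer):
    """Deviation of the middle-layer hit from the chord joining the inner
    and outer hits -- the quantity the alignment must preserve to ~25 micron
    accuracy.  Points are (z, x) pairs in metres (coordinates invented)."""
    (z1, x1), (z2, x2), (z3, x3) = p_inner, p_middle, p_outer
    # Interpolate the chord at the middle layer's z position
    x_chord = x1 + (x3 - x1) * (z2 - z1) / (z3 - z1)
    return x2 - x_chord

# Illustrative numbers: a muon bending ~1 mm between superlayers
s = sagitta((0.0, 0.0), (1.0, 0.0010), (2.0, 0.0))
print(f"sagitta = {s * 1e6:.0f} microns")
```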

  4. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
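    The ensemble idea at the heart of such teaching tools is to rerun a simple conceptual model many times with sampled parameters and inspect the spread. The Python sketch below does this with a toy single-bucket rainfall-runoff model, which deliberately simplifies the HBV formulation; the forcing data and parameter ranges are invented for the example.

```python
import numpy as np

def bucket_model(precip, pet, capacity, k):
    """Toy single-bucket rainfall-runoff model (not the HBV equations)."""
    storage, runoff = 0.0, []
    for p, e in zip(precip, pet):
        storage = max(storage + p - e, 0.0)       # rain in, evapotranspiration out
        overflow = max(storage - capacity, 0.0)   # saturation excess
        storage -= overflow
        q = k * storage                           # linear-reservoir outflow
        storage -= q
        runoff.append(q + overflow)
    return np.array(runoff)

rng = np.random.default_rng(1)
precip = rng.gamma(0.6, 4.0, size=100)            # synthetic forcing (mm/day)
pet = np.full(100, 1.5)

# Ensemble: sample uncertain parameters and inspect the simulated spread
ens = np.array([bucket_model(precip, pet,
                             capacity=rng.uniform(50, 150),
                             k=rng.uniform(0.05, 0.3)) for _ in range(200)])
lo, hi = np.percentile(ens, [5, 95], axis=0)      # 90% simulation band
```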

  5. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, S.A.

    1984-04-25

    This presentation first discusses the motivation for the AI Simulation Fusion project. After discussing very briefly what expert systems are in general, what object oriented languages are in general, and some observed features of typical combat simulations, it discusses why putting together artificial intelligence and combat simulation makes sense. We then talk about the first demonstration goal for this fusion project.

  6. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Erickson, S.A.

    1984-01-01

    This presentation first discusses the motivation for the AI Simulation Fusion project. After discussing very briefly what expert systems are in general, what object oriented languages are in general, and some observed features of typical combat simulations, it discusses why putting together artificial intelligence and combat simulation makes sense. We then talk about the first demonstration goal for this fusion project

  7. The GMT/MATLAB Toolbox

    Science.gov (United States)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  8. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. Without any doubt, the development of the GOCE user toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall, a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies. Accordingly, GUT version 3 has:
 - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - further software functionalities, such as facilities for the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies,
 - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.

  9. SIMULATION OF ORGANIZATIONAL ILLNESSES IN PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Денис Антонович ХАРИТОНОВ

    2015-05-01

    Full Text Available The article examines a fractal model for diagnosing organizational pathologies in the development of project management. The proposed fractal model is based on the competency approach to project management and allows evaluating the pathologies of project-oriented organizations.

  10. FATRAS - the ATLAS Fast Track Simulation project

    NARCIS (Netherlands)

    Mechnich, J.

    2011-01-01

    The Monte Carlo simulation of the detector response is an integral component of any analysis performed with data from the LHC experiments. As these simulated data sets must be both large and precise, their production is a CPU-intensive task. ATLAS has developed full and fast detector simulation

  11. A toolbox for European judges

    NARCIS (Netherlands)

    Hesselink, M.W.

    2011-01-01

    The forthcoming instrument on European contract law, be it in the shape of an optional code for cross-border contracts or as an official toolbox for the European legislator, is likely to have a spill-over effect on private law adjudication in Europe. Judges will have no great difficulty in finding

  12. Extend your toolbox with R

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    R seems to be an ideal tool for visualising your data, as well as for practically all other data-related tasks. Learn how to start with R, as it is worth including not only in statisticians' or data scientists' toolboxes.

  13. The Life Cycle Analysis Toolbox

    International Nuclear Information System (INIS)

    Bishop, L.; Tonn, B.E.; Williams, K.A.; Yerace, P.; Yuracko, K.L.

    1999-01-01

    The life cycle analysis toolbox is a valuable integration of decision-making tools and supporting materials developed by Oak Ridge National Laboratory (ORNL) to help Department of Energy managers improve environmental quality, reduce costs, and minimize risk. The toolbox provides decision-makers access to a wide variety of proven tools for pollution prevention (P2) and waste minimization (WMin), as well as ORNL expertise to select from this toolbox exactly the right tool to solve any given P2/WMin problem. The central element of the toolbox is a multiple criteria approach to life cycle analysis developed specifically to aid P2/WMin decision-making. ORNL has developed numerous tools that support this life cycle analysis approach. Tools are available to help model P2/WMin processes, estimate human health risks, estimate costs, and represent and manipulate uncertainties. Tools are available to help document P2/WMin decision-making and implement programs. Tools are also available to help track potential future environmental regulations that could impact P2/WMin programs and current regulations that must be followed. An Internet site will provide broad access to the tools.

  14. GOCE user toolbox and tutorial

    DEFF Research Database (Denmark)

    Knudsen, Per; Benveniste, Jerome

    2011-01-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well.

  15. Full scope upgrade project for the Fermi 2 simulator

    International Nuclear Information System (INIS)

    Bollacasa, D.; Gonsalves, J.B.; Newcomb, P.C.

    1994-01-01

    The Detroit Edison Company (DECO) contracted the Simulation Division of Asea Brown Boveri (ABB) to perform a full scope upgrade of the Fermi 2 simulator. The Fermi 2 plant is a BWR 6 generation Nuclear Steam Supply System (NSSS). The project included the complete replacement of the existing simulation model software with ABB's high fidelity BWR models, the addition of an advanced instructor station facility, and new simulation computers. Also provided on the project were ABB's advanced simulation environment (CETRAN), a comprehensive configuration management system based on a modern relational database system, and a new computer interface to the input/output system. (8 refs., 2 figs.)

  16. Object-oriented Matlab adaptive optics toolbox

    Science.gov (United States)

    Conan, R.; Correia, C.

    2014-08-01

    Object-Oriented Matlab Adaptive Optics (OOMAO) is a Matlab toolbox dedicated to Adaptive Optics (AO) systems. OOMAO is based on a small set of classes representing the source, atmosphere, telescope, wavefront sensor, Deformable Mirror (DM) and imager of an AO system. This simple set of classes allows simulating Natural Guide Star (NGS) and Laser Guide Star (LGS) Single Conjugate AO (SCAO) and tomography AO systems on telescopes up to the size of the Extremely Large Telescopes (ELT). The discrete phase screens that make up the atmosphere model can be of infinite size, which is useful for modeling system performance on large time scales. OOMAO comes with its own parametric influence function model to emulate different types of DMs. The cone effect, altitude thickness and intensity profile of LGSs are also reproduced. Both modal and zonal modeling approaches are implemented. OOMAO also has an extensive library of theoretical expressions to evaluate the statistical properties of turbulent wavefronts. The main design characteristics of the OOMAO toolbox are object-oriented modularity, vectorized code and transparent parallel computing. OOMAO has been used to simulate and to design the Multi-Object AO prototype Raven at the Subaru telescope and the Laser Tomography AO system of the Giant Magellan Telescope. In this paper, a Laser Tomography AO system on an ELT is simulated with OOMAO. In the first part, we set up the class parameters and we link the instantiated objects to create the source optical path. Then we build the tomographic reconstructor and write the script for the pseudo-open-loop controller.

  17. Wyrm: A Brain-Computer Interface Toolbox in Python.

    Science.gov (United States)

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In recent years, Python has gained more and more traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python, and machine learning packages like scikit-learn and data analysis packages like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, like an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We explain the key aspects of Wyrm's software architecture and the design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm, we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.

  18. Matlab laser toolbox

    NARCIS (Netherlands)

    Römer, Gerardus Richardus, Bernardus, Engelina; Huis in 't Veld, Bert; Schmidt, M.; Vollertsen, F.; Geiger, M.

    2010-01-01

    Matlab® is a program for numeric computation, simulation, and visualization, developed by The MathWorks, Inc. It is used heavily in education, research, and industry for solving general as well as application-specific problems that arise in various disciplines. For this purpose Matlab has...

  19. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. GUT has been developed in a collaboration within the GUT Core Group. The GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L. Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I. Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), T. Gruber (TUM), ...

  20. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating the premises of using benchmarking in HEIs. It also contains a detailed examination of the types, approaches, and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled developing a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support, and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in the higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature, and the experience of the author from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  1. Wave data processing toolbox manual

    Science.gov (United States)

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP™), the Sontek Argonaut, and the Nortek Aquadopp™ Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox consolidates the processed files output by the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format are easy to disseminate, portable to any computer platform, and viewable with public-domain, freely available software. Another important advantage is that a metadata...
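
    The NetCDF consolidation step can be pictured with a few lines of stock MATLAB. The variable names, attributes, and file layout below are invented for illustration; the actual toolbox defines its own conventions.

        % Minimal sketch: write burst statistics plus metadata to NetCDF.
        % Names and attributes are hypothetical, not the toolbox's own.
        Hs   = [1.2 1.4 1.1 0.9];            % significant wave height per burst (m)
        time = [0 3600 7200 10800];          % seconds since deployment start

        nccreate('wave_stats.nc', 'time', 'Dimensions', {'time', Inf});
        nccreate('wave_stats.nc', 'Hs',   'Dimensions', {'time', Inf});
        ncwrite('wave_stats.nc', 'time', time);
        ncwrite('wave_stats.nc', 'Hs',   Hs);

        % Attach units and deployment metadata so the file is self-describing.
        ncwriteatt('wave_stats.nc', 'Hs', 'units', 'm');
        ncwriteatt('wave_stats.nc', 'time', 'units', 'seconds since deployment');
        ncwriteatt('wave_stats.nc', '/', 'instrument', 'RDI ADCP (example)');
        ncdisp('wave_stats.nc')              % verify structure and attributes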

  2. The Linear Time Frequency Analysis Toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Torrésani, Bruno; Balazs, Peter

    2011-01-01

    The Linear Time Frequency Analysis Toolbox is a Matlab/Octave toolbox for computational time-frequency analysis. It is intended both as an educational and a computational tool. The toolbox provides the basic Gabor, Wilson and MDCT transforms along with routines for constructing windows (filter prototypes) and routines for manipulating coefficients. It also provides a bunch of demo scripts devoted either to demonstrating the main functions of the toolbox, or to exemplifying their use in specific signal processing applications. In this paper we describe the used algorithms, their mathematical background...
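
    As an illustration of the basic Gabor analysis the toolbox provides, the following sketch computes a discrete Gabor transform with LTFAT's dgt routine. The test signal and parameters are arbitrary; see the toolbox documentation for the exact calling conventions.

        % Gabor analysis of a test signal with LTFAT (parameters illustrative).
        fs = 8000;                          % sampling rate (Hz)
        t  = (0:fs-1)'/fs;                  % one second of signal
        f  = sin(2*pi*440*t) + 0.5*sin(2*pi*1200*t);

        a = 64;                             % time shift (hop size) in samples
        M = 256;                            % number of frequency channels
        c = dgt(f, 'gauss', a, M);          % Gabor coefficients, one column per frame

        % Display the coefficient magnitude in dB as a time-frequency image.
        imagesc(20*log10(abs(c) + eps));
        axis xy; xlabel('time frame'); ylabel('frequency channel');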

  3. The SCAR project - accidental thermal-hydraulics: from the simulation to the simulators

    International Nuclear Information System (INIS)

    Farvacque, M.; Faydide, B.; Parent, M.; Iffenecker, F.; Pentori, B.; Dumas, J.M.

    2000-01-01

    The integration of the CATHARE code in reactor simulators was completed in the early 1990s with the design of the SIPA1 and SIPA2 simulators. The SCAR project (Simulator CAthare Release), presented in this paper, is the continuation of this work. The objective is the adaptation of a reference CATHARE code version to the simulator environment, in order to achieve convergence between the safety analysis tool and the simulator. (A.L.B.)

  4. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, true signal-to-noise properties are mainly determined by the image quality of the projections. We are developing an analytical simulation platform describing projections, to investigate how the quantum-interaction physics of each component of a CT system affects image quality in the projections. This simulator will be very useful for the cost-effective design and optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform, with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The further work remaining before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build up energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented, with discussions on its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment.

  5. MBEToolbox: a Matlab toolbox for sequence data analysis in molecular biology and evolution

    Directory of Open Access Journals (Sweden)

    Xia Xuhua

    2005-03-01

    Background: MATLAB is a high-performance language for technical computing, integrating computation, visualization, and programming in an easy-to-use environment. It has been widely used in many areas, such as mathematics and computation, algorithm development, data acquisition, modeling, simulation, and scientific and engineering graphics. However, few functions are freely available in MATLAB to perform the sequence data analyses specifically required for molecular biology and evolution. Results: We have developed a MATLAB toolbox, called MBEToolbox, aimed at filling this gap by offering efficient implementations of the most needed functions in molecular biology and evolution. It can be used to manipulate aligned sequences, calculate evolutionary distances, estimate synonymous and nonsynonymous substitution rates, and infer phylogenetic trees. Moreover, it provides an extensible, functional framework for users with more specialized requirements to explore and analyze aligned nucleotide or protein sequences from an evolutionary perspective. The full functions in the toolbox are accessible through the command line for seasoned MATLAB users. A graphical user interface, which may be especially useful for non-specialist end users, is also provided. Conclusion: MBEToolbox is a useful tool that can aid in the exploration, interpretation and visualization of data in molecular biology and evolution. The software is publicly available at http://web.hku.hk/~jamescai/mbetoolbox/ and http://bioinformatics.org/project/?group_id=454.
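
    For a sense of the kind of computation such a toolbox wraps, the sketch below estimates a Jukes-Cantor evolutionary distance from two aligned sequences in plain MATLAB. It is a generic illustration, not MBEToolbox's own API, and the sequences are made up.

        % Jukes-Cantor (JC69) distance between two aligned DNA sequences.
        % Generic illustration only; MBEToolbox provides its own functions.
        s1 = 'ACGTACGTACGTACGTACGT';
        s2 = 'ACGTACGAACGTACGCACGT';

        p = mean(s1 ~= s2);                % proportion of differing sites
        if p < 0.75
            d = -0.75 * log(1 - 4*p/3);    % JC69 correction for multiple hits
            fprintf('p-distance %.3f, JC69 distance %.3f\n', p, d);
        else
            warning('Sequences too divergent for the JC69 correction.');
        end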

  6. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled safety analysis codes, recognized for DOE-wide safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: establish SQA criteria for the safety analysis 'toolbox' codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, a test report, a...

  7. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
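
    As a hedged illustration of how a simulation model can convey estimate uncertainty, the sketch below sums triangular effort distributions over a few hypothetical development phases and reports percentiles instead of a single number. The phases and values are invented; the model described in the paper is far richer.

        % Monte Carlo cost estimate that conveys uncertainty (illustrative only).
        rng(1);                                 % reproducible draws
        N = 1e5;                                % number of simulated projects
        % Triangular (min, mode, max) effort per phase, in person-months.
        phases = [ 10  14  25 ;                 % requirements
                   30  45  90 ;                 % implementation
                   15  20  50 ];                % test and integration

        total = zeros(N,1);
        for k = 1:size(phases,1)
            a = phases(k,1); c = phases(k,2); b = phases(k,3);
            u = rand(N,1);                      % inverse-CDF triangular sampling
            Fc = (c-a)/(b-a);
            x = a + sqrt(u*(b-a)*(c-a));                     % u <= Fc branch
            x(u > Fc) = b - sqrt((1-u(u > Fc))*(b-a)*(b-c)); % u  > Fc branch
            total = total + x;
        end
        s = sort(total);                        % toolbox-free percentiles
        fprintf('median %.0f PM, 10th-90th percentile %.0f-%.0f PM\n', ...
            s(round(0.5*N)), s(round(0.1*N)), s(round(0.9*N)));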

  8. Structural Time Domain Identification Toolbox User's Guide

    DEFF Research Database (Denmark)

    Andersen, P.; Kirkegaard, Poul Henning; Brincker, Rune

    This manual describes the Structural Time Domain Identification toolbox for use with MATLAB. This version of the toolbox has been developed using the PC-based MATLAB version 4.2c, but is compatible with prior versions of MATLAB and UNIX-based versions. The routines of the toolbox are the so...

  9. A web-based repository of surgical simulator projects.

    Science.gov (United States)

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  10. Implementation of knowledge management approach in project oriented activities. Experience from simulator upgrade project

    International Nuclear Information System (INIS)

    Pironkov, L.

    2010-01-01

    Project specifics: replacement of the analogue process control system with an Ovation®-based Distributed Control System; a hybrid solution with simulated I&C logic and stimulated Human-Machine Interface; initial design: a 'turn-key' project based on the standard relationship 'single customer - single contractor'...

  11. M3D project for simulation studies of plasmas

    International Nuclear Information System (INIS)

    Park, W.; Belova, E.V.; Fu, G.Y.; Sugiyama, L.E.

    1998-01-01

    The M3D (Multi-level 3D) project carries out simulation studies of plasmas in various regimes using multiple levels of physics, geometry, and mesh schemes in one code package. This paper, and the papers by Strauss, Sugiyama, and Belova in this workshop, describe the project and present examples of current applications. The currently available physics models of the M3D project are MHD, two-fluid, gyrokinetic hot particle/MHD hybrid, and gyrokinetic particle ion/two-fluid hybrid models. The code can be run with both structured and unstructured meshes.

  12. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management offers CIOs, CTOs, software development managers, and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk, and gain better insight into the complexities of the software development process. A novel methodology is introduced that allows a software development manager to better plan and assess risks early in the planning of a project. By providing a better model for early software development estimation and softw...

  13. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations: VQone MATLAB toolbox.

    Science.gov (United States)

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  14. Exploring International Investment through a Classroom Portfolio Simulation Project

    Science.gov (United States)

    Chen, Xiaoying; Yur-Austin, Jasmine

    2013-01-01

    A rapid integration of financial markets has prevailed during the last three decades. Investors are able to diversify investment beyond national markets to mitigate return volatility of a "pure domestic portfolio." This article discusses a simulation project through which students learn the role of international investment by managing…

  15. What's in Your Teaching Toolbox?

    Science.gov (United States)

    Mormer, Elaine

    2018-02-01

    Educators are faced with an array of tools available to enhance learning outcomes in the classroom and clinic. These tools range from those that are very simple to those that are sufficiently complex to require an investment in learning time. This article summarizes a collection of teaching tools, ordered by the time involved in learning proficient use. Simple tools described include specific online blogs providing support for faculty and student writing and a simple method to capture and download videos from YouTube for classroom use. More complex tools described include a Web-based application for custom-created animated videos and an interactive audience polling system. Readers are encouraged to reflect on the tools most appropriate for use in their own teaching toolbox by considering the requisite time to proficiency and suitability to address desired learner outcomes.

  16. NEMO: A Stellar Dynamics Toolbox

    Science.gov (United States)

    Barnes, Joshua; Hut, Piet; Teuben, Peter

    2010-10-01

    NEMO is an extendible Stellar Dynamics Toolbox, following an open-source software model. It has various programs to create, integrate, analyze, and visualize N-body and SPH-like systems, following the pipe-and-filter architecture. In addition, there are various tools to operate on images, tables, and orbits, including FITS files to export/import to/from other astronomical data reduction packages. A large and growing fraction of NEMO has been contributed by a growing list of authors. The source code consists of a little over 4,000 files and a little under 1,000,000 lines of code and documentation, mostly C, with some C++ and Fortran. NEMO development started in 1986 in Princeton (USA) by Barnes, Hut and Teuben. See also ZENO (ascl:1102.027) for the version that Barnes maintains.

  17. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
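
    The flavor of the simulation phase can be conveyed with a generic sketch: draw many possible outcomes for each candidate alternative and compare them on the probability of meeting the project goal. The alternatives, distributions, and $/W target below are invented for illustration and are not SIMRAND's own.

        % Generic simulation comparison of R&D alternatives (not SIMRAND itself).
        rng(2);
        N = 5e4;                          % trials per alternative
        % Hypothetical achieved cell price, $/W, lognormal per alternative.
        mu    = [log(0.55) log(0.48) log(0.60)];
        sigma = [0.25      0.35      0.15     ];
        goal  = 0.50;                     % target price to beat, $/W

        for k = 1:numel(mu)
            price = exp(mu(k) + sigma(k)*randn(N,1));   % simulated outcomes
            fprintf('Alternative %d: P(price <= %.2f $/W) = %.2f\n', ...
                k, goal, mean(price <= goal));
        end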

  18. ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes

    Science.gov (United States)

    Benveniste, J.; Ambrozio, A.; Restano, M.

    2016-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial is included, providing plenty of use cases. BRAT's next release (4.0.0) is planned for September 2016. Based on community feedback, the frontend has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving combinations of data fields, which can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as for solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE...

  19. The GeantV project: preparing the future of simulation

    International Nuclear Information System (INIS)

    Amadio, G; Bianchini, C; Iope, R L; Apostolakis, J; Brun, R; Carminati, F; Gheata, A; Nikitina, T; Novak, M; Pokorski, W; Shadura, O; Bandieramonte, M; Bhattacharyya, A; Mohanty, A; Seghal, R; Canal, Ph; Elvira, D; Lima, G; Duhem, L; De Fine Licht, J

    2015-01-01

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase, such as FCC, CLIC and ILC, as well as upgraded versions of the existing LHC detectors, will push the simulation requirements further. Since the increase in computing resources is not likely to keep pace with our needs, it is necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported to different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. A set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison. (paper)

  20. Drinking Water Cyanotoxin Risk Communication Toolbox

    Science.gov (United States)

    The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.

  1. Angra 1 nuclear power plant full scope simulator development project

    Energy Technology Data Exchange (ETDEWEB)

    Selvatici, Edmundo; Castanheira, Luiz Carlos C.; Silva Junior, Nilo Garcia da, E-mail: edsel@eletronuclear.gov.br, E-mail: lccast@eletronuclear.gov.br, E-mail: nilogar@eletronuclear.gov.br [Eletrobras Termonuclear S.A. (SCO/ELETRONUCLEAR), Angra dos Reis, RJ (Brazil). Superintendencia de Coordenacao da Operacao; Zazo, Francisco Javier Lopez; Ruiz, Jose Antonio, E-mail: jlopez@tecnatom.es, E-mail: jaruiz@tecnatom.es [Tecnatom S.A., San Sebastian de los Reyes, Madrid (Spain)

    2015-07-01

    Plant-specific full scope simulators are an essential tool for training NPP control room operators, in the initial formation phase as well as for maintaining their qualifications. In recent years, the availability of a plant-specific simulator has also become a regulatory requirement for nuclear power plant operation. By providing real-time practical training for the operators, the use of a simulator allows improving operator performance, reducing the number of unplanned shutdowns, and responding more effectively to abnormal and emergency operating conditions. It can also be used, among other uses, to validate procedures, test proposed plant modifications, perform engineering studies, and provide operation training for the technical support staff of the plant. The NPP site, in Angra dos Reis-RJ, Brazil, comprises two units in operation, Unit 1, 640 MWe, Westinghouse PWR, and Unit 2, 1350 MWe, KWU/Areva PWR, and one unit under construction, Unit 3, 1405 MWe, KWU/Areva PWR, of the same design as Angra 2. Angra 2 has had its full scope simulator from the beginning; however, this was not the case for Angra 1, which had to train its operators abroad due to the lack of a specific simulator. Eletronuclear participated in all phases of the project, from data supply to commissioning and validation. The Angra 1 full scope simulator encompasses more than 80 systems of the plant, including the primary system, reactor core and associated auxiliary systems, the secondary system and turbo-generator, as well as all the plant operational and safety I&C. The Angra 1 Main Control Room panels were reproduced in the simulator control room, as well as the remote shutdown panels that are outside the control room. This paper describes the project for the development of the Angra 1 NPP Full Scope Simulator, supplied by Tecnatom S.A., in the period from February 2012 to February 2015. (author)

  2. Angra 1 nuclear power plant full scope simulator development project

    International Nuclear Information System (INIS)

    Selvatici, Edmundo; Castanheira, Luiz Carlos C.; Silva Junior, Nilo Garcia da

    2015-01-01

    Plant-specific full scope simulators are an essential tool for training NPP control room operators, in the initial formation phase as well as for maintaining their qualifications. In recent years, the availability of a plant-specific simulator has also become a regulatory requirement for nuclear power plant operation. By providing real-time practical training for the operators, the use of a simulator allows improving operator performance, reducing the number of unplanned shutdowns, and responding more effectively to abnormal and emergency operating conditions. It can also be used, among other uses, to validate procedures, test proposed plant modifications, perform engineering studies, and provide operation training for the technical support staff of the plant. The NPP site, in Angra dos Reis-RJ, Brazil, comprises two units in operation, Unit 1, 640 MWe, Westinghouse PWR, and Unit 2, 1350 MWe, KWU/Areva PWR, and one unit under construction, Unit 3, 1405 MWe, KWU/Areva PWR, of the same design as Angra 2. Angra 2 has had its full scope simulator from the beginning; however, this was not the case for Angra 1, which had to train its operators abroad due to the lack of a specific simulator. Eletronuclear participated in all phases of the project, from data supply to commissioning and validation. The Angra 1 full scope simulator encompasses more than 80 systems of the plant, including the primary system, reactor core and associated auxiliary systems, the secondary system and turbo-generator, as well as all the plant operational and safety I&C. The Angra 1 Main Control Room panels were reproduced in the simulator control room, as well as the remote shutdown panels that are outside the control room. This paper describes the project for the development of the Angra 1 NPP Full Scope Simulator, supplied by Tecnatom S.A., in the period from February 2012 to February 2015. (author)

  3. Robotic inspection technology-process an toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Hermes, Markus [ROSEN Group (United States). R and D Dept.

    2005-07-01

    Pipeline deterioration grows progressively with the aging of pipeline systems (on-plot and cross-country). This includes both very localized corrosion and increasing failure probability due to fatigue cracking. Limiting regular inspection activities to the 'scrapable' part of the pipelines only will ultimately result in a pipeline system with questionable integrity. The confidence level in the integrity of these systems will drop below acceptance levels. Inspection of presently un-inspectable sections of the pipeline system becomes a must. This paper provides information on ROSEN's progress on the 'robotic inspection technology' project. The robotic inspection concept developed by ROSEN is based on a modular toolbox principle. This is mandatory: a universal 'all-purpose' robot would not be reliable and efficient in resolving the postulated inspection task. A preparatory Quality Function Deployment (QFD) analysis is performed prior to the decision about the adequate robotic solution. This enhances the serviceability and efficiency of the provided technology. The word 'robotic' can be understood in its full meaning of Recognition - Strategy - Motion - Control. Cooperation of different individual systems with established communication, e.g. utilizing Bluetooth technology, supports the robustness of the ROSEN robotic inspection approach. Besides the navigation strategy, the inspection strategy is also part of the QFD process. Multiple inspection technologies combined on a single carrier or distributed across interacting containers must be selected with a clear vision of the particular goal. (author)

  4. The AGORA High-resolution Galaxy Simulations Comparison Project

    OpenAIRE

    Kim Ji-hoon; Abel Tom; Agertz Oscar; Bryan Greg L.; Ceverino Daniel; Christensen Charlotte; Conroy Charlie; Dekel Avishai; Gnedin Nickolay Y.; Goldbaum Nathan J.; Guedes Javiera; Hahn Oliver; Hobbs Alexander; Hopkins Philip F.; Hummels Cameron B.

    2014-01-01

    The Astrophysical Journal Supplement Series 210.1 (2014): 14, reproduced by permission of the AAS. We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ∼100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies...

  5. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    Full Text Available The Large Atmospheric Computation on the Earth Simulator (LACES project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high resolution simulation of Hurricane Earl (1998. The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2 model is shown to parallelize effectively on the Japanese Earth Simulator (ES supercomputer; however, even using the extensive computing resources of the ES Center (ESC, the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  6. Activity modes selection for project crashing through deterministic simulation

    Directory of Open Access Journals (Sweden)

    Ashok Mohanty

    2011-12-01

    Purpose: The time-cost trade-off problem addressed by CPM-based analytical approaches assumes unlimited resources and the existence of a continuous time-cost function. However, given the discrete nature of most resources, activities can often be crashed only stepwise. Activity crashing for a discrete time-cost function is also known as the activity modes selection problem in project management. This problem is known to be NP-hard. Sophisticated optimization techniques such as Dynamic Programming, Integer Programming, Genetic Algorithms, and Ant Colony Optimization have been used to find efficient solutions to the activity modes selection problem. This paper presents a simple method that can provide efficient solutions to the activity modes selection problem for project crashing. Design/methodology/approach: A simulation-based method implemented on an electronic spreadsheet determines activity modes for project crashing. The method is illustrated with the help of an example. Findings: The paper shows that a simple approach based on a simple heuristic and deterministic simulation can give good results, comparable to sophisticated optimization techniques. Research limitations/implications: The simulation-based crashing method presented in this paper is developed to return satisfactory solutions, but not necessarily an optimal solution. Practical implications: The use of spreadsheets for solving Management Science and Operations Research problems makes the techniques more accessible to practitioners. Spreadsheets provide a natural interface for model building, are easy to use in terms of inputs, solutions and report generation, and allow users to perform what-if analysis. Originality/value: The paper presents the application of simulation implemented on a spreadsheet to determine efficient solutions to the discrete time-cost tradeoff problem.
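
    The deterministic-simulation idea translates directly into a small script: enumerate the discrete mode combinations, compute each combination's duration and cost, and keep the cheapest one that meets the deadline. The three-activity serial project below is invented for illustration and stands in for the paper's spreadsheet model.

        % Enumerate activity modes for project crashing (illustrative data).
        % Each activity has discrete modes: columns = [duration_days, cost_k$].
        modes = { [10 5; 8 9; 6 15],      % activity 1
                  [12 7; 9 12],           % activity 2
                  [ 7 4; 5 8; 4 14] };    % activity 3 (serial network assumed)
        deadline = 24;                    % days

        best = struct('cost', Inf, 'pick', []);
        nm = cellfun(@(m) size(m,1), modes);
        for idx = 0:prod(nm)-1            % walk every mode combination
            pick = zeros(1,numel(modes)); r = idx;
            for k = 1:numel(modes)        % mixed-radix decoding of idx
                pick(k) = mod(r, nm(k)) + 1;  r = floor(r / nm(k));
            end
            dur  = sum(arrayfun(@(k) modes{k}(pick(k),1), 1:numel(modes)));
            cost = sum(arrayfun(@(k) modes{k}(pick(k),2), 1:numel(modes)));
            if dur <= deadline && cost < best.cost
                best.cost = cost;  best.pick = pick;
            end
        end
        fprintf('Cheapest feasible modes: [%s], cost %g k$\n', ...
            num2str(best.pick), best.cost);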

  7. Trillo NPP full scope replica simulator project: The last great NPP simulation challenge in Spain

    International Nuclear Information System (INIS)

    Rivero, N.; Abascal, A.

    2006-01-01

    In the year 2000, Trillo NPP (a Spanish PWR-KWU design nuclear power plant) and Tecnatom came to an agreement to develop a Trillo plant-specific simulator, having as its scope all the plant systems operated either from the main control room or from the emergency panels. The simulator operation should be carried out both through a control room replica and a graphical user interface, the latter based on plant schematics and the softpanels concept. The Trillo simulator is to be primarily utilized as a pedagogical tool for training the Trillo operational staff. Because of the engineering grade of the mathematical models, it will also have additional uses, such as: - Operation engineering (POE validation, validation of new computerized operator support systems, etc.); - Emergency drills; - Plant design modification assessment. This project has become the largest simulation task Tecnatom has ever undertaken, being structured in three different subprojects, namely: - Simulator manufacture, - Simulator acceptance, and - Training material production. The most relevant technological innovations the project brings are: highest accuracy in the Nuclear Island models, an advanced configuration management system, an open software architecture, a new human-machine interface design, a latest-design I/O system, and an instructor station with extended functionality. The Trillo simulator 'Ready for Training' event is due in September 2003, with the Factory Acceptance Tests having started in autumn 2002. (author)

  8. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    Science.gov (United States)

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).
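
    A bare-bones flavor of a CFS trial loop in PsychToolbox-3 style is sketched below. It is not the toolbox's own code; the stereo mode, timings, and stimuli are placeholders (real CFS setups typically use a mirror stereoscope and carefully calibrated displays), and the Screen call arguments should be checked against the PTB-3 documentation.

        % Minimal CFS-style dichoptic loop in PTB-3 (placeholder parameters).
        screenid = max(Screen('Screens'));
        win = Screen('OpenWindow', screenid, 128, [], [], [], 4);
        % stereomode 4 = split-screen stereo (assumed display arrangement)
        target = Screen('MakeTexture', win, 128 + 50*rand(128));  % faint target

        for frame = 1:600                   % a few seconds of flashing mask
            Screen('SelectStereoDrawBuffer', win, 0);             % left eye
            mask = Screen('MakeTexture', win, 255*rand(128));     % fresh noise
            Screen('DrawTexture', win, mask);  % (Mondrians in real CFS)
            Screen('SelectStereoDrawBuffer', win, 1);             % right eye
            Screen('DrawTexture', win, target);
            Screen('Flip', win);
            Screen('Close', mask);          % free the per-frame mask texture
        end
        Screen('CloseAll');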

  9. 15 MW HArdware-in-the-loop Grid Simulation Project

    Energy Technology Data Exchange (ETDEWEB)

    Rigas, Nikolaos [Clemson Univ., SC (United States); Fox, John Curtiss [Clemson Univ., SC (United States); Collins, Randy [Clemson Univ., SC (United States); Tuten, James [Clemson Univ., SC (United States); Salem, Thomas [Clemson Univ., SC (United States); McKinney, Mark [Clemson Univ., SC (United States); Hadidi, Ramtin [Clemson Univ., SC (United States); Gislason, Benjamin [Clemson Univ., SC (United States); Boessneck, Eric [Clemson Univ., SC (United States); Leonard, Jesse [Clemson Univ., SC (United States)

    2014-10-31

    The goal of the 15 MW Hardware-in-the-loop (HIL) Grid Simulator project was to (1) design, (2) construct, and (3) commission a state-of-the-art grid integration testing facility for testing multi-megawatt devices through a 'shared facility' model open to all innovators, to promote the rapid introduction of new technology in the energy market and lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. The project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing, and several leading-edge research proposals dealing with smart grid technologies, grid modernization, and grid cyber security. The key components of the project are the power amplifier units, capable of providing up to 20 MW of defined power to the research grid. The project has also developed a one-of-a-kind solution for performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge with respect to developing a control system capable of communicating with several different pieces of equipment using different communication protocols in real time. The eGRID team developed a custom fiber-optic network that is based upon FPGA...

  10. Cementitious Barriers Partnership (CBP): Training and Release of CBP Toolbox Software, Version 1.0 - 13480

    Energy Technology Data Exchange (ETDEWEB)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S. [Vanderbilt University, School of Engineering, CRESP, Nashville, TN 37235 (United States); Flach, G.; Langton, C.; Smith, F.G. III; Burns, H. [Savannah River National Laboratory, Aiken, SC 29808 (United States); Van der Sloot, H. [Hans Van der Sloot Consultancy, Dorpsstraat 216, 1721BV Langedijk (Netherlands); Meeussen, J.C.L. [Nuclear Research and Consultancy Group, Westerduinweg 3, Petten (Netherlands); Samson, E. [SIMCO Technologies, Inc., Quebec (Canada); Mallick, P.; Suttora, L. [U.S. Department of Energy, Washington, DC (United States); Esh, D.; Fuhrmann, M.; Philip, J. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)

    2013-07-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the Office of Tank Waste Management within the Office of Environmental Management of the U.S. Department of Energy (US DOE). The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that improve understanding and predictions of the long-term hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program are intended to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1,000 years for waste management purposes. CBP software tools were made available to selected DOE Office of Environmental Management and field site users for training and evaluation based on a set of important degradation scenarios, including sulfate ingress/attack and carbonation of cementitious materials. The tools were presented at two-day training workshops held at the U.S. National Institute of Standards and Technology (NIST), Savannah River, and Hanford, and included LeachXS™/ORCHESTRA, STADIUM®, and a CBP-developed GoldSim Dashboard interface. Collectively, these components form the CBP Software Toolbox. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF) were also presented. The CBP Dashboard uses a custom dynamic-link library developed by CBP to couple to the LeachXS™/ORCHESTRA and STADIUM® codes to simulate reactive transport and degradation in cementitious materials for selected performance assessment scenarios. The first day of each workshop introduced participants to the software components via presentation materials, and the second day included hands-on tutorial exercises followed by discussions.

  11. Cementitious Barriers Partnership (CBP): Training and Release of CBP Toolbox Software, Version 1.0 - 13480

    International Nuclear Information System (INIS)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S.; Flach, G.; Langton, C.; Smith, F.G. III; Burns, H.; Van der Sloot, H.; Meeussen, J.C.L.; Samson, E.; Mallick, P.; Suttora, L.; Esh, D.; Fuhrmann, M.; Philip, J.

    2013-01-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the Office of Tank Waste Management within the Office of Environmental Management of the U.S. Department of Energy (US DOE). The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that improve understanding and predictions of the long-term hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program are intended to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1,000 years for waste management purposes. CBP software tools were made available to selected DOE Office of Environmental Management and field site users for training and evaluation based on a set of important degradation scenarios, including sulfate ingress/attack and carbonation of cementitious materials. The tools were presented at two-day training workshops held at the U.S. National Institute of Standards and Technology (NIST), Savannah River, and Hanford, and included LeachXS™/ORCHESTRA, STADIUM®, and a CBP-developed GoldSim Dashboard interface. Collectively, these components form the CBP Software Toolbox. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF) were also presented. The CBP Dashboard uses a custom dynamic-link library developed by CBP to couple to the LeachXS™/ORCHESTRA and STADIUM® codes to simulate reactive transport and degradation in cementitious materials for selected performance assessment scenarios. The first day of each workshop introduced participants to the software components via presentation materials, and the second day included hands-on tutorial exercises followed by discussions.

  12. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans, which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communications in Medicine) based toolbox, developed for the evaluation and verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore, the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.
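
    To give a concrete sense of plan import, the sketch below reads an RT Plan object with MATLAB's Image Processing Toolbox and lists its beams. It is a generic illustration, not the authors' toolbox; the file name is a placeholder, and field access follows the DICOM-RT standard attribute names (MATLAB exposes sequences as Item_N sub-structures).

        % Read a DICOM RT Plan and summarize its beams (file name is a placeholder).
        info = dicominfo('rtplan_example.dcm');   % requires Image Processing Toolbox
        assert(strcmp(info.Modality, 'RTPLAN'), 'Not an RT Plan object');

        beams = fieldnames(info.BeamSequence);    % Item_1, Item_2, ...
        for k = 1:numel(beams)
            b = info.BeamSequence.(beams{k});
            fprintf('Beam %d: %s, %s, %d control points\n', ...
                b.BeamNumber, b.BeamName, b.RadiationType, ...
                b.NumberOfControlPoints);
        end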

  13. Online model evaluation of large-eddy simulations covering Germany with a horizontal resolution of 156 m

    Science.gov (United States)

    Hansen, Akio; Ament, Felix; Lammert, Andrea

    2017-04-01

    Large-eddy simulations have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial and boundary conditions. Within the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project, realistic, weather-forecast-like LES simulations were performed for several days with the newly developed ICON LES model. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, so that a single 3D dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously for realistic model simulations. The toolbox automatically combines model results with observations and generates several quicklooks for various variables. So far, temperature and humidity profiles, cloud cover, integrated water vapour, precipitation, and many more are included. All kinds of observations, such as aircraft observations, soundings, and precipitation radar networks, are used. For each dataset a specific module is created, which allows for easy handling and enhancement of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool should support scientists in monitoring computationally costly model simulations as well as give a first overview of the model's performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, for which example results are shown.
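
    The core of such an online evaluation is simple: collocate model output with an observation time series and plot or print summary scores while the run is still in progress. The self-contained MATLAB sketch below uses entirely synthetic data and is only a generic stand-in for the toolbox's modules.

        % Generic model-vs-observation quicklook (synthetic data for illustration).
        t_obs = 0:3600:86400;                        % hourly obs times (s)
        obs   = 15 + 5*sin(2*pi*t_obs/86400) + randn(size(t_obs)); % 2 m temp (degC)
        t_mod = 0:900:86400;                         % 15-min model output
        mdl   = 15.5 + 5*sin(2*pi*t_mod/86400);      % model temperature

        mdl_at_obs = interp1(t_mod, mdl, t_obs);     % collocate model onto obs times
        bias = mean(mdl_at_obs - obs);
        rmse = sqrt(mean((mdl_at_obs - obs).^2));

        plot(t_obs/3600, obs, 'ko', t_mod/3600, mdl, 'b-');
        xlabel('hour'); ylabel('T_{2m} (\circC)');
        title(sprintf('bias %.2f K, RMSE %.2f K', bias, rmse));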

  14. THE AGORA HIGH-RESOLUTION GALAXY SIMULATIONS COMPARISON PROJECT

    International Nuclear Information System (INIS)

    Kim, Ji-hoon; Conroy, Charlie; Goldbaum, Nathan J.; Krumholz, Mark R.; Abel, Tom; Agertz, Oscar; Gnedin, Nickolay Y.; Kravtsov, Andrey V.; Bryan, Greg L.; Ceverino, Daniel; Christensen, Charlotte; Hummels, Cameron B.; Dekel, Avishai; Guedes, Javiera; Hahn, Oliver; Hobbs, Alexander; Hopkins, Philip F.; Iannuzzi, Francesca; Keres, Dusan; Klypin, Anatoly

    2014-01-01

    We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ∼100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies with halo masses M_vir ≅ 10^10, 10^11, 10^12, and 10^13 M_☉ at z = 0 and two different ('violent' and 'quiescent') assembly histories. The numerical techniques and implementations used in this project include the smoothed particle hydrodynamics codes GADGET and GASOLINE, and the adaptive mesh refinement codes ART, ENZO, and RAMSES. The codes share common initial conditions and common astrophysics packages including UV background, metal-dependent radiative cooling, metal and energy yields of supernovae, and stellar initial mass function. These are described in detail in the present paper. Subgrid star formation and feedback prescriptions will be tuned to provide a realistic interstellar and circumgalactic medium using a non-cosmological disk galaxy simulation. Cosmological runs will be systematically compared with each other using a common analysis toolkit and validated against observations to verify that the solutions are robust, i.e., that the astrophysical assumptions are responsible for any success, rather than artifacts of particular implementations. The goals of the AGORA project are, broadly speaking, to raise the realism and predictive power of galaxy simulations and the understanding of the feedback processes that regulate galaxy 'metabolism'. The initial conditions for the AGORA galaxies as well as simulation outputs at various epochs will be made publicly available to the community. The proof-of-concept dark-matter-only test of the formation of a galactic halo with a z = 0 mass of M...

  15. Summer Student Project: GEM Simulation and Gas Mixture Characterization

    CERN Document Server

    Oviedo Perhavec, Juan Felipe

    2013-01-01

    This project is a numerical simulation approach to Gas Electron Multiplier (GEM) detector design. GEMs are a type of gaseous ionization detector that has been proposed as an upgrade for the CMS muon endcap. The main advantages of this technology are high spatial and time resolution and outstanding aging resistance. In this context, the fundamental physical behavior of a Gas Electron Multiplier (GEM) is analyzed using ANSYS and Garfield++ software coupling. Essential electron transport properties for several gas mixtures were computed as a function of varying electric and magnetic fields using Garfield++ and Magboltz.

  16. Electroacoustical simulation of listening room acoustics for project ARCHIMEDES

    DEFF Research Database (Denmark)

    Bech, Søren

    1989-01-01

    ARCHIMEDES is a psychoacoustics research project, funded under the European EUREKA scheme. Three partners share the work involved: the Acoustics Laboratory of the Technical University of Denmark; Bang and Olufsen of Denmark; and KEF Electronics of England. Its primary objective is to quantify the influence of listening room acoustics on the timbre of reproduced sound. For simulation of the acoustics of a standard listening room, an electroacoustic setup has been built in an anechoic chamber. The setup is based on a computer model of the listening room, and it consists of a number of loudspeakers...

  17. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox

    Directory of Open Access Journals (Sweden)

    Andre Santos Ribeiro

    2015-07-01

    Aim: In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming, as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox was able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of reducing the time spent in data processing and allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods: The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity, and graph theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI), and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software such as FreeSurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional, and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter also using functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs, and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19-73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting state fMRI data, as well as 10 subjects with 18F-Altanserin PET data. Results: It was observed both a high inter...
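
    As a taste of the graph-theoretical step, the sketch below derives two simple metrics, node strength and connection density, from a connectivity matrix in plain MATLAB. It is a generic illustration of the kind of computation such toolboxes wrap, not MIBCA's interface, and the matrix is synthetic.

        % Node strength and density from a (symmetric) connectivity matrix.
        % Generic illustration; MIBCA wraps this kind of computation.
        rng(3);
        n = 8;
        C = rand(n); C = (C + C')/2;        % toy symmetric connectivity
        C(1:n+1:end) = 0;                   % no self-connections
        C(C < 0.4) = 0;                     % threshold weak edges

        strength = sum(C, 2);               % weighted degree per node
        density  = nnz(triu(C)) / (n*(n-1)/2);   % fraction of possible edges
        fprintf('density %.2f; strongest node: %d\n', density, ...
            find(strength == max(strength), 1));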

  18. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  19. A Total Factor Productivity Toolbox for MATLAB

    NARCIS (Netherlands)

    B.M. Balk (Bert); J. Barbero (Javier); J.L. Zofío (José)

    2018-01-01

    textabstractTotal Factor Productivity Toolbox is a new package for MATLAB that includes functions to calculate the main Total Factor Productivity (TFP) indices and their decompositions, based on Shephard’s distance functions and using Data Envelopment Analysis (DEA) programming techniques. The

  20. The RTMM Toolbox for DMM Applications

    DEFF Research Database (Denmark)

    Sharp, Robin; Todirica, Edward Alexandru

    2002-01-01

    This paper describes an approach to implementing distributed multimedia applications based on the use of a software toolbox. The tools in the box allow the designer to specify which components are to be used, how they are logically connected and what properties the streams of data to be passed...

  1. Accelerator Modeling with MATLAB Accelerator Toolbox

    International Nuclear Information System (INIS)

    2002-01-01

    This paper introduces the Accelerator Toolbox (AT), a collection of tools to model storage rings and beam transport lines in the MATLAB environment. The objective is to illustrate the flexibility and efficiency of the AT-MATLAB framework. The paper discusses three examples of problems that are analyzed frequently in connection with ring-based synchrotron light sources.

  2. Tolkku - a toolbox for decision support from condition monitoring data

    International Nuclear Information System (INIS)

    Saarela, Olli; Lehtonen, Mikko; Halme, Jari; Aikala, Antti; Raivio, Kimmo

    2012-01-01

    This paper describes a software toolbox (a software library) designed for condition monitoring and diagnosis of machines. The toolbox implements both new methods and prior art and is aimed at practical, down-to-earth data analysis work. The goal is to improve knowledge of the operation and behaviour of machines and processes throughout their entire life-cycles. The toolbox supports different phases of condition-based maintenance with tools that extract essential information and automate data processing. The paper discusses the principles that have guided the toolbox design and the implemented toolbox structure. Case examples illustrate how condition monitoring applications can be built using the toolbox. In the first case study the toolbox is applied to fault detection of industrial centrifuges based on measured electrical current. The second case study outlines an application for centralized monitoring of a fleet of machines that supports organizational learning.

  3. GUMICS4 Synthetic and Dynamic Simulations of the ECLAT Project

    Science.gov (United States)

    Facsko, G.; Palmroth, M. M.; Gordeev, E.; Hakkinen, L. V.; Honkonen, I. J.; Janhunen, P.; Sergeev, V. A.; Kauristie, K.; Milan, S. E.

    2012-12-01

    The European Commission funded the European Cluster Assimilation Techniques (ECLAT) project as a collaboration of five leading European universities and research institutes. A main contribution of the Finnish Meteorological Institute (FMI) is to provide a wide range of global MHD runs with the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS). The runs are divided into two categories: synthetic runs investigating the extent of solar wind drivers that can influence magnetospheric dynamics, and dynamic runs using measured solar wind data as input. Here we consider the first set of runs, with synthetic solar wind input. The solar wind density, velocity and interplanetary magnetic field had different magnitudes and orientations; furthermore, two F10.7 flux values were selected for solar radiation minimum and maximum. The solar wind parameter values were held constant so that a stable solution was achieved. All configurations were run several times with three different tilt angles (-15°, 0°, +15°) in the GSE X-Z plane. The Cray XT supercomputer of the FMI provides a unique opportunity in global magnetohydrodynamic simulation: running GUMICS-4 on one year of real solar wind data. Solar wind magnetic field, density, temperature and velocity data based on Advanced Composition Explorer (ACE) and WIND measurements are downloaded from the OMNIWeb open database and a special input file is created for each Cluster orbit. All data gaps are replaced with linear interpolations between the last and first valid data values before and after the gap. A minimum variance transformation is applied to the interplanetary magnetic field data to clean it and keep the code from diverging. The Cluster orbits are divided into slices, allowing parallel computation; each slice has an average tilt angle value. The file timestamps start one hour before perigee to provide time for building up a magnetosphere in the simulation space. The real...
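
    The gap-filling rule described above is simple enough to show in a short sketch. This is a toy example in Python (the values and variable names are invented); it reproduces the stated rule of interpolating linearly between the last and first valid values around each gap:

    ```python
    # Toy sketch of the described gap-filling step: linearly interpolate across
    # NaN gaps between the surrounding valid samples (values invented).
    import numpy as np

    v = np.array([400.0, 410.0, np.nan, np.nan, 430.0, 435.0])  # e.g. solar wind speed, km/s
    idx = np.arange(v.size)
    good = ~np.isnan(v)
    v_filled = np.interp(idx, idx[good], v[good])
    print(v_filled)   # [400. 410. 416.67 423.33 430. 435.]
    ```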

  4. Extreme climate in China. Facts, simulation and projection

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hui-Jun; Sun, Jian-Qi; Chen, Huo-Po; Zhu, Ya-Li; Zhang, Ying; Jiang, Da-Bang; Lang, Xian-Mei; Fan, Ke; Yu, En-Tao [Chinese Academy of Sciences, Beijing (China). Inst. of Atmospheric Physics; Yang, Song [NOAA Climate Prediction Center, Camp Springs, MD (United States)

    2012-06-15

    In this paper, studies on extreme climate in China, including extreme temperature and precipitation, dust weather activity, tropical cyclone activity, intense snowfall and cold surge activity, floods, and droughts, are reviewed based on peer-reviewed publications from recent decades. The review focuses first on the climatological features, variability, and trends of the past half century and then on simulations and projections based on global and regional climate models. As the annual mean surface air temperature (SAT) increased throughout China, heat wave intensity and frequency overall increased in the past half century, with a larger rate after the 1980s. The daily or yearly minimum SAT increased more significantly than the mean or maximum SAT. The long-term change in precipitation is predominantly characterized by the so-called southern flood and northern drought pattern in eastern China and by an overall increase over Northwest China. The interdecadal variation of the monsoon, represented by the monsoon weakening at the end of the 1970s, is largely responsible for this change in mean precipitation. Precipitation-related extreme events (e.g., heavy rainfall and intense snowfall) have generally become more frequent and intense over China in recent years, with pronounced spatial variability. Dust weather activity, however, has become less frequent over northern China in recent years, as a result of weakened cold surge activity, reinforced precipitation, and improved vegetation conditions. State-of-the-art climate models are capable of reproducing some features of the mean climate and extreme climate events. However, discrepancies among models in simulating and projecting the mean and extreme climate have also been demonstrated by many recent studies. Regional models with higher resolutions often perform better than global models. To predict and project climate variations and extremes, many new approaches and schemes based on dynamical models, statistical methods, or their...

  5. A fractured rock geophysical toolbox method selection tool

    Science.gov (United States)

    Day-Lewis, F. D.; Johnson, C.D.; Slater, L.D.; Robinson, J.L.; Williams, J.H.; Boyden, C.L.; Werkema, D.D.; Lane, J.W.

    2016-01-01

    Geophysical technologies have the potential to improve site characterization and monitoring in fractured rock, but the appropriate and effective application of geophysics at a particular site strongly depends on project goals (e.g., identifying discrete fractures) and site characteristics (e.g., lithology). No method works at every site or for every goal. New approaches are needed to identify a set of geophysical methods appropriate to specific project goals and site conditions while considering budget constraints. To this end, we present the Excel-based Fractured-Rock Geophysical Toolbox Method Selection Tool (FRGT-MST). We envision the FRGT-MST (1) equipping remediation professionals with a tool to understand what is likely to be realistic and cost-effective when contracting geophysical services, and (2) reducing applications of geophysics with unrealistic objectives or where methods are likely to fail.

  6. SCoT: a Python toolbox for EEG source connectivity.

    Science.gov (United States)

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
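
    The VAR-based connectivity idea at the heart of such toolboxes can be sketched in a few lines. The snippet below is not the SCoT API; it is a minimal, self-contained illustration (function names and toy data are ours) of fitting a VAR(p) model by least squares and reading a simple directed-connectivity measure off the model's transfer function:

    ```python
    # Minimal sketch (NOT the SCoT API): least-squares VAR fit and a directed
    # spectral measure derived from the model's transfer function.
    import numpy as np

    def fit_var(x, p):
        """Fit a VAR(p) model to x (channels x samples) by ordinary least squares."""
        m, n = x.shape
        y = x[:, p:]                                                  # targets
        z = np.vstack([x[:, p - k:n - k] for k in range(1, p + 1)])   # lagged predictors
        a, *_ = np.linalg.lstsq(z.T, y.T, rcond=None)
        return a.T.reshape(m, p, m)          # a[i, k, j]: lag-(k+1) effect of j on i

    def transfer_magnitude(a, f, fs=1.0):
        """|H(f)| of the fitted model; entry [i, j] reflects flow from j to i."""
        m, p, _ = a.shape
        af = np.eye(m, dtype=complex)
        for k in range(p):
            af -= a[:, k, :] * np.exp(-2j * np.pi * f * (k + 1) / fs)
        return np.abs(np.linalg.inv(af))

    x = np.random.randn(3, 1000)             # three toy source time courses
    print(transfer_magnitude(fit_var(x, p=5), f=0.1))
    ```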

  7. SCoT: A Python Toolbox for EEG Source Connectivity

    Directory of Open Access Journals (Sweden)

    Martin eBillinger

    2014-03-01

    Full Text Available Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.

  8. Start To End Simulation for the SPARX Project

    CERN Document Server

    Vaccarezza, Cristina; Boscolo, Manuela; Ferrario, Massimo; Fusco, Valeria; Giannessi, Luca; Migliorati, Mauro; Palumbo, Luigi; Quattromini, Marcello; Ronsivalle, Concetta; Serafini, Luca; Spataro, Bruno; Vescovi, Mario

    2005-01-01

    The first phase of the SPARX project, now funded by Government Agencies, is an R&D activity focused on developing techniques and critical components for future X-ray facilities. The aim is the generation of electron beams with the ultra-high peak brightness required to drive FEL experiments. The FEL source realization will develop along two lines: (a) the use of the SPARC high-brightness photoinjector to test RF compression techniques and the emittance degradation in magnetic compressors due to CSR, and (b) the production of radiation in the range of 3-5 nm, in both SASE and SEEDED FEL configurations, in the so-called SPARXINO test facility, upgrading the existing Frascati 800 MeV LINAC. In this paper we present and discuss the preliminary start-to-end simulation results.

  9. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    International Nuclear Information System (INIS)

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is ASPEN PLUS. This report contains details of the Aspen Plus program.

  10. Software development infrastructure for the HYBRID modeling and simulation project

    International Nuclear Information System (INIS)

    Epiney, Aaron S.; Kinoshita, Robert A.; Kim, Jong Suk; Rabiti, Cristian; Greenwood, M. Scott

    2016-01-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure tries to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety "research and development" software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as application development continues. Despite the low required quality level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python package called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list. This is a list to which everybody can send emails that will be received by the collective of the developers and managers.
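
    As a concrete illustration of the automated testing step, the short sketch below shows how a BuildingsPy regression run is typically driven from Python. It is a minimal sketch, not the project's actual test script; the tool choice and library path are assumptions, and the method names should be checked against the installed BuildingsPy version.

    ```python
    # Hedged sketch of a BuildingsPy regression run (method names follow the
    # BuildingsPy documentation; tool choice and library path are assumptions).
    from buildingspy.development.regressiontest import Tester

    ut = Tester(tool="dymola")    # HYBRID runs its tests with Dymola, per the text above
    ut.setLibraryRoot(".")        # root of the Modelica library under test
    ut.run()                      # simulate, compare to reference results, report
    ```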

  11. Software development infrastructure for the HYBRID modeling and simulation project

    Energy Technology Data Exchange (ETDEWEB)

    Epiney, Aaron S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Greenwood, M. Scott [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure tries to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety "research and development" software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as application development continues. Despite the low required quality level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python package called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list. This is a list to which everybody can send emails that will be received by the collective of the developers and managers.

  12. The ROC Toolbox: A toolbox for analyzing receiver-operating characteristics derived from confidence ratings.

    Science.gov (United States)

    Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P

    2017-08-01

    Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal detection models to ROC data obtained from confidence rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimal routines (e.g., maximum likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC toolbox allows for various different confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal-variance signal detection (UVSD) model, (2) the dual-process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the ROC toolbox plots summary information about the best-fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion of features that can be added to the toolbox.
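
    The core procedure such a toolbox automates is maximum-likelihood estimation of a signal detection model from confidence-rating counts. The sketch below (Python rather than the toolbox's MATLAB, with invented toy counts) fits the simplest equal-variance model; the UVSD, DPSD and MSD models generalize the same multinomial likelihood:

    ```python
    # Illustrative Python sketch (the toolbox itself is MATLAB): ML fit of an
    # equal-variance signal detection model to invented confidence-rating counts.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    old = np.array([200, 120, 80, 50, 30, 20])   # target counts, "sure old" ... "sure new"
    new = np.array([30, 50, 80, 110, 100, 130])  # lure counts, same ordering

    def category_probs(mu, cuts):
        """Response-category probabilities for Normal(mu, 1) evidence,
        highest-evidence category ("sure old") first."""
        cdf = norm.cdf(np.concatenate(([-np.inf], cuts, [np.inf])), loc=mu)
        return np.diff(cdf)[::-1]

    def nll(theta):
        d, cuts = theta[0], np.sort(theta[1:])   # sensitivity d' and ordered criteria
        return -(old @ np.log(category_probs(d, cuts) + 1e-12)
                 + new @ np.log(category_probs(0.0, cuts) + 1e-12))

    x0 = np.concatenate(([1.0], np.linspace(-1.0, 1.0, 5)))
    fit = minimize(nll, x0, method="Nelder-Mead")
    print("d' estimate:", round(fit.x[0], 3))
    ```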

  13. Hanford tank waste operation simulator operational waste volume projection verification and validation procedure

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

    The Hanford Tank Waste Operation Simulator is tested to determine whether it can replace the FORTRAN-based Operational Waste Volume Projection computer simulation that has traditionally served to project double-shell tank utilization. Three test cases are used to compare the results of the two simulators; one incorporates the cleanup schedule of the Tri-Party Agreement.

  14. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    Science.gov (United States)

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically in different application areas: the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymization of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
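
    The receive-correct-forward pattern described above can be shown compactly. The example below is not the (Java) toolbox itself; it is an illustrative Python sketch using pydicom, with invented file paths and tag values, of a single correction step in such a pipeline:

    ```python
    # Illustrative Python sketch of the receive-correct-forward idea using
    # pydicom (the toolbox itself is Java); paths and tag values are invented.
    import pydicom

    ds = pydicom.dcmread("incoming/image.dcm")   # 1) received DICOM object
    ds.PatientName = "PSEUDONYM^0042"            # 2) correction, e.g. pseudonymization
    ds.remove_private_tags()                     #    drop problematic private elements
    ds.save_as("outgoing/image.dcm")             # 3) forward to the predefined addressee
    ```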

  15. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    Science.gov (United States)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time series at stations, vertical profiles, surface fields or along-track satellite data, with a single program call. The validation toolbox, implemented in MATLAB, covers all parts of the validation process, ranging from read-in procedures for datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. the read-in procedures, forms a module in which all available functions of that particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks; new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned to each validation task in user-specific settings, which are stored externally in so-called namelists and gather all information about the datasets used as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. Hereby the performance of any new product...
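
    At its core, each such comparison reduces to a handful of statistical metrics computed over co-located model and observation values. A minimal sketch of that kernel (Python rather than the toolbox's MATLAB, with toy numbers):

    ```python
    # Toy kernel of a model-versus-observation comparison (invented numbers).
    import numpy as np

    def validation_metrics(model, obs):
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        err = model - obs
        return {
            "bias": err.mean(),                    # systematic offset
            "rmse": np.sqrt((err ** 2).mean()),    # overall error magnitude
            "corr": np.corrcoef(model, obs)[0, 1]  # agreement in variability
        }

    print(validation_metrics([10.2, 11.0, 9.8, 8.9], [10.0, 11.4, 9.5, 9.1]))
    ```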

  16. Feature selection toolbox software package

    Czech Academy of Sciences Publication Activity Database

    Pudil, Pavel; Novovičová, Jana; Somol, Petr

    2002-01-01

    Roč. 23, č. 4 (2002), s. 487-492 ISSN 0167-8655 R&D Projects: GA ČR GA402/01/0981 Institutional research plan: CEZ:AV0Z1075907 Keywords: pattern recognition * feature selection * floating search algorithms Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.409, year: 2002

  17. The laboratory test utilization management toolbox.

    Science.gov (United States)

    Baird, Geoffrey

    2014-01-01

    Efficiently managing laboratory test utilization requires both ensuring adequate utilization of needed tests in some patients and discouraging superfluous tests in other patients. After the difficult clinical decision is made to define the patients who do and do not need a test, a wealth of interventions is available to the clinician and laboratorian to help guide appropriate utilization. These interventions are collectively referred to here as the utilization management toolbox. Experience has shown that some tools in the toolbox are weak and others are strong, and that tools are most effective when many are used simultaneously. While the outcomes of utilization management studies are not always as concrete as may be desired, the data available in the literature indicate that strong utilization management interventions are safe and effective measures to improve patient health and reduce waste in an era of increasing financial pressure.

  18. CONTINEX: A Toolbox for Continuation in Experiments

    DEFF Research Database (Denmark)

    Schilder, Frank; Bureau, Emil; Santos, Ilmar

    2014-01-01

    CONTINEX is a MATLAB toolbox for bifurcation analysis based on the development platform COCO (computational continuation core). CONTINEX is specifically designed for coupling to experimental test specimens via DSPACE, but also provides interfaces to SIMULINK, ODE, and so-called equation-free models. The current version of the interface for experimental set-ups implements an algorithm for tuning control parameters, a robust noise-tolerant covering algorithm, and functions for monitoring (in)stability. In this talk we will report on experiments with an impact oscillator with magnetic actuators...

  19. Segmentation Toolbox for Tomographic Image Data

    DEFF Research Database (Denmark)

    Einarsdottir, Hildur

    Motivation: Image acquisition has vastly improved over the past years, introducing techniques such as X-ray computed tomography (CT). CT images provide the means to probe a sample non-invasively to investigate its inner structure. Given the wide usage of this technique and the massive data amounts it produces, techniques to automatically analyze such data become ever more important. Most segmentation methods for large datasets, such as CT images, deal with simple thresholding techniques, where intensity cut-off values are predetermined and hard-coded. For data where the intensity difference is not sufficient, and partial volume voxels occur frequently, thresholding methods do not suffice and more advanced methods are required. Contribution: To meet these requirements a toolbox has been developed, combining well-known methods within the image analysis field. The toolbox includes cluster-based methods...
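
    The contrast between a hard-coded threshold and a data-driven, cluster-based split can be shown in a few lines. The sketch below is illustrative only (Python/SciPy with synthetic 1D intensities, not the toolbox's code; it needs a recent SciPy for the seed argument):

    ```python
    # Toy contrast between a fixed threshold and a cluster-based split.
    import numpy as np
    from scipy.cluster.vq import kmeans2

    rng = np.random.default_rng(0)
    intensities = np.concatenate([rng.normal(60, 8, 500),    # background voxels
                                  rng.normal(140, 12, 500)]) # object voxels

    hard = intensities > 100                      # fixed cut-off: breaks if contrast shifts
    centroids, labels = kmeans2(intensities[:, None], 2, minit="++", seed=1)
    print(np.sort(centroids.ravel()))             # data-driven centres near 60 and 140
    ```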

  20. Insights into the European Years’ Communication Toolboxes

    Directory of Open Access Journals (Sweden)

    Camelia-Mihaela Cmeciu

    2012-08-01

    Full Text Available Since 1983 the European syntagm "unity in diversity" has been implemented in the European Years' communication campaigns. Dependent on subsidiarity and decentralization, European Years focus on a specific issue which constitutes the subject of a year-long awareness campaign. Beyond the involvement of Europe's citizens through their local, regional and national authorities in the implementation of the European Years' policies, there is a unity at the level of the visual communication of the EU through two important image-building elements: EY logos and communication toolboxes. The European Years' communication toolboxes can be considered signs of inclusion, since every organization is expected to customize the templates in the official campaign design of the European Year. The analysis will focus on the image-building elements of three European Years (2010, 2011, 2012). Having social semiotics as the qualitative research method and an analytical framework based on the distinction between design resources and representational resources, I will analyze the double layers of the high-intensity point of inclusion: (1) the European Years' branding process; (2) the visual deontic modality within the visual guidelines of the EY communication toolbox.

  1. HYDRORECESSION: A toolbox for streamflow recession analysis

    Science.gov (United States)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow studying the relationship between groundwater storage and baseflow and/or low flows at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface developed to analyse streamflow recession time series, with tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications of HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimates, catchment-scale recharge and low-flow modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
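
    Of the four models named above, the Maillet model is the simplest: a linear reservoir with Q(t) = Q0 * exp(-t/k), which the linear-regression fitting option can recover from log-transformed discharge. A toy sketch of that fit (Python with invented data; not the toolbox's code):

    ```python
    # Toy fit of the Maillet (linear reservoir) recession model
    # Q(t) = Q0 * exp(-t/k) via linear regression on log Q.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(10.0)                                           # days since recession onset
    q = 12.0 * np.exp(-t / 4.0) * rng.uniform(0.97, 1.03, t.size) # toy discharge, m^3/s

    slope, intercept = np.polyfit(t, np.log(q), 1)
    k, q0 = -1.0 / slope, np.exp(intercept)
    print(f"storage constant k = {k:.2f} days, Q0 = {q0:.2f} m^3/s")
    ```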

  2. Neuro-QOL and the NIH Toolbox: implications for epilepsy

    Science.gov (United States)

    Nowinski, Cindy J; Victorson, David; Cavazos, Jose E; Gershon, Richard; Cella, David

    2011-01-01

    The impact of neurological disorders on the lives of patients is often far more complex than what is measured in routine examination. Measurement of this impact can be challenging owing to a lack of brief, psychometrically sound and generally accepted instruments. Two NIH-funded initiatives are developing assessment tools, in English and Spanish, which address these issues, and should prove useful to the study and treatment of epilepsy and other neurological conditions. The first, Neuro-QOL, has created a set of health-related quality of life measures that are applicable for people with common neurological disorders. The second, the NIH Toolbox for the Assessment of Neurological and Behavioral Function, is assembling measures of cognitive, emotional, motor and sensory health and function that can be used across all ages, from 3 to 85 years. This article describes both the projects and their potential value to epilepsy treatment and research. PMID:21552344

  3. Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom

    Science.gov (United States)

    Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.

    2013-01-01

    As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…

  4. Brief introduction to project management of full scope simulator for Qinshan 300 MW Nuclear Power Unit

    International Nuclear Information System (INIS)

    Chen Jie

    1996-01-01

    The key points in the development and engineering project management of the full scope simulator for the Qinshan 300 MW Nuclear Power Unit are briefly introduced. The Gantt chart and some project management methods and experience are presented. The analysis of key points along the project procedure will be useful for similar projects.

  5. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) Users' Workshop Presentations

    Science.gov (United States)

    Litt, Jonathan S. (Compiler)

    2018-01-01

    NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.

  6. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    Science.gov (United States)

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
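
    Among the diffusion metrics PANDA tabulates, fractional anisotropy has a closed form in the diffusion tensor's eigenvalues: FA = sqrt(3/2) * sqrt(sum_i (lambda_i - lambda_mean)^2) / sqrt(sum_i lambda_i^2). A minimal sketch of that computation (Python with typical toy eigenvalues; PANDA itself delegates this step to the pipelined packages):

    ```python
    # Standard fractional anisotropy (FA) and mean diffusivity (MD) from
    # diffusion tensor eigenvalues (toy white-matter-like values, mm^2/s).
    import numpy as np

    def fa_md(l1, l2, l3):
        lam = np.array([l1, l2, l3], float)
        md = lam.mean()                                          # mean diffusivity
        fa = np.sqrt(1.5 * ((lam - md) ** 2).sum() / (lam ** 2).sum())
        return fa, md

    print(fa_md(1.7e-3, 0.3e-3, 0.2e-3))
    ```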

  7. PANDA: a pipeline toolbox for analyzing brain diffusion images

    Directory of Open Access Journals (Sweden)

    Zaixu eCui

    2013-02-01

    Full Text Available Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named Pipeline for Analyzing braiN Diffusion imAges (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics (e.g., FA and MD) that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.

  8. GISMO: A MATLAB toolbox for seismic research, monitoring, & education

    Science.gov (United States)

    Thompson, G.; Reyes, C. G.; Kempler, L. A.

    2017-12-01

    GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g. FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.). It can handle waveform data that cross file boundaries. All of this alleviates one of the most time-consuming parts of scientists developing their own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built into GISMO, such as record section plots, spectrograms, depth-time sections, event counts per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere. As an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g. web-based spectrograms), which has been used by the Alaska Volcano Observatory since 1998 and became the prototype for the USGS...
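
    Since GISMO is converging with ObsPy, the kind of FDSN retrieval it wraps can be shown in Python. A small sketch (the station codes and time window are illustrative, and the run requires network access to the IRIS FDSN service):

    ```python
    # Python/ObsPy sketch of the FDSN waveform retrieval GISMO wraps in MATLAB.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")
    t0 = UTCDateTime("2017-01-01T00:00:00")
    st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 600)
    st.plot()   # quick-look plot, analogous to GISMO's built-in waveform plots
    ```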

  9. Development of an Ontology-Directed Signal Processing Toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Stephen W. Lang

    2011-05-27

    This project was focused on the development of tools for the automatic configuration of signal processing systems. The goal is to develop tools that will be useful in a variety of Government and commercial areas and usable by people who are not signal processing experts. In order to get the most benefit from signal processing techniques, deep technical expertise is often required to select appropriate algorithms, combine them into a processing chain, and tune algorithm parameters for best performance on a specific problem. A significant benefit would therefore result from the assembly of a toolbox of processing algorithms selected for their effectiveness in a group of related problem areas, along with the means to allow people who are not signal processing experts to reliably select, combine, and tune these algorithms to solve specific problems. Defining a vocabulary for problem domain experts that is sufficiently expressive to drive the configuration of signal processing functions will allow the expertise of signal processing experts to be captured in rules for automated configuration. In order to test the feasibility of this approach, we addressed a lightning classification problem, which was proposed by DOE as a surrogate for problems encountered in nuclear nonproliferation data processing. We coded a toolbox of low-level signal processing algorithms for extracting features of RF waveforms, and demonstrated a prototype tool for screening data. We showed examples of using the tool for expediting the generation of ground-truth metadata, for training a signal recognizer, and for searching for signals with particular characteristics. The public benefits of this approach, if successful, will accrue to Government and commercial activities that face the same general problem - the development of sensor systems for complex environments. It will enable problem domain experts (e.g. analysts) to construct signal and image processing chains without...
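
    Feature extraction of the kind mentioned above typically reduces each waveform to a small vector of descriptors. The sketch below is our own toy illustration (Python; the particular features, total energy and spectral centroid, are our choice, not necessarily the project's):

    ```python
    # Toy waveform descriptors: total energy and spectral centroid of an RF trace.
    import numpy as np

    def waveform_features(x, fs):
        spec = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
        return {"energy": float(np.sum(x ** 2)),
                "spectral_centroid_hz": float(np.sum(freqs * spec) / np.sum(spec))}

    fs = 1e6                                    # 1 MHz sampling (toy value)
    t = np.arange(0, 1e-3, 1 / fs)
    print(waveform_features(np.sin(2 * np.pi * 50e3 * t), fs))  # centroid ~ 50 kHz
    ```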

  10. The Matlab Radial Basis Function Toolbox

    Directory of Open Access Journals (Sweden)

    Scott A. Sarra

    2017-03-01

    Full Text Available Radial Basis Function (RBF) methods are important tools for scattered data interpolation and for the solution of partial differential equations in complexly shaped domains. The most straightforward approach used to evaluate the methods involves solving a linear system which is typically poorly conditioned. The Matlab Radial Basis Function toolbox features a regularization method for the ill-conditioned system, extended-precision floating point arithmetic, and symmetry exploitation for the purpose of reducing the flop counts of the associated numerical linear algebra algorithms.
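
    The regularization idea is easy to state: replace the ill-conditioned system B w = y with (B + mu*I) w = y for a small mu. A minimal sketch (Python/NumPy rather than the MATLAB toolbox, with a Gaussian kernel and toy 1D data):

    ```python
    # NumPy sketch of regularized Gaussian RBF interpolation: solve
    # (B + mu*I) w = y instead of the ill-conditioned B w = y.
    import numpy as np

    def rbf_fit(xc, y, shape, mu=1e-10):
        b = np.exp(-(shape * np.abs(xc[:, None] - xc[None, :])) ** 2)
        return np.linalg.solve(b + mu * np.eye(xc.size), y)   # regularized weights

    def rbf_eval(x, xc, w, shape):
        b = np.exp(-(shape * np.abs(x[:, None] - xc[None, :])) ** 2)
        return b @ w

    xc = np.linspace(0.0, 1.0, 30)                        # centres = data sites
    w = rbf_fit(xc, np.sin(2 * np.pi * xc), shape=3.0)
    print(rbf_eval(np.array([0.25]), xc, w, shape=3.0))   # ~ sin(pi/2) = 1
    ```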

  11. Two Mechatronic Projects - an Agricultural Robot and a Flight Simulator Platform

    DEFF Research Database (Denmark)

    Sørensen, Torben; Fan, Zhun; Conrad, Finn

    2005-01-01

    and build in a one year Masters Thesis Project by two M.Sc. students (2400 hours in total). • The development of a portable platform for flight simulation has been initiated in a Mid-term project by two students (720 hours in total). In both of these projects the students started from scratch...

  12. Rad Toolbox User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Eckerman, Keith F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sjoreen, Andrea L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2013-05-01

    The Radiological Toolbox software developed by Oak Ridge National Laboratory (ORNL) for the U.S. Nuclear Regulatory Commission (NRC) is designed to provide electronic access to the vast and varied data that underlie the field of radiation protection. These data represent physical, chemical, anatomical, physiological, and mathematical parameters detailed in the various handbooks that a health physicist might consult while in his office. The initial motivation for the software was to serve the needs of health physicists away from the office and without access to their handbooks, e.g., NRC inspectors. The earlier releases of the software were widely used and accepted around the world, not only by practicing health physicists but also within educational programs. This release updates the software to accommodate changes in Windows operating systems and, in some aspects, radiation protection. It has been tested on Windows 7 and 8 and on 32- and 64-bit machines. The nuclear decay data have been updated, and thermal neutron capture cross sections and cancer risk coefficients have been included. This document and the software's user's guide provide further details and documentation of the information captured within the Radiological Toolbox.

  13. MTpy: A Python toolbox for magnetotellurics

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared R.

    2014-11-01

    We present the software package MTpy, which allows handling, processing, and imaging of magnetotelluric (MT) data sets. Written in Python, the code is open source, containing sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides the independent definition of classes and functions, MTpy provides wrappers and convenience scripts to call standard external data processing and modelling software. In its current state, modules and functions of MTpy work on raw and pre-processed MT data. However, rather than providing a static compilation of software, we prefer to introduce MTpy as a flexible software toolbox, whose contents can be combined and utilised according to the respective needs of the user. Just as the overall functionality of a mechanical toolbox can be extended by adding new tools, MTpy is a flexible framework which will be dynamically extended in the future. Furthermore, it can help to unify and extend existing codes and algorithms within the (academic) MT community. In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday workflow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a phase tensor pseudosection.
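
    A representative derived quantity in such a workflow is the apparent resistivity obtained from an impedance tensor element, rho_a = |Z|^2 / (omega * mu_0). A toy sketch of that standard conversion (plain Python, not an MTpy call; the impedance value is invented):

    ```python
    # Standard magnetotelluric conversion rho_a = |Z|^2 / (omega * mu_0).
    import numpy as np

    MU0 = 4e-7 * np.pi                     # vacuum permeability, H/m

    def apparent_resistivity(z, freq):
        """z: impedance element in ohm (E-field over H-field), freq in Hz."""
        return abs(z) ** 2 / (2 * np.pi * freq * MU0)

    print(apparent_resistivity(z=0.05 + 0.05j, freq=1.0), "ohm-m")
    ```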

  14. Distributed Aerodynamic Sensing and Processing Toolbox

    Science.gov (United States)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/ unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  15. IMPROVING PROJECT SCHEDULE ESTIMATES USING HISTORICAL DATA AND SIMULATION

    Directory of Open Access Journals (Sweden)

    P.H. Meyer

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Many projects are not completed on time or within the original budget. This is caused by uncertainty in project variables as well as the occurrence of risk events. A study was done to determine ways of measuring the risk in development projects executed by a mining company in South Africa. The main objective of the study was to determine whether historical project data would provide a more accurate means of estimating the total project duration. Original estimates and actual completion times for tasks of a number of projects were analysed and compared. The results of the study indicated that a more accurate total duration for a project could be obtained by making use of historical project data. The accuracy of estimates could be improved further by building a comprehensive project schedule database within a specific industry.

    AFRIKAANSE OPSOMMING: Many projects are not completed within the original schedule or budget. This is often caused by uncertainty about project variables and the occurrence of risks. A study was done to develop a method for measuring risk in development projects of a mining company in South Africa. The main goal of the study was to determine whether historical project data could be used to estimate a more accurate duration for a project. The estimated durations of tasks for a number of projects were analysed and compared with the actual durations. The results of the study showed that a more accurate total duration for the project could be obtained by making use of historical project data. The accuracy can be further improved by developing and maintaining a database of project schedules for a particular industry.
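
    The simulation side of this approach is naturally expressed as a Monte Carlo resampling of historical durations. A toy sketch (Python; the task names, durations, and the serial task chain are invented for illustration):

    ```python
    # Toy Monte Carlo schedule estimate: resample each task's historical
    # durations and accumulate a serial task chain (all data invented).
    import numpy as np

    rng = np.random.default_rng(0)
    history = {                      # observed past durations per task, in days
        "design": [10, 14, 12, 18],
        "procure": [30, 25, 40, 35],
        "build": [20, 22, 35, 28],
    }

    totals = sum(rng.choice(durations, size=10_000) for durations in history.values())
    print("P50:", np.percentile(totals, 50), "days;  P80:", np.percentile(totals, 80), "days")
    ```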

  16. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and OVERFLOW-2 for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses in any modern computer room.

  17. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    Science.gov (United States)

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  18. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    Science.gov (United States)

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects, including falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB), has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment, determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  19. Vision and Displays for Military and Security Applications The Advanced Deployable Day/Night Simulation Project

    CERN Document Server

    Niall, Keith K

    2010-01-01

    Vision and Displays for Military and Security Applications presents recent advances in projection technologies and associated simulation technologies for military and security applications. Specifically, this book covers night vision simulation, semi-automated methods in photogrammetry, and the development and evaluation of high-resolution laser projection technologies for simulation. Topics covered include: advances in high-resolution projection, advances in image generation, geographic modeling, and LIDAR imaging, as well as human factors research for daylight simulation and for night vision devices. This title is ideal for optical engineers, simulator users and manufacturers, geomatics specialists, human factors researchers, and for engineers working with high-resolution display systems. It describes leading-edge methods for human factors research, and it describes the manufacture and evaluation of ultra-high resolution displays to provide unprecedented pixel density in visual simulation.

  20. Project management in the library workplace

    CERN Document Server

    Daugherty, Alice

    2018-01-01

    This volume of Advances in Library Administration and Organization attempts to put project management into the toolboxes of library administrators through overviews of concepts, analyses of experiences, and forecasts for the use of project management within the profession.

  1. A Student Project to use Geant4 Simulations for a TMS-PET combination

    Science.gov (United States)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Rueda, A.; Solano Salinas, C. J.; Wahl, D.; Zamudio, A.

    2007-10-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  2. A Student Project to use Geant4 Simulations for a TMS-PET combination

    International Nuclear Information System (INIS)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-01-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  3. A DOE Computer Code Toolbox: Issues and Opportunities

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2001-01-01

    The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox shall be a DOE Complex repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address Software Quality Assurance issues. Toolbox candidate codes have been identified through review of a DOE survey of software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Administration/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues exist, such as resource allocation and the interface among code developers, code users, and toolbox maintainers, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications.

  4. Numerical simulation of gasket behaviour during severe accidents (ATHERMIP project)

    International Nuclear Information System (INIS)

    Castro Lopez, Fernando; Orden Martinez, Alfredo

    1998-01-01

    This paper summarises the work carried out to numerically simulate the thermo-mechanical behaviour of sealing gaskets in large containment penetrations during a severe accident. The gasket material is an elastomer, and its thermo-mechanical characterization was based on experimentation. The difficulty of the numerical simulation lies in the high non-linearity of the analysis, due, on the one hand, to the high strain levels reached and, on the other, to the stiffness changes introduced by contact/take-off conditions. Also, the stiffness parameters of the gasket material are not constant, but change with both the strain level and the environmental conditions (temperature, radiation). The results obtained allow a calculation model to be presented that is capable of simulating and explaining the behaviour of the sealing gasket during a severe accident. The numerically obtained failure hypothesis was also validated under environmental conditions. (author)

  5. Utilization of the simulators in I and C renewal project of Loviisa NPP

    International Nuclear Information System (INIS)

    Porkholm, K.; Ahonen, A.; Tiihonen, O.

    2006-01-01

    There are two VVER-440 type reactors at the Loviisa Nuclear Power Plant. The first unit has been in operation since 1977 and the second since 1980. The availability of the plant, as well as the operational experience with the I and C systems, is good. However, it is obvious that the lifetime of the original I and C systems is not sufficient to guarantee the good availability of the plant in the future. For this reason, a project for the renewal of the existing I and C systems has been started at the Loviisa Nuclear Power Plant. In the project, the analogue I and C systems will be replaced by digital I and C systems in four phases during 2005-2014. Simulators will be utilized extensively in the project to ensure that the renewal of the I and C systems can be realized safely and economically. An engineering simulator will be used in the design and validation of the modifications of the renewed I and C systems. A development simulator is intended for the design, testing and acceptance of the new man-machine interface. A testing simulator will be used for testing the new I and C systems and retuning the controllers, mainly during the Factory Acceptance Tests. A training simulator will be used to train the operators and the other technical personnel in the operation of the new monitor-based control room facilities. All the simulators in the renewal project are based on the APROS (Advanced PROcess Simulator) simulation software, which Fortum Nuclear Services Ltd and the Technical Research Centre of Finland have been developing since 1986. APROS is a good example of truly multifunctional simulation software: it can be used in process and automation design, safety analysis and training simulator applications. APROS has been used extensively for various analysis and simulation tasks of the Loviisa Nuclear Power Plant in past years. It has also been applied to various nuclear and thermal power plants elsewhere. First a short overview of Loviisa Nuclear Power

  6. ObsPy - A Python Toolbox for Seismology - and Applications

    Science.gov (United States)

    Krischer, L.; Megies, T.; Barsch, R.; MacCarthy, J.; Lecocq, T.; Koymans, M. R.; Carothers, L.; Eulenfeld, T.; Reyes, C. G.; Falco, N.; Sales de Andrade, E.

    2017-12-01

    Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, including waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, SC3ML, …) and event meta information (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality like travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. The newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, …); and support for reading ArcLink Inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy, point out several representative or new use cases, and showcase some projects that are based on ObsPy, e.g.: seismo
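
    As an illustration of the toolbox described above, here is a minimal sketch of a typical ObsPy workflow; the FDSN data center, station codes, and time window are arbitrary examples, not taken from the abstract.

```python
# Fetch an hour of waveform data from an FDSN data center, remove the
# instrument response, bandpass-filter, and plot. Station/channel codes
# and times are placeholders for illustration only.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")                      # any FDSN-compliant data center
t0 = UTCDateTime("2017-01-01T00:00:00")
st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600,
                          attach_response=True)
st.remove_response(output="VEL")             # instrument correction
st.filter("bandpass", freqmin=0.05, freqmax=1.0)
print(st)                                    # one Trace per channel/segment
st.plot()
```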

  7. MMM: A toolbox for integrative structure modeling.

    Science.gov (United States)

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose, with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids, and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM integrates not only various types of restraints but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  8. Expanding the UniFrac Toolbox.

    Directory of Open Access Journals (Sweden)

    Ruth G Wong

    The UniFrac distance metric is often used to separate groups in microbiome analysis, but requires a constant sequencing depth to work properly. Here we demonstrate that unweighted UniFrac is highly sensitive to rarefaction instance and to sequencing depth in uniform data sets with no clear structure or separation between groups. We show that this arises because of subcompositional effects. We introduce information UniFrac and ratio UniFrac, two new weightings that are not as sensitive to rarefaction and allow greater separation of outliers than classic unweighted and weighted UniFrac. With this expansion of the UniFrac toolbox, we hope to empower researchers to extract more varied information from their data.
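
    As a rough illustration of the baseline computation the paper builds on (classic unweighted UniFrac, not the authors' new weightings), here is a minimal sketch using scikit-bio; the toy count table, sample IDs, and tree are invented, and newer scikit-bio releases may name some arguments differently.

```python
# Unweighted UniFrac distances between three toy samples over four OTUs.
import numpy as np
from io import StringIO
from skbio import TreeNode
from skbio.diversity import beta_diversity

counts = np.array([[10, 0, 5, 3],        # rows: samples, cols: OTUs
                   [2,  8, 0, 6],
                   [0,  4, 4, 4]])
ids = ["S1", "S2", "S3"]
otu_ids = ["O1", "O2", "O3", "O4"]
tree = TreeNode.read(StringIO("((O1:0.1,O2:0.2):0.3,(O3:0.1,O4:0.4):0.2);"))

dm = beta_diversity("unweighted_unifrac", counts, ids=ids,
                    tree=tree, otu_ids=otu_ids)
print(dm)   # 3x3 DistanceMatrix; re-run after rarefying counts to see the effect
```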

  9. Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox

    Science.gov (United States)

    Karimpour, Arash; Chen, Qin

    2017-09-01

    There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths and weaknesses. A proper implementation of these methods requires a series of procedures, including pretreatment of the raw measurements and adjustment and refinement of the processed data, to provide quality assurance of the outcomes; otherwise it can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes, and suggests practical solutions to avoid and minimize errors in the wave results. The procedures for converting water pressure data into water surface elevation data, treating high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from the weighted integral of the wave power spectrum are described. Conversion and recovery of data acquired by a pressure transducer, particularly in depth-limited water like estuaries and lakes, are explained in detail. To provide researchers with tools for a reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimation of wave properties in the time and frequency domains. The toolbox has been developed and examined during a number of field study projects in Louisiana's estuaries.
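
    As a generic sketch of the spectral quantities the abstract mentions (not OCEANLYZ itself, which is a MATLAB toolbox), the following Python snippet estimates the significant wave height from the zeroth spectral moment and a weighted-integral peak frequency from a synthetic surface-elevation series; all parameter values are invented.

```python
# Spectral wave parameters from a surface-elevation time series:
# Hm0 = 4*sqrt(m0) and a weighted-integral peak frequency, which is
# more stable than taking the argmax of a noisy spectrum.
import numpy as np
from scipy.signal import welch

fs = 4.0                                    # sampling frequency, Hz
t = np.arange(0, 1200, 1 / fs)
eta = np.sin(2 * np.pi * 0.1 * t) + 0.3 * np.random.randn(t.size)

f, S = welch(eta, fs=fs, nperseg=1024)      # wave power spectral density
m0 = np.trapz(S, f)                         # zeroth spectral moment
Hm0 = 4.0 * np.sqrt(m0)                     # significant wave height
q = 5                                       # weighting exponent (illustrative)
fp = np.trapz(f * S**q, f) / np.trapz(S**q, f)   # weighted peak frequency
print(f"Hm0 = {Hm0:.2f} m, fp = {fp:.3f} Hz")
```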

  10. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    Science.gov (United States)

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
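
    As a minimal sketch of one of the core processing steps named above (vector rotation/projection of moving-vessel ADCP velocities into cross-section coordinates), the following snippet uses one possible rotation convention; the velocities and section azimuth are invented for illustration and this is not VMT code.

```python
# Rotate depth-averaged east/north ADCP velocities into streamwise and
# transverse components relative to a mean cross-section azimuth.
import numpy as np

east = np.array([0.6, 0.7, 0.5])     # m/s, one value per ensemble
north = np.array([0.2, 0.1, 0.3])
theta = np.deg2rad(20.0)             # cross-section orientation from north

u_stream = east * np.sin(theta) + north * np.cos(theta)   # along-section
u_trans = east * np.cos(theta) - north * np.sin(theta)    # across-section
print(u_stream, u_trans)
```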
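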

  11. Project Evaluation and Cash Flow Forecasting by Stochastic Simulation

    Directory of Open Access Journals (Sweden)

    Odd A. Asbjørnsen

    1983-10-01

    The net present value of a discounted cash flow is used to evaluate projects. It is shown that the Laplace transform of the cash flow time function is particularly useful when the cash flow profiles may be approximately described by ordinary linear differential equations in time. However, real cash flows are stochastic variables due to the stochastic nature of the disturbances during production.
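
    A worked illustration of the central observation may help: for a continuous cash flow c(t) discounted at rate r, the NPV equals the Laplace transform of c(t) evaluated at s = r. The sketch below checks this numerically for an exponentially declining cash flow, whose transform is known in closed form; the numbers are invented.

```python
# NPV of c(t) = c0*exp(-a*t) at discount rate r equals L{c}(s)|_{s=r} = c0/(r+a).
import numpy as np

c0, a, r = 100.0, 0.15, 0.08
t = np.linspace(0, 200, 200001)
npv_numeric = np.trapz(c0 * np.exp(-a * t) * np.exp(-r * t), t)
npv_laplace = c0 / (r + a)            # closed-form Laplace transform at s = r
print(npv_numeric, npv_laplace)       # both ~434.78
```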

  12. Community Project for Accelerator Science and Simulation (ComPASS)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, Christopher [Univ. of Texas, Austin, TX (United States); Carey, Varis [Univ. of Texas, Austin, TX (United States)

    2016-10-12

    After concluding our initial exercise of coupling Vorpal and our parallel statistical library QUESO (solving a simplified statistical inverse problem with laser intensity as the unknown parameter), we shifted the application focus to DLA. Our efforts focused on developing a Gaussian process (GP) emulator within QUESO for efficient optimization of power couplers within woodpiles. The smaller simulation size (compared with LPA) allows for sufficient “training runs” to develop a reasonable GP statistical emulator for a parameter space of moderate dimension.
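
    To illustrate the emulator idea in generic form (using scikit-learn rather than QUESO), the sketch below fits a Gaussian process to a handful of expensive "training runs" and queries it cheaply during optimization; the objective function is an invented stand-in for a power-coupler figure of merit.

```python
# Gaussian-process emulation of an expensive simulation for optimization.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):            # placeholder for a full physics run
    return np.sin(3 * x) + 0.5 * x

X_train = np.linspace(0, 2, 8).reshape(-1, 1)    # few affordable runs
y_train = expensive_simulation(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

X_query = np.linspace(0, 2, 201).reshape(-1, 1)  # cheap emulator queries
mean, std = gp.predict(X_query, return_std=True)
print(X_query[np.argmax(mean)], std.max())       # candidate optimum + uncertainty
```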

  13. Exploring Students' Computational Thinking Skills in Modeling and Simulation Projects: : A Pilot Study

    NARCIS (Netherlands)

    Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan

    2015-01-01

    Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other

  14. Improving Faculty Perceptions of and Intent to Use Simulation: An Intervention Project

    Science.gov (United States)

    Tucker, Charles

    2013-01-01

    Human patient simulation is an innovative teaching strategy that can facilitate practice development and preparation for entry into today's healthcare environment for nursing students. Unfortunately, the use of human patient simulation has been limited due to the perceptions of nursing faculty members. This project sought to explore those…

  15. iamxt: Max-tree toolbox for image processing and analysis

    Directory of Open Access Journals (Sweden)

    Roberto Souza

    2017-01-01

    The iamxt is an array-based max-tree toolbox implemented in Python using the NumPy library for array processing. It has state-of-the-art methods for building and processing the max-tree, and a large set of visualization tools that allow viewing the tree and the contents of its nodes. The array-based programming style and max-tree representation used in the toolbox make it simple to use. The intended audience of this toolbox includes mathematical morphology students and researchers who want to develop research in the field, and image processing researchers who need a toolbox that is simple to use and easy to integrate in their applications.

  16. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK) for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and the Toolbox processing capabilities of NLTK are introduced, showing ways in which they can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
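
    A minimal sketch of the kind of task described (dumping lexicon entries as HTML) using NLTK's Toolbox corpus reader follows; it assumes the small Rotokas sample lexicon that ships with NLTK's data collection (nltk.download("toolbox") may be needed), and the 'ge' gloss marker follows that sample file's conventions.

```python
# Read a Toolbox lexicon with NLTK and print the first entries as an HTML table.
from nltk.corpus import toolbox     # may require nltk.download("toolbox")

lexicon = toolbox.entries("rotokas.dic")   # list of (headword, field pairs)
rows = []
for headword, fields in lexicon[:10]:
    gloss = dict(fields).get("ge", "")     # 'ge' = English gloss field marker
    rows.append(f"<tr><td>{headword}</td><td>{gloss}</td></tr>")
print("<table>\n" + "\n".join(rows) + "\n</table>")
```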

  17. Simulation of slag control for the Plasma Hearth Project

    International Nuclear Information System (INIS)

    Power, M.A.; Carney, K.P.; Peters, G.G.

    1996-01-01

    The goal of the Plasma Hearth Project is to stabilize alpha-emitting radionuclides in a vitreous slag and to reduce the effective storage volume of actinide-containing waste for long-term burial. The actinides have been shown to partition into the vitreous slag phase of the melt. The slag composition may be changed by adding glass-former elements to ensure that this removable slag has the most desirable physical and chemical properties for long-term burial. A data acquisition and control system has been designed to regulate the composition of five elements in the slag.

  18. Human eye modelling for ophthalmic simulators project for clinic applications

    International Nuclear Information System (INIS)

    Sanchez, Andrea; Santos, Adimir dos; Yoriyaz, Helio

    2002-01-01

    Most eye tumors are treated by surgical means, which involves the enucleation of the affected eye. In terms of treatment and control of the disease, there is brachytherapy, which often utilizes small applicators of Co-60, I-125, Ru-106, Ir-192, etc. These methods have been shown to be very efficient but highly costly. The objective of this work is to propose a detailed simulator model for eye characterization. Additionally, this study can contribute to the design and construction of a new applicator in order to reduce the cost and to allow more patients to be treated.

  19. III. NIH TOOLBOX COGNITION BATTERY (CB): MEASURING EPISODIC MEMORY

    OpenAIRE

    Bauer, Patricia J.; Dikmen, Sureyya S.; Heaton, Robert K.; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L.

    2013-01-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are adminis...

  20. Simulated Annealing Genetic Algorithm Based Schedule Risk Management of IT Outsourcing Project

    Directory of Open Access Journals (Sweden)

    Fuqiang Lu

    2017-01-01

    IT outsourcing is an effective way to enhance core competitiveness for many enterprises, but the schedule risk of an IT outsourcing project may cause enormous economic loss to the enterprise. In this paper, Distributed Decision Making (DDM) theory and principal-agent theory are used to build a model for schedule risk management of IT outsourcing projects. In addition, a hybrid algorithm combining simulated annealing (SA) and a genetic algorithm (GA) is designed, namely the simulated annealing genetic algorithm (SAGA). The effect of the proposed model on the schedule risk management problem is analyzed in a simulation experiment. Meanwhile, the simulation results for the three algorithms GA, SA, and SAGA show that SAGA is superior to the other two algorithms in terms of stability and convergence. Consequently, this paper provides a scientific, quantitative proposal for decision makers who need to manage the schedule risk of an IT outsourcing project.
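
    The paper's hybrid can be sketched generically as a genetic algorithm whose offspring acceptance uses a simulated-annealing (Metropolis) criterion, so worse candidates are occasionally accepted early in the search. The following is a minimal illustration under invented assumptions (the fitness function is a placeholder for a schedule-risk cost, not the paper's model).

```python
# Simulated annealing genetic algorithm (SAGA) skeleton: GA variation
# operators plus Metropolis acceptance with a cooling temperature.
import random, math

def fitness(x):                        # stand-in for schedule risk cost
    return sum((xi - 0.3) ** 2 for xi in x)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(x, rate=0.1):
    return [xi + random.gauss(0, 0.05) if random.random() < rate else xi
            for xi in x]

pop = [[random.random() for _ in range(6)] for _ in range(30)]
T = 1.0                                # annealing temperature
for gen in range(200):
    pop.sort(key=fitness)
    parents = pop[:10]
    children = []
    while len(children) < len(pop):
        child = mutate(crossover(*random.sample(parents, 2)))
        parent = random.choice(parents)
        d = fitness(child) - fitness(parent)
        # SA acceptance: always keep improvements, sometimes accept worse ones
        if d < 0 or random.random() < math.exp(-d / T):
            children.append(child)
        else:
            children.append(parent)
    pop = children
    T *= 0.98                          # cooling schedule
print(min(map(fitness, pop)))
```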

  1. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
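
    For orientation, the classic spot-measurement MFA problem that the PFA Toolbox generalizes can be sketched as follows: given the stoichiometric constraint S·v = 0 and a few measured fluxes, estimate the remaining fluxes by least squares. This Python sketch (the toolbox itself is MATLAB) uses an invented toy network and measurements.

```python
# Classic MFA via stacked least squares: minimize the mismatch to measured
# fluxes subject to (heavily weighted) steady-state stoichiometry S v = 0.
import numpy as np

# 3 metabolites x 5 reactions (toy stoichiometry)
S = np.array([[ 1, -1, -1,  0,  0],
              [ 0,  1,  0, -1,  0],
              [ 0,  0,  1,  0, -1]])
measured = {0: 10.0, 3: 6.2}           # indices and values of measured fluxes

idx = list(measured)
M = np.zeros((len(idx), S.shape[1]))   # selects the measured components of v
M[range(len(idx)), idx] = 1.0
A = np.vstack([1e3 * S, M])            # heavy weight enforces S v ~ 0
b = np.concatenate([np.zeros(S.shape[0]), list(measured.values())])
v, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(v, 2))                  # expected ~[10, 6.2, 3.8, 6.2, 3.8]
```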

  2. ObsPy: A Python Toolbox for Seismology

    Science.gov (United States)

    Krischer, Lion; Megies, Tobias; Sales de Andrade, Elliott; Barsch, Robert; MacCarthy, Jonathan

    2017-04-01

    In recent years the Python ecosystem evolved into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It does so by offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, including waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, …) and event meta information (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality like travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than seven years and is developed and used by scientists around the world, with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases. Additionally we will discuss the road ahead as well as the long-term sustainability of open-source scientific software.

  3. A Statistical Toolbox For Mining And Modeling Spatial Data

    Directory of Open Access Journals (Sweden)

    D’Aubigny Gérard

    2016-12-01

    Most data mining projects in spatial economics start with an evaluation of a set of attribute variables on a sample of spatial entities, looking for the existence and strength of spatial autocorrelation based on Moran's and Geary's coefficients, the adequacy of which is rarely challenged, despite the fact that, when reporting on their properties, many users seem likely to make mistakes and to foster confusion. My paper begins with a critical appraisal of the classical definition and rationale of these indices. I argue that while intuitively founded, they are plagued by an inconsistency in their conception. Then, I propose a small principled change leading to corrected spatial autocorrelation coefficients, which strongly simplifies their relationship and opens the way to an augmented toolbox of statistical methods of dimension reduction and data visualization, also useful for modeling purposes. A second section presents a formal framework, adapted from recent work in statistical learning, which gives theoretical support to our definition of corrected spatial autocorrelation coefficients. More specifically, the multivariate data mining methods presented here are easily implemented with existing (free) software and yield tools for exploiting the proposed corrections in spatial data analysis practice; from a mathematical point of view, their asymptotic behavior, already studied in a series of papers by Belkin & Niyogi, suggests that they possess qualities of robustness and a limited sensitivity to the Modifiable Areal Unit Problem (MAUP), valuable in exploratory spatial data analysis.
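
    For reference, the classical Moran's I that the paper re-examines can be computed directly from its textbook definition, I = (n / S0) * (z'Wz) / (z'z), with z the centered attribute values and S0 the sum of the spatial weights. A minimal sketch on an invented toy lattice:

```python
# Moran's I on 4 spatial units with binary contiguity weights.
import numpy as np

W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([2.0, 3.0, 7.0, 8.0])    # attribute values
z = x - x.mean()                      # centered values
n, S0 = len(x), W.sum()
I = (n / S0) * (z @ W @ z) / (z @ z)
print(I)    # positive here: neighbours carry similar values
```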

  4. The Basic Radar Altimetry Toolbox for Sentinel 3 Users

    Science.gov (United States)

    Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme

    2013-04-01

    The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales). The latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats, the most used being the Graphical User Interface (BratGui). This GUI is a front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data while bypassing the data-formatting hassle. The BratDisplay (graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files - it is distributed with another ESA toolbox (GUT) as the visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to Saral and soon Sentinel-3), quick data visualization/export and simple computation on the data fields. BRAT can be used for importing data and having a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui involving combinations of data fields, which the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimeter Tutorial, which contains a strong introduction to

  5. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, III, F. G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface, using the GoldSim software, to the STADIUM® code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results, for which the code developers have provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that

  6. a Matlab Toolbox for Basin Scale Fluid Flow Modeling Applied to Hydrology and Geothermal Energy

    Science.gov (United States)

    Alcanie, M.; Lupi, M.; Carrier, A.

    2017-12-01

    Recent boosts in the development of geothermal energy were fostered by the latest oil crises and by the need to reduce CO2 emissions generated by the combustion of fossil fuels. Various numerical codes (e.g. FEHM, CSMP++, HYDROTHERM, TOUGH) have thus been implemented for the simulation and quantification of fluid flow in the upper crust. One possible limitation of such codes is their limited accessibility and the complex structure of the simulators. For this reason, we began to develop a Hydrothermal Fluid Flow Matlab library as part of MRST (Matlab Reservoir Simulation Toolbox). MRST is designed for the simulation of oil and gas problems, including carbon capture and storage. However, a geothermal module is still missing. We selected the Geneva Basin as a natural laboratory because of the large amount of data available in the region. The Geneva Basin has been intensely investigated in the past with exploration wells, active seismic and gravity surveys. In addition, the energy strategy of Switzerland promotes the development of geothermal energy, which has led to recent geophysical prospecting. Previous and ongoing projects have shown the geothermal potential of the Geneva Basin, but a consistent fluid flow model assessing the deep circulation in the region is yet to be defined. The first step of the study was to create the basin-scale static model. We integrated available active seismic, gravity inversions and borehole data to describe the principal geologic and tectonic features of the Geneva Basin. Petrophysical parameters were obtained from available and widespread well logs. This required adapting MRST to standard text-format file imports and outlining a new methodology for quick static model creation in an open-source environment. We implemented several basin-scale fluid flow models to test the effects of petrophysical properties on the circulation dynamics of deep fluids in the Geneva Basin. Preliminary results allow the identification of preferential fluid flow

  7. Large-Eddy Simulation Using Projection onto Local Basis Functions

    Science.gov (United States)

    Pope, S. B.

    In the traditional approach to LES for inhomogeneous flows, the resolved fields are obtained by a filtering operation (with filter width Delta). The equations governing the resolved fields are then partial differential equations, which are solved numerically (on a grid of spacing h). For an LES computation of a given magnitude (i.e., given h), there are conflicting considerations in the choice of Delta: to resolve a large range of turbulent motions, Delta should be small; to solve the equations with numerical accuracy, Delta should be large. In the alternative approach advanced here, this conflict is avoided. The resolved fields are defined by projection onto local basis functions, so that the governing equations are ordinary differential equations for the evolution of the basis-function coefficients. There is no issue of numerical spatial discretization errors. A general methodology for modelling the effects of the residual motions is developed. The model is based directly on the basis-function coefficients, and its effect is to smooth the fields where their rates of change are not well resolved by the basis functions. Demonstration calculations are performed for Burgers' equation.
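
    The core idea, evolving basis-function coefficients as ODEs rather than discretizing in space, can be illustrated on the demonstration problem the abstract names, Burgers' equation. The sketch below is a minimal stand-in that uses global Fourier modes instead of the paper's local basis functions, evaluates the nonlinear term pseudo-spectrally (dealiasing omitted), and advances the coefficients with explicit Euler; all parameters are invented.

```python
# Fourier-Galerkin-style evolution of u_t = -u u_x + nu u_xx on [0, 2*pi).
import numpy as np

N, nu, dt = 64, 0.05, 1e-3
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
k = 1j * np.fft.fftfreq(N, d=1.0 / N)          # spectral derivative factors i*m
u_hat = np.fft.fft(np.sin(x))                  # initial coefficients

for _ in range(2000):
    u = np.fft.ifft(u_hat).real                # back to physical space
    nonlinear = np.fft.fft(u * np.fft.ifft(k * u_hat).real)   # u * u_x
    u_hat = u_hat + dt * (-nonlinear + nu * k**2 * u_hat)

print(np.fft.ifft(u_hat).real[:4])             # evolved solution samples
```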

  8. Simulation-based valuation of project finance investments. Crucial aspects of power plant projects

    Energy Technology Data Exchange (ETDEWEB)

    Pietz, Matthaeus

    2010-12-15

    The liberalization of electricity markets transformed a regulated and stable market into a market with previously unknown price volatility. This results in high uncertainty, which is mainly due to the fact that, from an economic point of view, the commodity electricity cannot be stored. Investments in power plants are therefore highly risky. This dissertation analyzes crucial aspects of the valuation of a power plant financed via project finance, a popular financing method for projects with high capital requirements. Starting with the development of a valuation model based on stochastic modelling of the future cash flows, the analysis focuses on the impact of model complexity and electricity prices. (orig.)

  9. CERN Summer Student Project Report – Simulation of the Micromegas Detector

    CERN Document Server

    Soares Ferreira Nunes Teixeira, Sofia Luisa

    2015-01-01

    My project during the Summer Student Programme at CERN consisted of simulations of the Micromegas (MM) detectors, in order to test and characterize them in the presence of air contamination of the gas mixture. The MM detectors were chosen for the upcoming upgrade of the ATLAS detector. The motivation for this project and the results obtained are presented here. Moreover, the work that should be carried out after the programme as a continuation of this project is also outlined. To conclude, final considerations about the project are presented.

  10. Contribution of Full-Scope Simulator Development Project to the Dissemination of Nuclear Knowledge within New-Build–Project Teams

    International Nuclear Information System (INIS)

    Gain, P.

    2016-01-01

    Full text: In a context where few countries have recently carried out nuclear new-build projects, combined with a very strong need for generation renewal, the training of the hundreds of engineers involved in the design and commissioning teams of such a highly complex industrial facility is a major stake. The simulator project, which gives the first opportunity to integrate and validate the whole of the design data, to check its coherence, the good performance of the interface, and conformity with the safety and performance requirements, allows a fast and effective build-up of competence for all the resources involved in its development. In addition, the phased availability of the whole of the data generally results in having several phased versions of the simulator. Each can then be deployed in great numbers for training drills, which also contribute to sharing knowledge of the reference plant design in an optimal way. This context of broader use of these training modules, and the use of simulation in support of engineering activities, leads to their use by many teams with varied profiles; here too, simulation technologies are remarkably effective for sharing a common and stable body of knowledge. (author)

  11. Design of a Toolbox of RNA Thermometers.

    Science.gov (United States)

    Sen, Shaunak; Apurva, Divyansh; Satija, Rohit; Siegal, Dan; Murray, Richard M

    2017-08-18

    Biomolecular temperature sensors can be used for efficient control of large-volume bioreactors, for spatiotemporal imaging and control of gene expression, and to engineer robustness to temperature in biomolecular circuit design. Although RNA-based sensors, called "thermometers", have been investigated in both natural and synthetic contexts, an important challenge is to design diverse responses to temperature differing in sensitivity and threshold. We address this issue by constructing a library of RNA thermometers based on thermodynamic computations and experimentally measuring their activities in cell-free biomolecular "breadboards". Using free energies of the minimum free energy structures as well as melt profile computations, we estimated that a diverse set of temperature responses were possible. We experimentally found a wide range of responses to temperature in the range 29-37 °C with fold-changes varying over 3-fold around the starting thermometer. The sensitivities of these responses ranged over 10-fold around the starting thermometer. We correlated these measurements with computational expectations, finding that although there was no strong correlation for the individual thermometers, overall trends of diversity, fold-changes, and sensitivities were similar. These results present a toolbox of RNA-based circuit elements with diverse temperature responses.
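
    As a rough illustration of the thermodynamic computations mentioned above (minimum-free-energy structures at different temperatures), the sketch below uses the ViennaRNA Python bindings; the sequence is an arbitrary placeholder, not one of the paper's thermometers, and the global-variable API shown is an assumption about the commonly documented SWIG interface.

```python
# Fold the same RNA at two temperatures and compare MFE structures.
import RNA

seq = "GGGCUAUUAGCUCAGUUGGUUAGAGCGCACCCCUGAUAAGGGUG"
for temp in (29.0, 37.0):
    RNA.cvar.temperature = temp        # global folding temperature, deg C
    structure, mfe = RNA.fold(seq)     # minimum-free-energy structure
    print(f"{temp:.0f} C: {structure}  ({mfe:.2f} kcal/mol)")
```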

  12. Risk Consideration and Cost Estimation in Construction Projects Using Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Claudius A. Peleskei

    2015-06-01

    Construction projects usually involve high investments. They are, therefore, a risky venture for companies, as actual costs of construction projects nearly always exceed the planned scenario. This is due to the various risks and the large uncertainty existing within this industry. Determination and quantification of risks and their impact on project costs within the construction industry is described as one of the most difficult areas. This paper analyses how the cost of construction projects can be estimated using Monte Carlo simulation. It investigates whether the different cost elements in a construction project follow a specific probability distribution. The research examines the effect of correlation between different project costs on the result of the Monte Carlo simulation. The paper finds that Monte Carlo simulation can be a helpful tool for risk managers and can be used for cost estimation of construction projects. The research has shown that cost distributions are positively skewed and cost elements seem to have some interdependent relationships.
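
    A minimal sketch of the approach described, Monte Carlo total-cost estimation with positively skewed cost elements and a specified correlation between them, follows; the cost parameters and correlation matrix are invented, and the correlation is imposed via a Cholesky factor of a Gaussian copula.

```python
# Monte Carlo cost estimate with correlated lognormal cost elements.
import numpy as np

rng = np.random.default_rng(1)
mu = np.log([500.0, 300.0, 200.0])     # underlying normal means (3 cost items)
sigma = np.array([0.20, 0.30, 0.25])
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
L = np.linalg.cholesky(corr)

z = rng.standard_normal((100_000, 3)) @ L.T     # correlated standard normals
costs = np.exp(mu + sigma * z)                  # positively skewed cost items
total = costs.sum(axis=1)
print(np.percentile(total, [50, 80, 95]))       # P50/P80/P95 cost estimates
```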

  13. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    Science.gov (United States)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  14. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and e...

  15. A simulation study on the focal plane detector of the LAUE project

    Science.gov (United States)

    Khalil, M.; Frontera, F.; Caroli, E.; Virgilli, E.; Valsan, V.

    2015-06-01

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma-ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot, drastically improving the signal-to-noise ratio as well as significantly reducing the required size of the detector. In this paper we present a Monte Carlo simulation study with MEGAlib to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the point spread function of the lens considered for the LAUE project. We then show simulations, using the SILVACO semiconductor simulation toolkit, on the optimized detector to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared: a low-density material (silicon) and a high-density material (germanium).

  16. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors, and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.

  17. Modeling and Simulation Optimization and Feasibility Studies for the Neutron Detection without Helium-3 Project

    Energy Technology Data Exchange (ETDEWEB)

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.; Lintereur, Azaree T.

    2013-01-01

    This report details the results of the modeling and simulation work accomplished for the ‘Neutron Detection without Helium-3’ project during the 2011 and 2012 fiscal years. The primary focus of the project is to investigate commercially available technologies that might be used in safeguards applications in the relatively near term. Other technologies that are being developed may be more applicable in the future, but are outside the scope of this study.

  18. Survey on Projects at DLR Simulation and Software Technology with Focus on Software Engineering and HPC

    OpenAIRE

    Schreiber, Andreas; Basermann, Achim

    2013-01-01

    We introduce the DLR institute “Simulation and Software Technology” (SC) and present current activities regarding software engineering and high performance computing (HPC) in German or international projects. Software engineering at SC focusses on data and knowledge management as well as tools for studies and experiments. We discuss how we apply software configuration management, validation and verification in our projects. Concrete research topics are traceability of (software devel...

  19. PETPVC: a toolbox for performing partial volume correction techniques in positron emission tomography

    Science.gov (United States)

    Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell

    2016-11-01

    Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of the toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates, and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques together. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
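
    To make the deconvolution-based family of PVC methods concrete, here is a minimal one-dimensional sketch of an iterative Van Cittert-style update on a toy profile blurred by a Gaussian PSF; this is not PETPVC code, and the parameters are invented.

```python
# Iterative Van Cittert deconvolution: f_{k+1} = f_k + alpha*(g - h*f_k),
# where g is the observed (PVE-degraded) signal and h the PSF blur.
import numpy as np
from scipy.ndimage import gaussian_filter1d

psf_sigma = 3.0                        # scanner resolution model (voxels)
truth = np.zeros(100); truth[40:60] = 1.0
observed = gaussian_filter1d(truth, psf_sigma)    # PVE-degraded signal

estimate = observed.copy()
alpha = 1.0
for _ in range(20):                    # terminated early: long runs amplify noise
    reblurred = gaussian_filter1d(estimate, psf_sigma)
    estimate = estimate + alpha * (observed - reblurred)

print(observed[45:55].round(2), estimate[45:55].round(2))
```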

  20. Risk Assessment in Financial Feasibility of Tanker Project Using Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Muhammad Badrus Zaman

    2017-09-01

    Every ship project is subject to risk and uncertainty. An inappropriate risk assessment process can have long-term impacts, such as financial loss. Thus, analysis of risks and uncertainties is a very important process in determining the financial feasibility of a project. This study analyzes the financial feasibility of a 17,500 LTDW tanker project. Risk and uncertainty are two differentiated terminologies in this study: risk focuses on the operational risk to shipowner finance due to nonconformity in the shipbuilding process, while uncertainty focuses on the variable costs that affect project cash flows. There are three funding scenarios in this study, where the percentage of funding with own capital and bank loan is 100% : 0% in scenario 1, 75% : 25% in scenario 2, and 50% : 50% in scenario 3. The Monte Carlo simulation method was applied to simulate the acceptance criteria, such as net present value (NPV), internal rate of return (IRR), payback period (PP), and profitability index (PI). The results of the simulation show that funding the 17,500 LTDW tanker project under scenarios 1, 2 and 3 is feasible, with the probability of meeting each acceptance criterion greater than 50%. The charter rate is the uncertainty to which the project's financial feasibility parameters are most sensitive.

  1. Collaborative Management of Complex Major Construction Projects: AnyLogic-Based Simulation Modelling

    Directory of Open Access Journals (Sweden)

    Na Zhao

    2016-01-01

    Collaborative management of the complex supply chain system of major construction projects effectively integrates the different participants in the construction project. This paper establishes a simulation model based on AnyLogic to reveal the collaborative elements in the complex supply chain management system and their modes of action, as well as the transmission problems of intent information, thus helping the participants develop into an organism with coordinated development and coevolution. This study can help improve the efficiency and management of the complex system of major construction projects.

  2. Open Babel: An open chemical toolbox

    Directory of Open Access Journals (Sweden)

    O'Boyle Noel M

    2011-10-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org.
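
    A minimal sketch of the format interconversion the paper is about, using Open Babel's pybel convenience layer (the module path shown assumes the Open Babel 3.x layout):

```python
# Parse a SMILES string, generate 3D coordinates, and emit an SDF block.
from openbabel import pybel    # Open Babel >= 3.0 module layout

mol = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")   # aspirin
mol.make3D()                   # adds hydrogens and 3D coordinates
print(mol.write("sdf"))        # or mol.write("sdf", "aspirin.sdf") to a file
```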

  3. Open Babel: An open chemical toolbox

    Science.gov (United States)

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300

  4. FAST: FAST Analysis of Sequences Toolbox

    Directory of Open Access Journals (Sweden)

    Travis J. Lawrence

    2015-05-01

    FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open-source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU's Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with the conventions of GNU, Matlab, Perl, BioPerl, R and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics make FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of the BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought.

  5. Projections of Future Precipitation Extremes Over Europe: A Multimodel Assessment of Climate Simulations

    Science.gov (United States)

    Rajczak, Jan; Schär, Christoph

    2017-10-01

    Projections of precipitation and its extremes over the European continent are analyzed in an extensive multimodel ensemble of 12 and 50 km resolution EURO-CORDEX Regional Climate Models (RCMs) forced by RCP2.6, RCP4.5, and RCP8.5 (Representative Concentration Pathway) aerosol and greenhouse gas emission scenarios. A systematic intercomparison with ENSEMBLES RCMs is carried out, such that in total information is provided for an unprecedentedly large data set of 100 RCM simulations. An evaluation finds very reasonable skill for the EURO-CORDEX models in simulating temporal and geographical variations of (mean and heavy) precipitation at both horizontal resolutions. Heavy and extreme precipitation events are projected to intensify across most of Europe throughout the whole year. All considered models agree on a distinct intensification of extremes by often more than +20% in winter and fall and over central and northern Europe. A reduction of rainy days and mean precipitation in summer is simulated by a large majority of models in the Mediterranean area, but intermodel spread between the simulations is large. In central Europe and France during summer, models project decreases in precipitation but more intense heavy and extreme rainfalls. Comparison to previous RCM projections from ENSEMBLES reveals consistency but slight differences in summer, where reductions in southern European precipitation are not as pronounced as previously projected. The projected changes of the European hydrological cycle may have substantial impact on environmental and anthropogenic systems. In particular, the simulations indicate a rising probability of summertime drought in southern Europe and more frequent and intense heavy rainfall across all of Europe.

  6. Retrieval process development and enhancements project Fiscal year 1995: Simulant development technology task progress report

    International Nuclear Information System (INIS)

    Golcar, G.R.; Bontha, J.R.; Darab, J.G.

    1997-01-01

    The mission of the Retrieval Process Development and Enhancements (RPD&E) project is to develop an understanding of retrieval processes, including emerging and existing technologies, gather data on these technologies, and relate the data to specific tank problems such that end-users have the requisite technical bases to make retrieval and closure decisions. The development of waste simulants is an integral part of this effort. The work of the RPD&E simulant-development task is described in this document. The key FY95 accomplishments of the RPD&E simulant-development task are summarized below

  7. Modeling and Simulation of Project Management through the PMBOK® Standard Using Complex Networks

    Directory of Open Access Journals (Sweden)

    Luz Stella Cardona-Meza

    2017-01-01

    Discussion about project management, in both the academic literature and industry, is predominantly based on theories of control, many of which have been developed since the 1950s. However, issues arise when these ideas are applied unilaterally to all types of projects and in all contexts. In complex environments, management problems arise from assuming that results, predicted at the start of a project, can be sufficiently described and delivered as planned. Thus, once a project reaches a critical size, timescale, and level of ambiguity and interconnection, analysis centered on control does not function adequately. Projects that involve complex situations can be described as complex adaptive systems, consisting of multiple interdependent dynamic components, multiple feedback processes, nonlinear relations, and the management of hard data (process dynamics) and soft data (executive team dynamics). In this study, the dynamic structure of a project and its trajectories are simulated through a complex network, using inference processes. Finally, some numerical simulations are described, leading to a decision-making tool that identifies critical processes, thereby obtaining better performance outcomes of projects.

  8. Wheat Rust Toolbox Related to New Initiatives on Yellow Rust

    DEFF Research Database (Denmark)

    Hansen, Jens Grønbech; Lassen, Poul

    …(http://www.fao.org/agriculture/crops/rust/stem/rust-report/en/). The Wheat Rust Toolbox is one of several international research platforms hosted by Aarhus University, and it uses the same ICT framework and databases as EuroWheat (www.eurowheat.org) and EuroBlight (www.EuroBlight.net). The Wheat Rust Toolbox will also serve the Global Rust Reference Centre (GRRC) as well … – 2009), and as soon as possible this will be expanded to cover all global yellow rust data available via the GRRC. The presentation will focus on experiences from the previous work on global databases and web-based information systems, as well as propose ideas on how the toolbox can be helpful regarding …

  9. project SENSE : multimodal simulation with full-body real-time verbal and nonverbal interactions

    NARCIS (Netherlands)

    Miri, Hossein; Kolkmeier, Jan; Taylor, Paul Jonathon; Poppe, Ronald; Heylen, Dirk; Poppe, Ronald; Meyer, John-Jules; Veltkamp, Remco; Dastani, Mehdi

    2016-01-01

    This paper presents a multimodal simulation system, project-SENSE, that combines virtual reality and full-body motion capture technologies with real-time verbal and nonverbal communication. We introduce the technical setup and employed hardware and software of a first prototype. We discuss the

  10. Accelerating Project and Process Improvement using Advanced Software Simulation Technology: From the Office to the Enterprise

    Science.gov (United States)

    2010-04-29

    Report documentation page: Accelerating Project and Process Improvement using Advanced Software Simulation Technology: From the Office to the Enterprise. Larry Smith, Software Technology Support Center, 517 SMXS/MXDEA, 6022 Fir Avenue, Hill AFB, UT 84056, 2010.

  11. Final results of the supra project : Improved Simulation of Upset Recovery

    NARCIS (Netherlands)

    Fucke, L.; Groen, E.; Goman, M.; Abramov, N.; Wentink, M.; Nooij, S.; Zaichik, L.E.; Khrabrov, A.

    2012-01-01

    The objective of the European research project SUPRA (Simulation of Upset Recovery in Aviation) is to develop technologies that eventually contribute to a reduction of the risk of Loss of Control In-flight (LOC-I) accidents, today's major cause of fatal accidents in commercial aviation. To this end

  12. BlueSky ATC Simulator Project : An Open Data and Open Source Approach

    NARCIS (Netherlands)

    Hoekstra, J.M.; Ellerbroek, J.

    2016-01-01

    To advance ATM research as a science, ATM research results should be made more comparable. A possible way to do this is to share tools and data. This paper presents a project that investigates the feasibility of a fully open-source and open-data approach to air traffic simulation. Here, the first of

  13. Solution and implementation of project "Simulator of WWER-440 nuclear power plant"

    International Nuclear Information System (INIS)

    Koukal, V.

    1985-01-01

    Timeline data are given for the development and construction of the simulator of a WWER-440 nuclear power plant unit. The individual tasks related to the project implementation are summarized, and the cooperating institutions and enterprises are listed. (J.C.)

  14. Baseline process description for simulating plutonium oxide production for precalc project

    Energy Technology Data Exchange (ETDEWEB)

    Pike, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-26

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  15. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    Science.gov (United States)

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
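
    SBEToolbox is a MATLAB package. As an illustration of the centrality-and-clustering workflow it automates, an analogous analysis in Python's networkx (a stand-in, not SBEToolbox) might look like this:

        # Compute centralities and modules for a graph, mirroring the kind of
        # metrics SBEToolbox reports (networkx analogue, not SBEToolbox itself).
        import networkx as nx
        from networkx.algorithms import community

        G = nx.karate_club_graph()                    # stand-in for a biological network
        degree = nx.degree_centrality(G)              # per-node centralities
        betweenness = nx.betweenness_centrality(G)
        modules = community.greedy_modularity_communities(G)  # node clustering
        print(f"{len(modules)} modules; max degree centrality {max(degree.values()):.2f}")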

  16. Process, cost modeling and simulations for integrated project development of biomass for fuel and protein

    International Nuclear Information System (INIS)

    Pannir Selvam, P.V.; Wolff, D.M.B.; Souza Melo, H.N.

    1998-01-01

    The construction of models for biomass project development is described. These models, first constructed using the QPRO electronic spreadsheet for Windows, are now being developed with visual and object-oriented programming tools, using DELPHI V.1 for Windows and the process simulator SUPERPRO V.2.7 (Intelligen Inc.). These models allow process development problems with economic objectives to be solved very rapidly. The preliminary analyses of costs and investments of biomass utilisation projects included in this study are: steam, ammonia, carbon dioxide and alkali pretreatment processes; methane production using anaerobic digestion; aerobic composting; ethanol fermentation and distillation; effluent treatment using high-rate algae production; and cogeneration of energy for drying. The main projects under development are biomass valuation projects for elephant (Napier) grass, sugar cane bagasse and microalgae, using models for mass balance, equipment and production cost. Sensitivity analyses are carried out to account for stochastic variation of process yield, production volume and prices, using the Monte Carlo method. These models allow the identification of economic and scale-up problems of the technology. The results obtained from the preliminary development of a few case studies are reported for integrated project development for fuel and protein using process and cost simulation models. (author)

  17. MagPy: A Python toolbox for controlling Magstim transcranial magnetic stimulators.

    Science.gov (United States)

    McNair, Nicolas A

    2017-01-30

    To date, transcranial magnetic stimulation (TMS) studies manipulating stimulation parameters have largely used blocked paradigms. However, altering these parameters on a trial-by-trial basis in Magstim stimulators is complicated by the need to send regular (1 Hz) commands to the stimulator. Additionally, effecting such control interferes with the ability to send TMS pulses or simultaneously present stimuli with high temporal precision. This manuscript presents the MagPy toolbox, a Python software package that provides full control over Magstim stimulators via the serial port. It is able to maintain this control with no impact on concurrent processing, such as stimulus delivery. In addition, a specially-designed "QuickFire" serial cable is specified that allows MagPy to trigger TMS pulses with very low latency. In a series of experimental simulations, MagPy was able to maintain uninterrupted remote control over the connected Magstim stimulator across all testing sessions. In addition, having MagPy enabled had no effect on stimulus timing: all stimuli were presented for precisely the duration specified. Finally, using the QuickFire cable, MagPy was able to elicit TMS pulses with sub-millisecond latencies. The MagPy toolbox allows for experiments that require manipulating stimulation parameters from trial to trial. Furthermore, it can achieve this in contexts that require tight control over timing, such as those seeking to combine TMS with fMRI or EEG. Together, the MagPy toolbox and QuickFire serial cable provide an effective means for controlling Magstim stimulators during experiments while ensuring high-precision timing.
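
    A typical MagPy session wraps the serial protocol in a small object interface. The sketch below follows the toolbox's published interface as far as can be recalled here; the class name, method names and the serial port identifier are assumptions to check against the release documentation:

        # Connect to a Magstim stimulator, arm it, and trigger a pulse
        # (names follow MagPy's documented interface; treat them as assumptions).
        from magpy import magstim

        stimulator = magstim.Magstim(address='COM1')  # serial port is an assumption
        stimulator.connect()                          # 1 Hz keep-alive handled in background
        stimulator.arm(delay=True)                    # wait for the capacitors to charge
        stimulator.fire()                             # sub-ms latency with the QuickFire cable
        stimulator.disconnect()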

  18. NATbox: a network analysis toolbox in R.

    Science.gov (United States)

    Chavan, Shweta S; Bauer, Michael A; Scutari, Marco; Nagarajan, Radhakrishnan

    2009-10-08

    There has been recent interest in capturing the functional relationships (FRs) from high-throughput assays using suitable computational techniques. FRs elucidate the working of genes in concert as a system, as opposed to independent entities, and hence may provide preliminary insights into biological pathways and signalling mechanisms. Bayesian structure learning (BSL) techniques and their extensions have been used successfully for modelling FRs from expression profiles. Such techniques are especially useful in discovering undocumented FRs, investigating non-canonical signalling mechanisms and cross-talk between pathways. The objective of the present study is to develop a graphical user interface (GUI), NATbox: Network Analysis Toolbox, in the language R that houses a battery of BSL algorithms in conjunction with suitable statistical tools for modelling FRs in the form of acyclic networks from gene expression profiles and their subsequent analysis. NATbox is a menu-driven open-source GUI implemented in the R statistical language for modelling and analysis of FRs from gene expression profiles. It provides options to (i) impute missing observations in the given data, (ii) model FRs and network structure from gene expression profiles using a battery of BSL algorithms and identify robust dependencies using a bootstrap procedure, (iii) present the FRs in the form of acyclic graphs for visualization and investigate their topological properties using network analysis metrics, (iv) retrieve FRs of interest from published literature and subsequently use these FRs as structural priors in BSL, and (v) enhance scalability of BSL across high-dimensional data by parallelizing the bootstrap routines. NATbox provides a menu-driven GUI for modelling and analysis of FRs from gene expression profiles. By incorporating readily available functions from existing R packages, it minimizes redundancy and improves reproducibility, transparency and sustainability, characteristic of open-source environments.

  19. ECO INVESTMENT PROJECT MANAGEMENT THROUGH TIME APPLYING ARTIFICIAL NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    Tamara Gvozdenović

    2007-06-01

    The concept of project management expresses an indispensable approach to investment projects. Time is often the most important factor in these projects. The artificial neural network is a data-processing paradigm inspired by the biological brain, and it is used in numerous different fields, among them project management. This research is oriented to the application of artificial neural networks in managing the time dimension of investment projects. The artificial neural networks are used to define the optimistic, the most probable and the pessimistic time estimates in the PERT method. The program package Matlab: Neural Network Toolbox is used for the data simulation. A feed-forward back-propagation network is chosen.
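
    Once the network has produced the three estimates, the PERT method combines them with the standard beta-approximation formulas. A minimal sketch of that downstream step (the activity times are invented):

        # Expected duration and variance from PERT's three-point estimates:
        # optimistic (a), most probable (m), pessimistic (b).
        def pert_estimate(a, m, b):
            expected = (a + 4 * m + b) / 6
            variance = ((b - a) / 6) ** 2
            return expected, variance

        print(pert_estimate(4.0, 6.0, 10.0))  # (6.33..., 1.0)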

  20. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Science.gov (United States)

    Swales, Dustin J.; Pincus, Robert; Bodas-Salcedo, Alejandro

    2018-01-01

    The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  1. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Directory of Open Access Journals (Sweden)

    D. J. Swales

    2018-01-01

    The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  2. Automated simulation and study of spatial-structural design processes

    NARCIS (Netherlands)

    Davila Delgado, J.M.; Hofmeyer, H.; Stouffs, R.; Sariyildiz, S.

    2013-01-01

    A so-called "Design Process Investigation toolbox" (DPI toolbox) has been developed. It is a set of computational tools that simulate spatial-structural design processes. Its objectives are to study spatial-structural design processes and to support the involved actors. Two case-studies are

  3. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  4. An optimization algorithm for simulation-based planning of low-income housing projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2010-10-01

    Construction of low-income housing projects is a repetitive process and is associated with uncertainties that arise from the unavailability of resources. Government agencies and/or contractors have to select a construction system that meets low-income housing project constraints, including project conditions and technical, financial and time constraints. This research presents a framework, using computer simulation, which aids government authorities and contractors in the planning of low-income housing projects. The proposed framework estimates the time and cost required for the construction of low-income housing using pre-cast hollow core slabs with hollow-block bearing walls. Five main components constitute the proposed framework: a network builder module, a construction alternative selection module, a simulation module, an optimization module and a reporting module. The optimization module, utilizing a genetic algorithm, enables the definition of different options and ranges of parameters associated with low-income housing projects that influence the duration and total cost of the pre-cast hollow core with hollow-block bearing walls method. A computer prototype, named LIHouse_Sim, was developed in MS Visual Basic 6.0 as proof of concept for the proposed framework. A numerical example is presented to demonstrate the use of the developed framework and to illustrate its essential features.

  5. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
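
    In miniature, a SIMRAND-style evaluation samples each alternative path's measure of preference from the assessed distributions, scores the draws with the decision maker's utility function, and ranks the alternatives by expected utility. The sketch below is a toy illustration, not the original program; the distributions and the utility function are invented:

        # Rank two alternative paths by expected utility under uncertainty
        # (toy Monte Carlo in the spirit of SIMRAND; all numbers invented).
        import math
        import random

        def utility(v):                           # concave: models risk aversion
            return 1.0 - math.exp(-v / 50.0)

        alternatives = {                          # path -> (mean, sd) of preference measure
            "path-A": (100.0, 10.0),              # safe design
            "path-B": (110.0, 40.0),              # higher mean, higher risk
        }

        random.seed(1)
        N = 100_000
        for name, (mu, sigma) in alternatives.items():
            eu = sum(utility(random.gauss(mu, sigma)) for _ in range(N)) / N
            print(name, round(eu, 4))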

  6. Versatile Cas9-driven subpopulation selection toolbox for Lactococcus lactis

    NARCIS (Netherlands)

    Els, van der Simon; James, Jennelle K.; Kleerebezem, Michiel; Bron, Peter A.

    2018-01-01

    CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector

  7. 40 CFR 141.719 - Additional filtration toolbox components.

    Science.gov (United States)

    2010-07-01

    … taken from a surface water or GWUDI source. A cap, such as GAC, on a single stage of filtration is not … a separate stage of filtration if both filtration stages treat the entire plant flow taken from a surface water … (40 CFR, Protection of Environment, revised as of 2010-07-01: Additional filtration toolbox components.)

  8. Geoplotlib: a Python Toolbox for Visualizing Geographical Data

    OpenAIRE

    Cuttone, Andrea; Lehmann, Sune; Larsen, Jakob Eg

    2016-01-01

    We introduce geoplotlib, an open-source Python toolbox for visualizing geographical data. geoplotlib supports the development of hardware-accelerated interactive visualizations in pure Python, and provides implementations of dot maps, kernel density estimation, spatial graphs, Voronoi tessellation, shapefiles and many other common spatial visualizations. We describe geoplotlib's design, functionalities and use cases.
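
    Typical usage layers one or more visualizations onto a map and opens an interactive window. A minimal sketch following geoplotlib's documented API; the CSV filename is an assumption, and the file is expected to contain lat/lon columns:

        # Draw a dot map and a kernel-density layer from a CSV of lat/lon points.
        import geoplotlib
        from geoplotlib.utils import read_csv

        data = read_csv('points.csv')   # assumed file with 'lat' and 'lon' columns
        geoplotlib.dot(data)            # dot-map layer
        geoplotlib.kde(data, bw=5)      # kernel density estimation layer
        geoplotlib.show()               # opens the interactive OpenGL window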

  9. Testing adaptive toolbox models: a Bayesian hierarchical approach

    NARCIS (Netherlands)

    Scheibehenne, B.; Rieskamp, J.; Wagenmakers, E.-J.

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often

  10. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    1997-01-01

    The Structural Time Domain Identification (STDI) toolbox for use with MATLABTM is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps...

  11. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    The Structural Time Domain Identification (STDI) toolbox for use with MATLABTM is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps...

  12. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group have achieved significant progress towards the goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  13. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    CERN Document Server

    Abbrescia, M; Aiola, S; Antolini, R; Avanzini, C; Baldini Ferroli, R; Bencivenni, G; Bossini, E; Bressan, E; Chiavassa, A; Cicalò, C; Cifarelli, L; Coccia, E; De Gruttola, D; De Pasquale, S; Di Giovanni, A; D'Incecco, M; Dreucci, M; Fabbri, F L; Frolov, V; Garbini, M; Gemme, G; Gnesi, I; Gustavino, C; Hatzifotiadou, D; La Rocca, P; Li, S; Librizzi, F; Maggiora, A; Massai, M; Miozzi, S; Panareo, M; Paoletti, R; Perasso, L; Pilo, F; Piragino, G; Regano, A; Riggi, F; Righini, G C; Sartorelli, G; Scapparone, E; Scribano, A; Selvi, M; Serci, S; Siddi, E; Spandre, G; Squarcia, S; Taiuti, M; Tosello, F; Votano, L; Williams, M C S; Yánez, G; Zichichi, A; Zuyeuski, R

    2014-01-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian high schools. Each tracking detector, called an EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far-away coincidences over a total area of about 3 × 10⁵ km². In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  14. Selecting the Most Economic Project under Uncertainty Using Bootstrap Technique and Fuzzy Simulation

    Directory of Open Access Journals (Sweden)

    Kamran Shahanaghi

    2012-01-01

    This article sets aside the pre-determined membership function of a fuzzy set, which is a basic assumption in such studies, and proposes a hybrid technique to select the most economic project among alternative projects under fuzzy interest rates. Net present worth (NPW) is used as the economic indicator. The article challenges the assumption that large sample sizes are available for membership function determination and shows that some other techniques may be less accurate. To give a robust solution, bootstrapping and fuzzy simulation are suggested, and a numerical example is given and analyzed.
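
    The bootstrap half of such an approach can be sketched in a few lines: resample the observed interest rates to build an empirical NPW distribution for a candidate project. The cash flows and rate sample below are invented for illustration:

        # Empirical NPW distribution via bootstrap resampling of observed rates.
        import random

        cash_flows = [-1000.0, 300.0, 350.0, 400.0, 450.0]   # year 0..4 (invented)
        observed_rates = [0.08, 0.10, 0.09, 0.12, 0.07]      # small historical sample

        def npw(cflows, rate):
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cflows))

        random.seed(0)
        draws = sorted(npw(cash_flows, random.choice(observed_rates))
                       for _ in range(10_000))
        print(f"median NPW {draws[5000]:.1f}, "
              f"90% interval ({draws[500]:.1f}, {draws[9500]:.1f})")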

  15. 3D numerical simulation of projection welding of square nuts to sheets

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Zhang, W.; Martins, P. A. F.

    2015-01-01

    The challenge of developing a three-dimensional finite element computer program for electro-thermo-mechanical industrial modeling of resistance welding is presented, and the program is applied to the simulation of projection welding of square nuts to sheets. Results are compared with experimental … formulation in order to model the frictional sliding between the square nut projections and the sheets during the welding process. It is proved that the implementation of friction increases the accuracy of the simulations, and the dynamic influence of friction on the process is explained.

  16. Integrated Vehicle Health Management Project-Modeling and Simulation for Wireless Sensor Applications

    Science.gov (United States)

    Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.

    2009-01-01

    This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.

  17. Evolution and experience with the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00389536; The ATLAS collaboration; Brasolin, Franco; Kouba, Tomas; Schovancova, Jaroslava; Fazio, Daniel; Di Girolamo, Alessandro; Scannicchio, Diana; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander; Lee, Christopher

    2017-01-01

    The Simulation at Point1 project is successfully running standard ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We present our experience with using the Event Service that provides the event-level granularity of computations. We show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources is also presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  18. Evolution and experience with the ATLAS simulation at Point1 project

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Fazio, Daniel; Di Girolamo, Alessandro; Kouba, Tomas; Lee, Christopher; Scannicchio, Diana; Schovancova, Jaroslava; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2016-01-01

    The Simulation at Point1 project is successfully running traditional ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We will present our experience with using the Event Service that provides the event-level granularity of computations. We will show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources will also be presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  19. DeltaProt: a software toolbox for comparative genomics

    Directory of Open Access Journals (Sweden)

    Willassen Nils P

    2010-11-01

    Background: Statistical bioinformatics is the study of biological data sets obtained by new micro-technologies by means of proper statistical methods. For a better understanding of environmental adaptations of proteins, orthologous sequences from different habitats may be explored and compared. The main goal of the DeltaProt Toolbox is to provide users with important functionality that is needed for comparative screening and studies of extremophile proteins and protein classes. Visualization of the data sets is also the focus of this article, since visualizations can play a key role in making the various relationships transparent. This application paper is intended to inform the reader of the existence, functionality, and applicability of the toolbox. Results: We present the DeltaProt Toolbox, a software toolbox that may be useful in importing, analyzing and visualizing data from multiple alignments of proteins. The toolbox has been written in MATLAB™ to provide an easy and user-friendly platform, including a graphical user interface, while ensuring good numerical performance. Problems in genome biology may be easily stated thanks to a compact input format. The toolbox also offers the possibility of utilizing structural information from SABLE or other structure predictors. Different sequence plots can then be viewed and compared in order to find their similarities and differences. Detailed statistics are also calculated during the procedure. Conclusions: The DeltaProt package is open source and freely available for academic, non-commercial use. The latest version of DeltaProt can be obtained from http://services.cbu.uib.no/software/deltaprot/. The website also contains documentation, and the toolbox comes with real data sets that are intended for training in applying the models to carry out bioinformatical and statistical analyses of protein sequences. Equipped with the new algorithms proposed here, DeltaProt serves as an auxiliary

  20. Simulated bat populations erode when exposed to climate change projections for western North America.

    Directory of Open Access Journals (Sweden)

    Mark A Hayes

    Recent research has demonstrated that temperature and precipitation conditions correlate with successful reproduction in some insectivorous bat species that live in arid and semiarid regions, and that hot and dry conditions correlate with reduced lactation and reproductive output by females of some species. However, the potential long-term impacts of climate-induced reproductive declines on bat populations in western North America are not well understood. We combined results from long-term field monitoring and experiments in our study area with information on vital rates to develop stochastic age-structured population dynamics models and analyzed how simulated fringed myotis (Myotis thysanodes) populations changed under projected future climate conditions in our study area near Boulder, Colorado (Boulder Models) and throughout western North America (General Models). Each simulation consisted of an initial population of 2,000 females and an approximately stable age distribution at the beginning of the simulation. We allowed each population to be influenced by the mean annual temperature and annual precipitation for our study area and a generalized range-wide model projected through year 2086, for each of four carbon emission scenarios (representative concentration pathways RCP2.6, RCP4.5, RCP6.0, RCP8.5). Each population simulation was repeated 10,000 times. Of the 8 Boulder Model simulations, 1 increased (+29.10%), 3 stayed approximately stable (+2.45%, +0.05%, -0.03%), and 4 decreased substantially (-44.10%, -44.70%, -44.95%, -78.85%). All General Model simulations for western North America decreased by >90% (-93.75%, -96.70%, -96.70%, -98.75%). These results suggest that a changing climate in western North America has the potential to quickly erode some forest bat populations, including species of conservation concern such as fringed myotis.
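
    The core of a stochastic age-structured model of this kind is a Leslie-matrix projection whose vital rates vary with the simulated climate. The following toy sketch illustrates the mechanism only; the stage structure, survival rates and climate sensitivity are invented, not the study's values:

        # Toy stochastic Leslie-matrix projection with climate-dependent fecundity.
        import numpy as np

        rng = np.random.default_rng(42)
        n = np.array([1200.0, 500.0, 300.0])        # juvenile, subadult, adult females

        for year in range(70):                      # project roughly to 2086
            hot_dry = rng.random() < 0.3            # assumed probability of a bad year
            fecundity = 0.25 if hot_dry else 0.45   # female pups per adult female
            L = np.array([[0.00, 0.00, fecundity],
                          [0.55, 0.00, 0.00],       # juvenile -> subadult survival
                          [0.00, 0.70, 0.80]])      # subadult -> adult, adult survival
            n = L @ n

        print(f"final female population: {n.sum():.0f}")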

  1. Integrated system dynamics toolbox for water resources planning.

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.; Peplinski, William J.; Tidwell, Vincent Carroll; Coursey, Don (University of Chicago, Chicago, IL); Hanson, Jason (University of New Mexico, Albuquerque, NM); Grimsrud, Kristine (University of New Mexico, Albuquerque, NM); Thacher, Jennifer (University of New Mexico, Albuquerque, NM); Broadbent, Craig (University of New Mexico, Albuquerque, NM); Brookshire, David (University of New Mexico, Albuquerque, NM); Chemak, Janie (University of New Mexico, Albuquerque, NM); Cockerill, Kristan (Cockeril Consulting, Boone, NC); Aragon, Carlos (New Mexico Univeristy of Technology and Mining (NM-TECH), Socorro, NM); Hallett, Heather (New Mexico Univeristy of Technology and Mining (NM-TECH), Socorro, NM); Vivoni, Enrique (New Mexico Univeristy of Technology and Mining (NM-TECH), Socorro, NM); Roach, Jesse

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward

  2. Modelling and Simulation of National Electronic Product Code Network Demonstrator Project

    Science.gov (United States)

    Mo, John P. T.

    The National Electronic Product Code (EPC) Network Demonstrator Project (NDP) was the first large-scale consumer goods track and trace investigation in the world using the full EPC protocol system for applying RFID technology in supply chains. The NDP demonstrated methods of sharing information securely using the EPC Network, providing authentication to interacting parties, and enhancing the ability to track and trace the movement of goods within the entire supply chain involving transactions among multiple enterprises. Due to project constraints, the actual run of the NDP lasted only 3 months and could not be consolidated into quantitative results. This paper discusses the modelling and simulation of activities in the NDP in a discrete event simulation environment and provides an estimation of the potential benefits that could be derived from the NDP if it were continued for one whole year.

  3. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    Science.gov (United States)

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab-based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907

  4. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    Science.gov (United States)

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

    Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab-based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.
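
    The outlier sensitivity that motivates the toolbox is easy to demonstrate. The toolbox itself is MATLAB; the sketch below uses Python/SciPy instead, with the rank-based Spearman correlation standing in for the percentage-bend and skipped estimators implemented in the toolbox:

        # One bivariate outlier wrecks Pearson's r; a rank-based estimator resists it.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        x = rng.normal(size=50)
        y = x + rng.normal(scale=0.5, size=50)   # strong true association
        x[0], y[0] = 8.0, -8.0                   # inject a single bivariate outlier

        print(f"Pearson  r = {stats.pearsonr(x, y)[0]:.2f}")   # badly distorted
        print(f"Spearman r = {stats.spearmanr(x, y)[0]:.2f}")  # close to the truth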

  5. BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.

    Science.gov (United States)

    Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K

    2014-02-15

    Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user-friendly platform for the design and presentation of visual, audio, as well as multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate the functionalities of BOLDSync through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user-friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI studies.

  6. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    Directory of Open Access Journals (Sweden)

    M. S. Mizielinski

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985–2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km), as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present-climate simulations, a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single-year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves, and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  7. Towards the petaflop for Lattice QCD simulations the PetaQCD project

    International Nuclear Information System (INIS)

    D'Auriac, Jean-Christian Angles; Carbonell, Jaume; Barthou, Denis

    2010-01-01

    The study and design of a very ambitious petaflop cluster exclusively dedicated to Lattice QCD simulations started in early '08 among a consortium of 7 laboratories (IN2P3, CNRS, INRIA, CEA) and 2 SMEs. This consortium received a grant from the French ANR agency in July '08, and the PetaQCD project kickoff took place in January '09. Building upon several years of fruitful collaborative studies in this area, the aim of this project is to demonstrate that the simulation of a 256 × 128³ lattice can be achieved through the HMC/ETMC software, using a machine with efficient speed/cost/reliability/power consumption ratios. It is expected that this machine can be built out of a rather limited number of processors (e.g. between 1000 and 4000), although capable of a sustained petaflop CPU performance. The proof-of-concept should be a mock-up cluster built as much as possible with off-the-shelf components, and two particularly attractive axes will be mainly investigated, in addition to fast all-purpose multi-core processors: the use of the new brand of IBM Cell processors (with on-chip accelerators) and the very recent Nvidia GPGPUs (off-chip co-processors). This cluster will obviously be massively parallel, and heterogeneous. Communication issues between processors, implied by the physics of the simulation and the lattice partitioning, will certainly be a major key to the project.

  8. Hanford Waste Simulants Created to Support the Research and Development on the River Protection Project - Waste Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Eibling, R.E.

    2001-07-26

    The development of nonradioactive waste simulants to support the River Protection Project - Waste Treatment Plant bench and pilot-scale testing is crucial to the design of the facility. This report documents the simulant development performed to support the SRTC programs and the strategies used to produce the simulants.

  9. Joint implementation - project simulation and organisation. Operationalization of a new instrument of climate policy

    International Nuclear Information System (INIS)

    Luhmann, H.J.; Ott, H.E.; Beuermann, C.; Fischedick, M.; Hennicke, P.; Bakker, L.

    1997-01-01

    The study served to analyze the practical aspects of 'Joint Implementation' (JI) under the Framework Convention on Climate Change, in which measures to reduce greenhouse gas emissions are carried out jointly by several countries. As a first step, possible climate protection measures were assessed with respect to their suitability for JI, and an overview of suitable JI projects was compiled. In a further step, carried out in cooperation with partners from industry, four specific projects (coal-fired power plant, solar-thermal power plant, cement factory, least-cost planning) were used to simulate JI. In this work, solutions were developed for the calculation of avoided greenhouse gas emissions as well as for project reporting. In addition, proposals were made with respect to the organizational and institutional design of an international JI mechanism. (orig.)

  10. Neutron tomography using projection data obtained by Monte Carlo simulation for nondestructive evaluation

    International Nuclear Information System (INIS)

    Silva, A.X. da; Crispim, V.R.

    2002-01-01

    This work presents the application of a computer package for generating projection data for neutron computerized tomography and, in its second part, discusses an application of neutron tomography, using projection data obtained by the Monte Carlo technique, for the detection and localization of light materials such as those containing hydrogen, concealed by heavy materials such as iron and lead. For the tomographic reconstructions of the simulated samples, use was made of only six equal projection angles distributed between 0° and 180°, with reconstruction performed by an algorithm (ARIEM) based on the principle of maximum entropy. With neutron tomography it was possible to detect and locate polyethylene and water hidden behind lead and iron (1 cm thick). It is thus demonstrated that thermal neutron tomography is a viable test method which can provide important information about the interior of test components, making it extremely useful in routine industrial applications. (author)
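
    Sparse-angle reconstruction of this kind can be prototyped quickly. The sketch below is not the ARIEM maximum-entropy code; it uses scikit-image's Radon transform and filtered back-projection on a standard phantom, with six angles between 0° and 180° as in the paper:

        # Simulate six-angle projection data and reconstruct by filtered back-projection
        # (scikit-image illustration; ARIEM itself is a maximum-entropy algorithm).
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        image = rescale(shepp_logan_phantom(), 0.25)          # small test object
        theta = np.linspace(0.0, 180.0, 6, endpoint=False)    # six projection angles
        sinogram = radon(image, theta=theta)                  # simulated projections
        recon = iradon(sinogram, theta=theta)                 # reconstruction
        print(image.shape, sinogram.shape, recon.shape)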

  11. A simulation study on the focal plane detector of the LAUE project

    International Nuclear Information System (INIS)

    Khalil, M.; Frontera, F.; Caroli, E.; Virgilli, E.; Valsan, V.

    2015-01-01

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot drastically improving the signal to noise ratio as well as reducing the required size of the detector significantly. In this paper we present a Monte-Carlo simulation study with MEGALIB to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. Then we will show simulations, using the SILVACO semiconductor simulation toolkit, on the optimized detector to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared; a low density material (Silicon) and a high density material (Germanium).

  12. A simulation study on the focal plane detector of the LAUE project

    Energy Technology Data Exchange (ETDEWEB)

    Khalil, M., E-mail: mkhalil@in2p3.fr [APC Laboratory, 10 rue Alice Domon et Léonie Duquet, 75205 Paris Cedex 13 (France); Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy); Frontera, F. [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy); INAF/IASF-Bologna, Via P. Gobetti 101, Bologna (Italy); Caroli, E. [INAF/IASF-Bologna, Via P. Gobetti 101, Bologna (Italy); Virgilli, E.; Valsan, V. [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy)

    2015-06-21

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot drastically improving the signal to noise ratio as well as reducing the required size of the detector significantly. In this paper we present a Monte-Carlo simulation study with MEGALIB to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. Then we will show simulations, using the SILVACO semiconductor simulation toolkit, on the optimized detector to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared; a low density material (Silicon) and a high density material (Germanium).

  13. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  14. Engineering and training simulators: A combined approach for nuclear plant construction projects

    International Nuclear Information System (INIS)

    Harnois, Olivier; Gain, Pascal; Bartak, Jan; Gathmann, Ralf

    2007-01-01

    Simulation technologies have always been widely used in nuclear applications, but with a clear division between engineering applications, using highly validated codes run in batch mode, and training purposes, where real-time computation is a mandatory requirement. Thanks to the flexibility of modern simulation technology and the increased performance of computers, it now becomes possible to develop nuclear power plant simulators that can be used both for engineering and training purposes. In recent years, the revival of the nuclear industry gave rise to a number of new construction or plant finishing projects in which the application of this combined approach would result in decisive improvements in plant construction lead times, better project control and cost optimization. The simulator development is to be executed in a step-wise approach, scheduled in parallel with the plant design and construction phases. During a first step, the simulator will model the plant nuclear island systems plus the corresponding instrumentation and control, specific malfunctions and local commands. It can then be used for engineering activities defining and validating the plant operating strategies in case of incidents or accidents. The Simulator Executive Station and Operator Station will be in prototype version with interface imagery enabling monitoring and control of the simulator. The availability of such a simulation platform leads to a significant increase in the efficiency of the engineering work, the possibility to validate basic design hypotheses, and the early detection of defects and conflicts. The second phase will consist of the fully detailed simulation of the Main Control Room plant supervision and control MMI, taking into account detailed design improvements of the I&C control loops, while having sufficient fidelity to be suitable for future operator training. Its use will enable the engineering units not only to specify and validate normal, incident and accident detailed plant

  15. Simulation studies for a high resolution time projection chamber at the international linear collider

    Energy Technology Data Exchange (ETDEWEB)

    Muennich, A.

    2007-03-26

    The International Linear Collider (ILC) is planned to be the next large accelerator. The ILC will be able to perform high-precision measurements that are only possible in the clean environment of electron-positron collisions. In order to reach this high accuracy, the requirements for the detector performance are challenging. Several detector concepts are currently under study. The understanding of the detector and its performance will be crucial to extract the desired physics results from the data. To optimise the detector design, simulation studies are needed. Simulation packages like GEANT4 make it possible to model the detector geometry and simulate the energy deposit in the different materials. However, the detector response, taking into account the transport of the produced charge to the readout devices and the effects of the readout electronics, cannot be described in detail. These processes in the detector will change the measured position of the energy deposit relative to the point of origin. The determination of this detector response is the task of detailed simulation studies, which have to be carried out for each subdetector. A high-resolution Time Projection Chamber (TPC) with gas amplification based on micro-pattern gas detectors is one of the options for the main tracking system at the ILC. In the present thesis a detailed simulation tool to study the performance of a TPC was developed. Its goal is to find the optimal settings to reach an excellent momentum and spatial resolution. After an introduction to the present status of particle physics and the ILC project, with special focus on the TPC as central tracker, the simulation framework is presented. The basic simulation methods and implemented processes are introduced. Within this stand-alone simulation framework each electron produced by primary ionisation is transported through the gas volume and amplified using Gas Electron Multipliers (GEMs). The output format of the simulation is identical to the raw data from a
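
    The per-electron transport described here can be caricatured in a few lines: smear each primary electron transversely with a sigma that grows as the square root of the drift length, then draw a roughly exponential single-GEM gain. This is a hedged sketch of the general idea, not the thesis framework; the diffusion coefficient and mean gain are assumed values.

```python
# Minimal per-electron TPC transport sketch: sqrt(z) transverse diffusion
# plus exponential gain fluctuations (illustrative constants, not from the
# thesis).
import numpy as np

rng = np.random.default_rng(0)

def drift_electrons(xy, z_cm, d_t=0.01, mean_gain=1e3):
    """xy: (N, 2) creation points [cm]; z_cm: drift length [cm];
    d_t: transverse diffusion constant [cm/sqrt(cm)] (assumed)."""
    sigma = d_t * np.sqrt(z_cm)                  # sigma grows as sqrt(drift)
    smeared = xy + rng.normal(0.0, sigma, xy.shape)
    gain = rng.exponential(mean_gain, len(xy))   # exponential gain spread
    return smeared, gain

xy0 = np.zeros((100, 2))                # 100 electrons from one cluster
xy, q = drift_electrons(xy0, z_cm=50.0)
print(f"per-axis rms spread after 50 cm drift: {xy.std():.3f} cm, "
      f"total charge: {q.sum():.0f} electron-equivalents")
```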

  16. PROJECTED PRECIPITATION CHANGES IN CENTRAL/EASTERN EUROPE ON THE BASIS OF ENSEMBLE SIMULATIONS

    Directory of Open Access Journals (Sweden)

    Erika Miklos

    2012-03-01

    Full Text Available Projected precipitation changes in Central/Eastern Europe on the basis of ENSEMBLES simulations. For building appropriate local/national adaptation and mitigation strategies, detailed analysis of regional climate change is essential. In order to estimate climate change for the 21st century, both global and regional models may be used. However, due to their coarse horizontal resolution, global climate models are not appropriate for describing regional-scale climate processes. Regional climate models (RCMs), on the other hand, provide more realistic regional climate scenarios. A wide range of RCM experiments was accomplished in the frame of the ENSEMBLES project funded by the EU FP6 programme, which was one of the largest climate change research projects ever completed. All the RCM experiments used 25 km horizontal resolution and the A1B emission scenario, according to which the CO2 concentration by 2100 is estimated to exceed 700 ppm, i.e., more than twice the preindustrial level. The 25 km spatial resolution is fine enough to estimate future hydrology-related conditions in different parts of Europe, from which we separated and analyzed simulated climate data sets for the Central/Eastern European region. Precipitation is an especially important climatological variable because of agricultural aspects and flood-related natural hazards, which may seriously affect all the countries in the evaluated region. On the basis of our results, the different RCM simulations generally project drier summers and wetter winters (compared to the recent decades). The southern countries are more likely to suffer more intense warming, especially in summer, and also more intense drought events due to the stronger Mediterranean influence.

  17. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    Science.gov (United States)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the fidelity of the simulation of cloud amount and the associated cloud radiative effects (CREs) in the historical run driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy of the representation of spatial patterns for the climatological mean, and the annual and interannual variations, of clouds and CREs. The models show large spread in the simulation of cloud amounts, specifically in the low cloud amount. The observed relationship between cloud amount and the controlling large-scale environment is also reproduced diversely by the various models. Based on the validation metrics, four models—ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES—are selected as best models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming with the increase of greenhouse gases, but the magnitude varies across the simulations between 1 and 2 K, which is largely attributable to differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over the regions of the marine boundary layer in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about −0.99 % K⁻¹ and a net radiative warming of 0.46 W m⁻² K⁻¹, suggesting a positive cloud feedback to global warming.

  18. Simulations of Aperture Synthesis Imaging Radar for the EISCAT_3D Project

    Science.gov (United States)

    La Hoz, C.; Belyey, V.

    2012-12-01

    EISCAT_3D is a project to build the next generation of incoherent scatter radars, endowed with multiple 3-dimensional capabilities, that will replace the current EISCAT radars in northern Scandinavia. Aperture Synthesis Imaging Radar (ASIR) is one of the technologies adopted by the EISCAT_3D project to provide imaging capabilities in 3 dimensions, including sub-beam resolution. Complemented by pulse compression, it will provide 3-dimensional images of certain types of incoherent scatter radar targets, resolved to about 100 metres at 100 km range, depending on the signal-to-noise ratio. This ability will open new research opportunities to map small structures associated with non-homogeneous, unstable processes such as aurora, summer and winter polar radar echoes (PMSE and PMWE), Naturally Enhanced Ion Acoustic Lines (NEIALs), structures excited by HF ionospheric heating, meteors, space debris, and others. To demonstrate the feasibility of the antenna configurations and the imaging inversion algorithms, a simulation of synthetic incoherent scattering data has been performed. The simulation algorithm incorporates the ability to control the background plasma parameters with non-homogeneous, non-stationary components over an extended 3-dimensional space. Control over the positions of a number of separated receiving antennas, their signal-to-noise ratios, and arriving phases allows realistic simulation of a multi-baseline interferometric imaging radar system. The resulting simulated data are fed into various inversion algorithms. This simulation package is a powerful tool to evaluate various antenna configurations and inversion algorithms. Results applied to realistic design alternatives of EISCAT_3D will be described.

  19. High performance simulation for the Silva project using the tera computer

    Energy Technology Data Exchange (ETDEWEB)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F. [CS Communication and Systemes, 92 - Clamart (France); Boulet, M.; Scheurer, B. [CEA Bruyeres-le-Chatel, 91 - Bruyeres-le-Chatel (France); Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A. [CEA Saclay, 91 - Gif sur Yvette (France)

    2003-07-01

    In the context of the SILVA project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant-scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high-performance simulation on the TERA computer. We describe the main issues in optimizing the parallelization of the PRODIGE code on TERA, and we discuss the advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it was found fruitful to tune the code in three respects: memory allocation, MPI communications, and interconnection network bandwidth usage. We stress the value of MPI-IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments and present performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high-performance TERA simulation to the project. (authors)

  20. High performance simulation for the Silva project using the tera computer

    International Nuclear Information System (INIS)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F.; Boulet, M.; Scheurer, B.; Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A.

    2003-01-01

    In the context of the SILVA project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant-scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high-performance simulation on the TERA computer. We describe the main issues in optimizing the parallelization of the PRODIGE code on TERA, and we discuss the advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it was found fruitful to tune the code in three respects: memory allocation, MPI communications, and interconnection network bandwidth usage. We stress the value of MPI-IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments and present performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high-performance TERA simulation to the project. (authors)

  1. A toolbox for miRNA analysis

    Czech Academy of Sciences Publication Activity Database

    Svoboda, Petr

    2015-01-01

    Vol. 589, No. 14 (2015), pp. 1694-1701. ISSN 0014-5793. R&D Projects: GA MŠk LO1419. Institutional support: RVO:68378050. Keywords: MicroRNA; Antagomir; Inhibition; Reporter; Locked nucleic acids. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 3.519, year: 2015

  2. III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.

    Science.gov (United States)

    Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L

    2013-08-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.

  3. The panacea toolbox of a PhD biomedical student.

    Science.gov (United States)

    Skaik, Younis

    2014-01-01

    Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends trying to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; luck and genius might also play a small role, but one can pass all PhD phases without having both. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.

  4. Tadarida: A Toolbox for Animal Detection on Acoustic Recordings

    Directory of Open Access Journals (Sweden)

    Yves Bas

    2017-02-01

    Full Text Available Passive Acoustic Monitoring (PAM) has recently been extended to a very wide range of animals, but no available open software has been sufficiently generic to automatically treat several taxonomic groups. Here we present Tadarida, a software toolbox allowing for the detection and labelling of recorded sound events, and for the classification of any new acoustic data into known classes. It is made up of three modules handling Detection, Labelling and Classification, running on either Linux or Windows. This development resulted in the first open software (1) allowing generic sound event detection (multi-taxa), (2) providing graphical sound labelling at a single-instance level, and (3) covering the whole process from sound detection to classification. This generic and modular design opens numerous reuse opportunities among (bio)acoustics researchers, especially those managing and/or developing PAM schemes. The whole toolbox is openly developed in C++ (Detection and Labelling) and R (Classification) and stored at https://github.com/YvesBas.

  5. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    Science.gov (United States)

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the staircase family (method of limits, simple up-down and transformed up-down); the second is Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. If desired, however, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible, as it comes with several signal generators and can easily be extended for any experiment.
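
    As an illustration of the adaptive logic behind the staircase family, the standalone Python sketch below implements a transformed up-down (2-down/1-up) track, which converges near the 70.7 % correct point of the psychometric function; the toolbox itself is MATLAB, and the toy listener here is an assumption for demonstration only.

```python
# Transformed up-down (2-down/1-up) staircase: two consecutive correct
# responses lower the level, any error raises it.
import random

def two_down_one_up(trial, level, step=2.0, n_trials=60):
    """trial(level) -> bool correct; returns the list of visited levels."""
    levels, correct_run = [level], 0
    for _ in range(n_trials):
        if trial(level):
            correct_run += 1
            if correct_run == 2:      # two consecutive correct: go down
                level -= step
                correct_run = 0
        else:                          # any error: go up
            level += step
            correct_run = 0
        levels.append(level)
    return levels

# Toy listener: "detects" the signal when its noisy level exceeds a hidden
# threshold (assumed parameters, illustration only).
hidden = 10.0
listener = lambda lv: random.gauss(lv, 3.0) > hidden
track = two_down_one_up(listener, level=30.0)
print("threshold estimate:", sum(track[-20:]) / 20.0)
```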

  6. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States)

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. FACETS was intended to be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is affected by a number of factors, including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high-performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked/synchronized at regular points in space and time, is the de facto approach to high-performance simulation of multiphysics

  7. Sentinel-3 SAR Altimetry Toolbox - Scientific Exploitation of Operational Missions (SEOM) Program Element

    Science.gov (United States)

    Benveniste, Jérôme; Lucas, Bruno; Dinardo, Salvatore

    2014-05-01

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on a proven heritage pioneered by ERS-1, ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m along track in SAR mode. Sentinel-3 will provide accurate measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the Sentinel-3 series is planned for launch in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions but lacks the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales, the French space agency), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as net

  8. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day before. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, facing known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them as the results of global warming

  9. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    Science.gov (United States)

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  10. The APOSTLE project: Local Group kinematic mass constraints and simulation candidate selection

    Science.gov (United States)

    Fattahi, Azadeh; Navarro, Julio F.; Sawala, Till; Frenk, Carlos S.; Oman, Kyle A.; Crain, Robert A.; Furlong, Michelle; Schaller, Matthieu; Schaye, Joop; Theuns, Tom; Jenkins, Adrian

    2016-03-01

    We use a large sample of isolated dark matter halo pairs drawn from cosmological N-body simulations to identify candidate systems whose kinematics match that of the Local Group (LG) of galaxies. We find, in agreement with the 'timing argument' and earlier work, that the separation and approach velocity of the Milky Way (MW) and Andromeda (M31) galaxies favour a total mass for the pair of ~5 × 10¹² M⊙. A mass this large, however, is difficult to reconcile with the small relative tangential velocity of the pair, as well as with the small deceleration from the Hubble flow observed for the most distant LG members. Halo pairs that match these three criteria have average masses a factor of ~2 times smaller than suggested by the timing argument, but with large dispersion. Guided by these results, we have selected 12 halo pairs with total mass in the range 1.6-3.6 × 10¹² M⊙ for the APOSTLE project (A Project Of Simulating The Local Environment), a suite of hydrodynamical resimulations at various numerical resolution levels (reaching up to ~10⁴ M⊙ per gas particle) that use the subgrid physics developed for the EAGLE project. These simulations reproduce, by construction, the main kinematics of the MW-M31 pair, and produce satellite populations whose overall number, luminosities, and kinematics are in good agreement with observations of the MW and M31 companions. The APOSTLE candidate systems thus provide an excellent testbed to confront directly many of the predictions of the Λ cold dark matter cosmology with observations of our local Universe.
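
    The 'timing argument' quoted here is a closed-form Kepler calculation, sketched numerically below under its usual assumptions (a radial two-body orbit starting from zero separation at the Big Bang); the input separation, approach velocity, and age are rounded literature values, not the paper's exact inputs.

```python
# Timing-argument sketch: solve the radial Kepler orbit
#   r = a(1 - cos e),  t = sqrt(a^3/GM)(e - sin e)
# so that v*t/r = sin(e)(e - sin(e))/(1 - cos(e))^2, with e in (pi, 2*pi)
# while the pair is approaching, then recover the total mass M.
import numpy as np
from scipy.optimize import brentq

G = 6.674e-11                               # [m^3 kg^-1 s^-2]
KPC, KMS, GYR, MSUN = 3.0857e19, 1e3, 3.156e16, 1.989e30

r = 770 * KPC                               # MW-M31 separation (assumed)
v = -109 * KMS                              # approach velocity (assumed)
t = 13.8 * GYR                              # age of the universe

f = lambda e: np.sin(e) * (e - np.sin(e)) / (1.0 - np.cos(e))**2 - v * t / r
ecc_anom = brentq(f, np.pi + 1e-9, 2 * np.pi - 1e-3)

a = r / (1.0 - np.cos(ecc_anom))
M = a**3 * ((ecc_anom - np.sin(ecc_anom)) / t)**2 / G
# prints ~4e12 Msun for these rounded inputs; the paper's adopted values
# give ~5e12
print(f"timing-argument total mass ~ {M / MSUN:.2e} Msun")
```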

  11. fMRI Artefact Rejection and Sleep Scoring Toolbox

    Directory of Open Access Journals (Sweden)

    Yves Leclercq

    2011-01-01

    Full Text Available We started writing the "fMRI artefact rejection and sleep scoring toolbox", or "FAST", to process our sleep EEG-fMRI data, that is, the simultaneous recording of electroencephalographic and functional magnetic resonance imaging data acquired while a subject is asleep. FAST tackles three crucial issues typical of this kind of data: (1) data manipulation (viewing, comparing, chunking, etc.) of long continuous M/EEG recordings, (2) rejection of the fMRI-induced artefact in the EEG signal, and (3) manual sleep-scoring of the M/EEG recording. Currently, the toolbox can efficiently deal with these issues via a GUI, the SPM8 batching system or hand-written scripts. The tools developed are, of course, also useful for other EEG applications, for example those involving simultaneous EEG-fMRI acquisition, continuous EEG eye-balling, and manipulation. Even though the toolbox was originally devised for EEG data, it will also gracefully handle MEG data without any problem. "FAST" is developed in Matlab as an add-on toolbox for SPM8 and, therefore, internally uses its SPM8-meeg data format. "FAST" is available for free, under the GNU GPL.
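
    The fMRI-induced gradient artefact is commonly removed by average artefact subtraction: epoch the EEG on scanner volume triggers, average neighbouring occurrences into a template, and subtract it from each occurrence. The sketch below shows that general principle only and is not FAST's actual implementation; triggers are assumed to leave room for a full epoch of samples.

```python
# Average-artefact-subtraction sketch (the general AAS idea, not FAST's
# code): a running local template of the scanner artefact is subtracted.
import numpy as np

def subtract_artifact(eeg, triggers, epoch_len, window=20):
    """eeg: 1-D array; triggers: sample indices of volume onsets, each with
    at least epoch_len samples of signal after it (assumed)."""
    out = eeg.copy()
    epochs = np.array([eeg[t:t + epoch_len] for t in triggers])
    for i, t in enumerate(triggers):
        lo = max(0, i - window // 2)
        hi = min(len(triggers), i + window // 2 + 1)
        template = epochs[lo:hi].mean(axis=0)   # local average artefact
        out[t:t + epoch_len] -= template
    return out
```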

  12. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites, and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  13. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori Ito

    2017-02-01

    Full Text Available Quantifying lesions in a robust manner is fundamental for studying the effects of neuroanatomical changes in the post-stroke brain on recovery. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. We developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.267213) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space, to facilitate analyses in either space or comparisons between spaces. Here, we describe the methods implemented in the toolbox and demonstrate the outputs of the SRQL toolbox.
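
    Step (1) can be pictured with a small mask-correction sketch: drop lesion voxels whose T1 intensity falls in the healthy white-matter range. The threshold logic below is an illustrative assumption; the actual criterion lives in the SRQL repository linked above.

```python
# Illustrative white-matter intensity correction: remove lesion voxels that
# look like healthy white matter (bright on T1). Not SRQL's exact rule.
import numpy as np

def wm_correct(lesion_mask, t1, wm_mask, n_sd=1.0):
    """lesion_mask, wm_mask: boolean arrays; t1: intensity volume.
    Drops lesion voxels at or above (mean - n_sd * sd) of healthy WM."""
    wm_vals = t1[wm_mask & ~lesion_mask]          # healthy WM intensities
    cutoff = wm_vals.mean() - n_sd * wm_vals.std()
    return lesion_mask & (t1 < cutoff)            # keep only dark voxels
```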

  14. Simulated hydrologic response to projected changes in precipitation and temperature in the Congo River basin

    Directory of Open Access Journals (Sweden)

    N. Aloysius

    2017-08-01

    Full Text Available Despite their global significance, the impacts of climate change on water resources and associated ecosystem services in the Congo River basin (CRB) have been understudied. Of particular need for decision makers is the availability of spatial and temporal variability of runoff projections. Here, with the aid of a spatially explicit hydrological model forced with precipitation and temperature projections from 25 global climate models (GCMs) under two greenhouse gas emission scenarios, we explore the variability in modeled runoff in the near future (2016–2035) and mid-century (2046–2065). We find that total runoff from the CRB is projected to increase by 5 % [−9 %; 20 %] (mean, and min and max across model ensembles) over the next two decades and by 7 % [−12 %; 24 %] by mid-century. Projected changes in runoff from subwatersheds distributed within the CRB vary in magnitude and sign. Over the equatorial region and in parts of the northern and southwestern CRB, most models project an overall increase in precipitation and, subsequently, runoff. A simulated decrease in precipitation leads to a decline in runoff from headwater regions located in the northeastern and southeastern CRB. Climate model selection plays an important role in future projections for both the magnitude and the direction of change. The multimodel ensemble approach reveals that precipitation and runoff changes under business-as-usual and avoided greenhouse gas emission scenarios (RCP8.5 vs. RCP4.5) are relatively similar in the near term but deviate in the midterm, which underscores the need for rapid action on climate change adaptation. Our assessment demonstrates the need to include uncertainties in climate model and emission scenario selection during decision-making processes related to climate change mitigation and adaptation.
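
    For readers unfamiliar with the bracketed notation in this record, the projection is reported as the ensemble mean with the minimum and maximum across GCMs in brackets; a tiny sketch with made-up per-model values:

```python
# How the "5 % [-9 %; 20 %]" style summary is assembled: one percent-change
# value per GCM (numbers below are invented for illustration).
import numpy as np

pct_change = np.array([5.1, -8.7, 19.6, 7.3, 2.0, 11.4])  # one value per GCM
print(f"{pct_change.mean():.0f} % "
      f"[{pct_change.min():.0f} %; {pct_change.max():.0f} %]")
```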

  15. Simulated hydrologic response to projected changes in precipitation and temperature in the Congo River basin

    Science.gov (United States)

    Aloysius, Noel; Saiers, James

    2017-08-01

    Despite their global significance, the impacts of climate change on water resources and associated ecosystem services in the Congo River basin (CRB) have been understudied. Of particular need for decision makers is the availability of spatial and temporal variability of runoff projections. Here, with the aid of a spatially explicit hydrological model forced with precipitation and temperature projections from 25 global climate models (GCMs) under two greenhouse gas emission scenarios, we explore the variability in modeled runoff in the near future (2016-2035) and mid-century (2046-2065). We find that total runoff from the CRB is projected to increase by 5 % [-9 %; 20 %] (mean, and min and max across model ensembles) over the next two decades and by 7 % [-12 %; 24 %] by mid-century. Projected changes in runoff from subwatersheds distributed within the CRB vary in magnitude and sign. Over the equatorial region and in parts of the northern and southwestern CRB, most models project an overall increase in precipitation and, subsequently, runoff. A simulated decrease in precipitation leads to a decline in runoff from headwater regions located in the northeastern and southeastern CRB. Climate model selection plays an important role in future projections for both the magnitude and the direction of change. The multimodel ensemble approach reveals that precipitation and runoff changes under business-as-usual and avoided greenhouse gas emission scenarios (RCP8.5 vs. RCP4.5) are relatively similar in the near term but deviate in the midterm, which underscores the need for rapid action on climate change adaptation. Our assessment demonstrates the need to include uncertainties in climate model and emission scenario selection during decision-making processes related to climate change mitigation and adaptation.

  16. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal

    Science.gov (United States)

    Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Baldoli, Ilaria; Bellanti, Lisa; Gentile, Marzia; Cecchi, Francesca; Sigali, Emilio; Tognarelli, Selene; Ghirri, Paolo; Mazzoleni, Stefano; Menciassi, Arianna; Cuttano, Armando; Boldrini, Antonio; Laschi, Cecilia; Dario, Paolo

    2013-01-01

    Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still widely reported. Therefore, continuous medical education is mandatory to correctly manage the devices used for assistance. Commercially available breathing-function simulators rarely reproduce the relevant anatomical and physiological realities. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three phases: (1) a review study on respiratory physiology and pathophysiology and on already available single- and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation. PMID:23966804
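
    Simulators of this kind build on the single-compartment equation of motion of the lung, P_aw = R·Q + V/C above PEEP. The sketch below integrates that model through one pressure-controlled breath; the neonatal-order resistance and compliance values are assumed for illustration only and are not MERESSINA parameters.

```python
# One-compartment lung model integrated through a single pressure-controlled
# breath; R and C are assumed, neonatal-order values.
R = 0.05          # airway resistance [cmH2O per mL/s] (assumed)
C = 1.5           # compliance [mL/cmH2O] (assumed)
PIP, PEEP = 20.0, 5.0                    # ventilator pressures [cmH2O]
dt, t_insp, t_total = 0.001, 0.4, 1.0    # time step and durations [s]

V, v_max = 0.0, 0.0                      # volume above FRC at PEEP [mL]
for step in range(int(t_total / dt)):
    p_drive = (PIP - PEEP) if step * dt < t_insp else 0.0
    flow = (p_drive - V / C) / R         # equation of motion above PEEP
    V += flow * dt
    v_max = max(v_max, V)
print(f"tidal volume ~ {v_max:.1f} mL")  # approaches C * (PIP - PEEP)
```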

  17. Simulant composition for the Mixed Waste Management Facility (MWMF) groundwater remediation project

    International Nuclear Information System (INIS)

    Siler, J.L.

    1992-01-01

    A project has been initiated at the request of ER to study and remediate the groundwater contamination at the Mixed Waste Management Facility (MWMF). This water contains a wide variety of both inorganics (e.g., sodium) and organics (e.g., benzene, trichloroethylene). Most compounds are present in the ppb range, and certain components (e.g., trichloroethylene, silver) are present at concentrations that exceed the primary drinking water standards (PDWS). These compounds must be reduced to acceptable levels as per RCRA and CERCLA orders. This report gives a listing of the important constituents to be included in a simulant to model the MWMF aquifer. This simulant will be used to evaluate the feasibility of various state-of-the-art separation/destruction processes for remediating the aquifer.

  18. An application of MCMC simulation in mortality projection for populations with limited data

    Directory of Open Access Journals (Sweden)

    Jackie Li

    2014-01-01

    Full Text Available Objective: In this paper, we investigate the use of Bayesian modeling and Markov chain Monte Carlo (MCMC) simulation, via the software WinBUGS, to project future mortality for populations with limited data. In particular, we adapt some extensions of the Lee-Carter method under the Bayesian framework to allow for situations in which mortality data are scarce. Our approach would be useful for certain developing nations that have not been regularly collecting death counts and population statistics. Inferences for the model estimates and forecasts can readily be drawn from the simulated samples. Information on another population resembling the population under study can be exploited and incorporated into the prior distributions in order to facilitate the construction of probability intervals. The two sets of data can also be modeled in a joint manner. We demonstrate an application of this approach to some data from China and Taiwan.
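
    The Bayesian extensions adapted here sit on top of the Lee-Carter structure log m(x,t) = a_x + b_x·k_t. As a point of reference, the following sketch shows the classical SVD fit of that model on toy data; the MCMC version replaces this point estimate with posterior sampling in WinBUGS.

```python
# Classical Lee-Carter fit via SVD: log m(x,t) = a_x + b_x * k_t, with the
# usual normalisation sum(b_x) = 1. Toy data only.
import numpy as np

def lee_carter(log_m):
    """log_m: (ages, years) matrix of log death rates."""
    a = log_m.mean(axis=1)                        # age pattern a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b = U[:, 0] / U[:, 0].sum()                   # normalise sum(b) = 1
    k = s[0] * Vt[0] * U[:, 0].sum()              # mortality index k_t
    return a, b, k

rng = np.random.default_rng(1)
ages, years = 10, 30
true_k = -0.05 * np.arange(years)                 # improving mortality
log_m = ((-4 + 0.1 * np.arange(ages))[:, None]
         + 0.1 * true_k
         + rng.normal(0, 0.01, (ages, years)))
a, b, k = lee_carter(log_m)
drift = (k[-1] - k[0]) / (len(k) - 1)             # random-walk-with-drift step
print(f"estimated drift in k_t: {drift:.3f}")     # ~ -0.05 for this toy data
```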

  19. Prognostic simulation of reinjection-research project geothermal site Neustadt-Glewe/Germany

    Energy Technology Data Exchange (ETDEWEB)

    Poppei, J. [Geothermie Neubrandenburg GmbH (Germany)

    1995-03-01

    For the first time after the political and economic changes in Germany, a hydrothermal site was put into operation in December 1994. Due to prevailing conditions extraordinary for Central Europe (reservoir temperature 99°C; 220 g/l salinity), the Neustadt-Glewe project is supported by a comprehensive research programme. The wells concerned (a doublet with an internal distance of 1,400 m) open the porous sandstone aquifer, with an average thickness of about 53 m, at a depth of 2,240 m. One point of interest was the pressure and temperature behavior over a period of 10 years, considering the fluid viscosity changes due to variable injection temperature. For reservoir simulation and for prognosing the injection behavior, the simulator code TOUGH2 was used.

  20. Design and simulation of the rotating test rig in the INDUFLAP project

    DEFF Research Database (Denmark)

    Barlas, Thanasis K.; Aagaard Madsen, Helge; Løgstrup Andersen, Tom

    The general description and objectives of the rotating test rig at the Risø campus of DTU are presented, as used for the aeroelastic testing of a controllable rubber trailing edge flap (CRTEF) system in the INDUFLAP project. The design of all new components is presented, including the electrical drive, the pitch system, the boom, and the wing/flap section. The overall instrumentation of the components used for the aeroelastic testing is described. Moreover, the aeroelastic model simulating the setup is described, and predictions of steady and dynamic loading along with the aeroelastic analysis...

  1. Study and application of microscopic depletion model in core simulator of COSINE project

    International Nuclear Information System (INIS)

    Hu Xiaoyu; Wang Su; Yan Yuhang; Liu Zhanquan; Chen Yixue; Huang Kai

    2013-01-01

    Microscopic depletion correction is one of the techniques commonly used to alleviate the inaccuracy caused by history effects and to attain higher precision in diffusion calculations. The core simulator of the COSINE project (core and system integrated engine for design and analysis) has developed a hybrid macroscopic-microscopic depletion model to track important isotopes during each depletion history and correct the macroscopic cross sections. The basic theory is discussed in this paper. The effect and results of the microscopic depletion correction are also analyzed. The preliminary test results demonstrate that the microscopic depletion model is effective and practicable for improving the precision of core calculations. (authors)
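
    The tracking step can be illustrated with the simplest possible microscopic depletion chain, iodine-135 to xenon-135, where each number density obeys dN/dt = production − (λ + σ_a·φ)·N and the tracked densities then rebuild a macroscopic cross-section correction. The one-group constants and flux below are textbook-order values, not COSINE data.

```python
# Minimal microscopic depletion sketch: track an I-135 -> Xe-135 chain and
# rebuild the macroscopic absorption correction Sigma_a = sum(N_i * sigma_i).
sigma_xe = 2.6e6 * 1e-24            # Xe-135 absorption cross section [cm^2]
lam_i, lam_xe = 2.93e-5, 2.09e-5    # decay constants [1/s]
gamma_i = 0.064                     # effective fission yield of the chain
phi = 3e13                          # one-group flux [n/cm^2/s] (assumed)
sigma_f_macro = 0.05                # macroscopic fission XS [1/cm] (assumed)

n_i = n_xe = 0.0                    # number densities [1/cm^3]
dt = 60.0
for _ in range(int(48 * 3600 / dt)):            # 48 h of constant-flux burnup
    d_i = gamma_i * sigma_f_macro * phi - lam_i * n_i
    d_xe = lam_i * n_i - (lam_xe + sigma_xe * phi) * n_xe
    n_i, n_xe = n_i + d_i * dt, n_xe + d_xe * dt

print(f"Xe-135 density ~ {n_xe:.2e} cm^-3, "
      f"added Sigma_a ~ {n_xe * sigma_xe:.2e} cm^-1")
```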

  2. Parallel Sequential Monte Carlo for Efficient Density Combination: The Deco Matlab Toolbox

    DEFF Research Database (Denmark)

    Casarin, Roberto; Grassi, Stefano; Ravazzolo, Francesco

    This paper presents the Matlab package DeCo (Density Combination), which is based on the paper by Billio et al. (2013), where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights ... for standard CPU computing and for Graphics Processing Unit (GPU) parallel computing. For the GPU implementation we use the Matlab Parallel Computing Toolbox and show how to use general-purpose GPU computing almost effortlessly. This GPU implementation comes with a speed-up of the execution time of up to seventy times compared to a standard CPU Matlab implementation on a multicore CPU. We show the use of the package and the computational gain of the GPU version through some simulation experiments and empirical applications.

  3. Coupled Model Intercomparison Project 5 (CMIP5) simulations of climate following volcanic eruptions

    KAUST Repository

    Driscoll, Simon; Bozzo, Alessio; Gray, Lesley J.; Robock, Alan; Stenchikov, Georgiy L.

    2012-01-01

    The ability of the climate models submitted to the Coupled Model Intercomparison Project 5 (CMIP5) database to simulate the Northern Hemisphere winter climate following a large tropical volcanic eruption is assessed. When sulfate aerosols produced by volcanic injections into the tropical stratosphere are spread by the stratospheric circulation, they cause not only globally averaged tropospheric cooling but also localized heating in the lower stratosphere, which can cause major dynamical feedbacks. Observations show a lower-stratospheric and surface response during the following one or two Northern Hemisphere (NH) winters that resembles the positive phase of the North Atlantic Oscillation (NAO). Simulations from 13 CMIP5 models that represent tropical eruptions in the 19th and 20th centuries are examined, focusing on the large-scale regional impacts associated with the large-scale circulation during the NH winter season. The models generally fail to capture the NH dynamical response following eruptions. They do not sufficiently simulate the observed post-volcanic strengthened NH polar vortex, positive NAO, or NH Eurasian warming pattern, and they tend to overestimate the cooling in the tropical troposphere. The findings are confirmed by a superposed epoch analysis of the NAO index for each model. The study confirms previous similar evaluations and raises concern for the ability of current climate models to simulate the response of a major mode of global circulation variability to external forcings. This is also of concern for the accuracy of geoengineering modeling studies that assess the atmospheric response to stratosphere-injected particles.
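
    The superposed epoch analysis mentioned here is simple to state: stack windows of the winter NAO index centred on each eruption year and average them, so eruption-locked signals survive while unrelated variability cancels. A minimal sketch with synthetic data:

```python
# Superposed epoch analysis: composite an index around known event years.
import numpy as np

def superposed_epoch(series, event_idx, before=5, after=5):
    """series: 1-D array (one value per year); event_idx: event years."""
    windows = [series[i - before:i + after + 1] for i in event_idx
               if i - before >= 0 and i + after < len(series)]
    return np.mean(windows, axis=0)   # composite for years -before..+after

rng = np.random.default_rng(3)
nao = rng.normal(0.0, 1.0, 150)       # synthetic yearly NAO index
eruptions = [20, 63, 101, 130]
for yr in eruptions:                  # impose a +1 anomaly for 2 post-winters
    nao[yr + 1:yr + 3] += 1.0
print(np.round(superposed_epoch(nao, eruptions), 2))
```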

  4. Coupled Model Intercomparison Project 5 (CMIP5) simulations of climate following volcanic eruptions

    KAUST Repository

    Driscoll, Simon

    2012-09-16

    The ability of the climate models submitted to the Coupled Model Intercomparison Project 5 (CMIP5) database to simulate the Northern Hemisphere winter climate following a large tropical volcanic eruption is assessed. When sulfate aerosols produced by volcanic injections into the tropical stratosphere are spread by the stratospheric circulation, they cause not only globally averaged tropospheric cooling but also localized heating in the lower stratosphere, which can cause major dynamical feedbacks. Observations show a lower-stratospheric and surface response during the following one or two Northern Hemisphere (NH) winters that resembles the positive phase of the North Atlantic Oscillation (NAO). Simulations from 13 CMIP5 models that represent tropical eruptions in the 19th and 20th centuries are examined, focusing on the large-scale regional impacts associated with the large-scale circulation during the NH winter season. The models generally fail to capture the NH dynamical response following eruptions. They do not sufficiently simulate the observed post-volcanic strengthened NH polar vortex, positive NAO, or NH Eurasian warming pattern, and they tend to overestimate the cooling in the tropical troposphere. The findings are confirmed by a superposed epoch analysis of the NAO index for each model. The study confirms previous similar evaluations and raises concern for the ability of current climate models to simulate the response of a major mode of global circulation variability to external forcings. This is also of concern for the accuracy of geoengineering modeling studies that assess the atmospheric response to stratosphere-injected particles.

  5. Cesium uptake capacity of simulated ferrocyanide tank waste. Interim report FY 1994, Ferrocyanide Safety Project

    International Nuclear Information System (INIS)

    Burgeson, I.E.; Bryan, S.A.; Burger, L.E.

    1994-09-01

    The objective of this project is to determine the capacity for ¹³⁷Cs uptake by the mixed metal ferrocyanides present in Hanford waste tanks, and to assess the potential for aggregation of these ¹³⁷Cs-exchanged materials to form tank "hot-spots". This research, performed at the Pacific Northwest Laboratory (PNL) for the Westinghouse Hanford Company (WHC), stems from concerns about possible localized radiolytic heating within the tanks. If radioactive cesium is exchanged and concentrated by the nickel ferrocyanide remaining in the tanks, this heating could cause temperatures to rise above the safety limits specified for the ferrocyanide tanks. For the purposes of this study, two simulants, In-Farm-2 and U-Plant-2, were chosen to represent the wastes generated by the scavenging processes. These simulants were formulated using protocols from the original cesium scavenging campaign. Later additions of cesium-rich wastes from various processes were also considered. The simulants were prepared and centrifuged to obtain a moist ferrocyanide sludge. The centrifuged sludges were treated with the original supernate spiked with a known amount of cesium nitrate. After analysis by flame atomic absorption spectrometry, distribution coefficients (Kd) were calculated. The capacity of the solid waste simulants to exchange radioactive cesium from solution was examined. Initial results showed that the greater the molar ratio of cesium to cesium nickel ferrocyanide, the less effective the exchange of cesium from solution. The theoretical capacity of 2 mol cesium per mol of nickel ferrocyanide was not observed; the maximum capacity under experimental conditions was 0.35 mol cesium per mol nickel ferrocyanide. Future work on this project will examine the layering tendency of the cesium nickel ferrocyanide species
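
    For reference, a distribution coefficient of the kind reported here is conventionally computed from the drop in solution concentration, Kd = ((C0 − Ceq)/Ceq)·(V/m). A one-function sketch with invented numbers:

```python
# Batch-sorption distribution coefficient: sorbed-to-dissolved ratio scaled
# by the solution-to-solid ratio. All input values below are illustrative.
def k_d(c0_ug_ml, ceq_ug_ml, v_ml, m_g):
    """Kd [mL/g] = ((C0 - Ceq) / Ceq) * (V / m)."""
    return (c0_ug_ml - ceq_ug_ml) / ceq_ug_ml * (v_ml / m_g)

# 100 mL of solution dropping from 10 to 0.5 ug/mL over 1 g of solid:
print(f"Kd = {k_d(10.0, 0.5, 100.0, 1.0):.0f} mL/g")   # 1900 mL/g
```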

  6. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    Science.gov (United States)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

    NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat-plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy, and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for a non-flat sail provided by L'Garde was also tested and compared to a flat-plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.

  7. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies.

    Science.gov (United States)

    Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross

    2016-06-01

    To develop and evaluate a fast and simple tool called dpetstep (Dynamic PET Simulator of Tracers via Emission Projection) for dynamic PET simulations, as an alternative to Monte Carlo (MC) simulation, useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in MATLAB using both new and previously reported modules of petstep (PET Simulator of Tracers via Emission Projection). Time-activity curves are generated for each voxel of the input parametric image, whereby the effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time-activity curves (GAUSS). dpetstep was 8000 times faster than MC. Dynamic images from dpetstep had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dpetstep and MC images was the same, while GAUSS differed by 3% points. Noise profiles in dpetstep images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region-of-interest histogram comparisons that showed no significant differences. The authors conclude that dpetstep can simulate dynamic PET and parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dpetstep to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models, it may not be suitable for studies investigating these phenomena. dpetstep can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
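
    The per-voxel pipeline described (time-activity curves, system blurring, counting noise) can be caricatured in a few lines of Python. This is a hedged mini-version of the idea, not dpetstep's code: reconstruction is omitted, and the half-life, PSF width, and sensitivity below are assumed values.

```python
# Toy dynamic PET frame simulator: decay-weight each frame, blur with a
# Gaussian PSF, and apply Poisson counting noise. Not dpetstep itself.
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_frames(tacs, frame_dur_s, halflife_s=6586.2, psf_sigma=1.5,
                    sens=50.0, rng=np.random.default_rng(0)):
    """tacs: (frames, ny, nx) noise-free activity; returns noisy frames.
    halflife_s defaults to F-18 (assumed tracer); sens is an arbitrary
    counts-per-activity scale."""
    t = np.cumsum(frame_dur_s) - frame_dur_s / 2        # frame mid-times [s]
    decay = np.exp(-np.log(2) * t / halflife_s)
    frames = []
    for f, img in enumerate(tacs):
        expected = gaussian_filter(img * decay[f], psf_sigma) * sens
        frames.append(rng.poisson(expected) / sens)     # counting noise
    return np.array(frames)

tacs = np.ones((6, 32, 32))                             # flat toy phantom
noisy = simulate_frames(tacs, frame_dur_s=np.full(6, 300.0))
print("frame means:", np.round(noisy.mean(axis=(1, 2)), 2))
```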

  8. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-pronged approach comprising a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control, such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  9. Rainfall and Extratropical Transition of Tropical Cyclones: Simulation, Prediction, and Projection

    Science.gov (United States)

    Liu, Maofeng

    Rainfall and the associated flood hazards are among the major threats tropical cyclones (TCs) pose to coastal and inland regions. The interaction of TCs with extratropical systems can lead to enhanced precipitation over enlarged areas through extratropical transition (ET). To achieve a comprehensive understanding of rainfall and ET associated with TCs, this thesis conducts weather-scale analyses focusing on individual storms and climate-scale analyses focusing on seasonal predictability and the changing properties of the climatology under global warming. The temporal and spatial rainfall evolution of individual storms, including Hurricane Irene (2011), Hurricane Hanna (2008), and Hurricane Sandy (2012), is explored using the Weather Research and Forecasting (WRF) model and a variety of hydrometeorological datasets. ET and orographic mechanisms are two key players in the rainfall distribution of Irene over the regions that experienced the most severe flooding. The change of TC rainfall under global warming is explored with the Forecast-oriented Low Ocean Resolution (FLOR) climate model under the representative concentration pathway (RCP) 4.5 scenario. Despite decreased TC frequency, FLOR projects increased landfalling TC rainfall over most regions of the eastern United States, highlighting the risk of increased flood hazards. Increased storm rain rate is an important contributor to increased landfalling TC rainfall. A higher-atmospheric-resolution version of FLOR (HiFLOR) projects increased TC rainfall at global scales. The increase of TC intensity and the increase of environmental water vapor content, scaled by the Clausius-Clapeyron relation, are two key factors that explain the projected increase of TC rainfall. Analyses of the simulation, prediction, and projection of ET activity with FLOR are conducted for the North Atlantic. The FLOR model exhibits good skill in simulating many aspects of the present-day ET climatology. The 21st-century projection under the RCP4.5 scenario demonstrates the dominant role of ET

  10. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
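
    For a single-facet persons-by-trials ERP design, the G coefficient the text refers to can be assembled from ANOVA mean squares. The sketch below is a minimal standalone illustration of that calculation; the ERA Toolbox itself is MATLAB and handles far more general designs, and the simulated effect sizes here are arbitrary.

```python
# Single-facet G-theory sketch: estimate variance components from mean
# squares in a persons x trials design, then form the relative G coefficient
# G = var_p / (var_p + var_pt / n_trials).
import numpy as np

def g_coefficient(scores):
    """scores: (persons, trials) array of single-trial scores."""
    n_p, n_t = scores.shape
    grand = scores.mean()
    ms_p = n_t * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
    resid = scores - scores.mean(1, keepdims=True) - scores.mean(0) + grand
    ms_pt = np.sum(resid ** 2) / ((n_p - 1) * (n_t - 1))
    var_p = max((ms_p - ms_pt) / n_t, 0.0)   # person (true score) variance
    var_pt = ms_pt                           # person x trial error variance
    return var_p / (var_p + var_pt / n_t)

rng = np.random.default_rng(7)
true_amp = rng.normal(5.0, 2.0, size=(40, 1))        # stable person effects
scores = true_amp + rng.normal(0.0, 3.0, size=(40, 30))
print(f"G = {g_coefficient(scores):.2f}")            # ~0.93 for these sizes
```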

  11. BIM-Based 4D Simulation to Improve Module Manufacturing Productivity for Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Joosung Lee

    2017-03-01

    Full Text Available Modular construction methods, where products are manufactured beforehand in a factory and then transported to the site for installation, are becoming increasingly popular for construction projects in many countries as this method facilitates the use of the advanced technologies that support sustainability in building projects. This approach requires dual factory–site process management to be carefully coordinated and the factory module manufacturing process must therefore be managed in a detailed and quantitative manner. However, currently, the limited algorithms available to support this process are based on mathematical methodologies that do not consider the complex mix of equipment, factories, personnel, and materials involved. This paper presents three new building information modeling-based 4D simulation frameworks to manage the three elements—process, quantity, and quality—that determine the productivity of factory module manufacturing. These frameworks leverage the advantages of 4D simulation and provide more precise information than existing conventional documents. By utilizing a 4D model that facilitates the visualization of a wide range of data variables, manufacturers can plan the module manufacturing process in detail and fully understand the material, equipment, and workflow needed to accomplish the manufacturing tasks. Managers can also access information about material quantities for each process and use this information for earned value management, warehousing/storage, fabrication, and assembly planning. By having a 4D view that connects 2D drawing models, manufacturing errors and rework can be minimized and problems such as construction delays, quality lapses, and cost overruns vastly reduced.

  12. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    Science.gov (United States)

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables that accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.
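
    The downscaling step described above follows the general delta-change pattern: projected changes are imposed on a measured baseline series, so different baselines yield different scenario series. A minimal sketch with hypothetical numbers (additive deltas for temperature, multiplicative ratios for precipitation, the usual convention):

      import numpy as np

      # Hypothetical measured monthly baseline climate.
      baseline_t = np.array([-2.0, 0.5, 4.0, 9.0, 14.0, 18.0,
                             21.0, 20.0, 16.0, 10.0, 4.0, -1.0])   # deg C
      baseline_p = np.array([90, 80, 75, 60, 50, 30,
                             20, 25, 45, 70, 85, 95], dtype=float)  # mm
      delta_t = np.full(12, 2.1)    # GCM-projected warming, deg C (hypothetical)
      ratio_p = np.full(12, 0.95)   # GCM-projected precipitation ratio

      scenario_t = baseline_t + delta_t
      scenario_p = baseline_p * ratio_p

      # Choosing a different baseline period changes these scenario series,
      # which is the sensitivity the study evaluates.
      print(scenario_t.round(1))
      print(scenario_p.round(1))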

  13. A Preliminary Study on Time Projection Chamber Simulation for Fission Cross Section Measurements with Geant4

    International Nuclear Information System (INIS)

    Kim, Jong Woon; Lee, Youngouk; Kim, Jae Cheon

    2014-01-01

    We present the details of the TPC simulation with Geant4 and show the results. A TPC can provide more information than a fission chamber in that it is possible to distinguish different particle types. Simulations are conducted for uranium and plutonium targets with 20 MeV neutrons. The simulation results are compared with the reference data and show reasonable agreement. This is the first phase of a study aimed at realizing a TPC in the NSF at RAON, and more work remains, such as applying an electric field, signal processing in the simulation, and manufacturing of a TPC. The standard instrument for fission cross section measurement is a fission chamber. It is basically just two parallel plates separated by a few centimeters of gas. A power supply connected to the plates sets up a moderate electric field. The target is deposited onto one of the plates. When fission occurs, the fragments ionize the gas, and the electric field causes the produced electrons to drift to the opposite plate, which records the total energy deposited in the chamber. A Time Projection Chamber (TPC) is a gas ionization detector similar to a fission chamber. However, it can measure charged particle trajectories in the active volume in three dimensions by adding several readouts on the pad plane (a fission chamber has only one readout on the pad plane). The specific ionization of each particle track enables the TPC to distinguish different particle types. A TPC will be used for fission cross section measurements in the Neutron Science Facility (NSF) at RAON. As a preliminary study, we present details of the TPC simulation with Geant4 and discuss the results.
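
    Two of the TPC principles described above, recovering the drift coordinate from electron arrival time and tagging particle type by specific ionization, can be sketched as follows; the drift velocity, hit values, and dE/dx cut are illustrative assumptions, not values from the study:

      # The pad plane gives (x, y); the drift time gives the third
      # coordinate via the known drift velocity (hypothetical value).
      DRIFT_VELOCITY_CM_PER_US = 0.8

      def drift_z(t_arrival_us, t0_us=0.0):
          """Reconstruct the drift coordinate from electron arrival time."""
          return (t_arrival_us - t0_us) * DRIFT_VELOCITY_CM_PER_US

      hits = [(1.2, 0.4, 3.5), (1.3, 0.5, 3.9), (1.4, 0.6, 4.3)]  # (x, y, t_us)
      track = [(x, y, drift_z(t)) for x, y, t in hits]
      print(track)

      # Specific ionization separates species: a heavy, highly charged
      # fission fragment deposits far more energy per unit length than a
      # light particle, so even a simple cut can tag the particle type.
      def classify(dedx_kev_per_cm):
          return "fission fragment" if dedx_kev_per_cm > 500 else "light particle"

      print(classify(8000.0), classify(60.0))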

  14. Use of Simulation in Nursing Education: Initial Experiences on a European Union Lifelong Learning Programme--Leonardo Da Vinci Project

    Science.gov (United States)

    Terzioglu, Fusun; Tuna, Zahide; Duygulu, Sergul; Boztepe, Handan; Kapucu, Sevgisun; Ozdemir, Leyla; Akdemir, Nuran; Kocoglu, Deniz; Alinier, Guillaume; Festini, Filippo

    2013-01-01

    Aim: The aim of this paper is to share the initial experiences on a European Union (EU) Lifelong Learning Programme Leonardo Da Vinci Transfer of Innovation Project related to the use of simulation-based learning with nursing students from Turkey. The project started at the end of 2010 involving 7 partners from 3 different countries including…

  15. Causal factors of low stakeholder engagement : A survey of expert opinions in the context of healthcare simulation projects

    NARCIS (Netherlands)

    Jahangirian, Mohsen; Borsci, Simone; Shah, Syed Ghulam Sarwar; Taylor, Simon J.E.

    2015-01-01

    While simulation methods have proved very effective in identifying efficiency gains, low stakeholder engagement significantly limits what simulation modeling projects achieve in practice. This study reports causal factors—at two hierarchical levels (i.e., primary and

  16. Algal Biology Toolbox Workshop Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-08-01

    DOE-EERE's Bioenergy Technologies Office (BETO) works to accelerate the development of a sustainable, cost-competitive, advanced biofuel industry that can strengthen U.S. energy security, environmental quality, and economic vitality, through research, development, and demonstration projects in partnership with industry, academia, and national laboratory partners. BETO’s Advanced Algal Systems Program (also called the Algae Program) has a long-term applied research and development (R&D) strategy to increase the yields and lower the costs of algal biofuels. The team works with partners to develop new technologies, to integrate technologies at commercially relevant scales, and to conduct crosscutting analyses to better understand the potential and challenges of the algal biofuels industry. Research has indicated that this industry is capable of producing billions of gallons of renewable diesel, gasoline, and jet fuels annually. R&D activities are integrated with BETO’s longstanding effort to accelerate the commercialization of lignocellulosic biofuels.

  17. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies

    Energy Technology Data Exchange (ETDEWEB)

    Häggström, Ida, E-mail: haeggsti@mskcc.org [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 and Department of Radiation Sciences, Umeå University, Umeå 90187 (Sweden); Beattie, Bradley J.; Schmidtlein, C. Ross [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States)

    2016-06-15

    Purpose: To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. Methods: The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time activity curves (GAUSS). Results: dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. Conclusions: The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results, however since it uses simple scatter and random models it may not be suitable for
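
    A minimal sketch of the frame-by-frame pipeline the abstract describes: per-voxel time activity curves, imaging-system (PSF) blurring, and Poisson counting noise. The toy one-tissue kinetic model, image size, and count scaling below are assumptions, not dPETSTEP's actual implementation:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(1)

      # Hypothetical parametric image of uptake rates K1 for a 32x32 slice.
      k1 = rng.uniform(0.05, 0.3, size=(32, 32))
      frames = np.linspace(0.5, 30.0, 10)      # frame mid-times, minutes
      plasma = np.exp(-0.1 * frames)           # toy input function

      # Time activity curve per voxel (toy model: activity proportional to
      # K1 times the cumulated input), then per-frame blur and noise.
      dynamic = k1[..., None] * np.cumsum(plasma)[None, None, :]
      noisy = np.empty_like(dynamic)
      for i in range(frames.size):
          blurred = gaussian_filter(dynamic[..., i], sigma=1.5)   # PSF blur
          noisy[..., i] = rng.poisson(blurred * 200.0) / 200.0    # counts -> activity
      print(noisy.shape)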

  18. Projections of rising heat stress over the western Maritime Continent from dynamically downscaled climate simulations

    Science.gov (United States)

    Im, Eun-Soon; Kang, Suchul; Eltahir, Elfatih A. B.

    2018-06-01

    This study assesses the future changes in heat stress in response to different emission scenarios over the western Maritime Continent. To better resolve region-specific changes and to enhance the performance in simulating extreme events, the MIT Regional Climate Model with a 12-km horizontal resolution is used for the dynamical downscaling of three carefully selected CMIP5 global projections forced by two Representative Concentration Pathway (RCP4.5 and RCP8.5) scenarios. Daily maximum wet-bulb temperature (TWmax), which includes the effect of humidity, is examined to describe heat stress as regulated by future changes in temperature and humidity. An ensemble of projections reveals a robust pattern in which a large increase in temperature is accompanied by a reduction in relative humidity but a significant increase in wet-bulb temperature. This increase in TWmax is relatively smaller over flat and coastal regions than over mountainous regions. However, the flat and coastal regions, characterized by a warm and humid present-day climate, will be at risk even under a modest increase in TWmax. The regional extent exposed to higher TWmax and the number of days on which TWmax exceeds its threshold value are projected to be much higher in the RCP8.5 scenario than in the RCP4.5 scenario, highlighting the importance of controlling greenhouse gas emissions to reduce the adverse impacts on human health and heat-related mortality.
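
    Wet-bulb temperature itself can be approximated from dry-bulb temperature and relative humidity; one widely used empirical fit is Stull (2011), sketched below (this is not necessarily the formulation used in the study):

      import math

      def wet_bulb_stull(t_c, rh_pct):
          """Stull (2011) empirical wet-bulb temperature (deg C), valid
          roughly for RH 5-99% and T -20..50 deg C at standard pressure."""
          return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
                  + math.atan(t_c + rh_pct) - math.atan(rh_pct - 1.676331)
                  + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
                  - 4.686035)

      # Warm, humid coastal conditions: TW stays high even at moderate T.
      print(round(wet_bulb_stull(32.0, 85.0), 1))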

  19. Simulation technology used for risk assessment in a deep exploration project in China

    Science.gov (United States)

    jiao, J.; Huang, D.; Liu, J.

    2013-12-01

    Deep exploration has been carried out in China for five years, employing various heavy-duty instruments and equipment for gravity, magnetic, seismic, and electromagnetic prospecting, as well as an ultra-deep drilling rig for obtaining deep samples. Deep exploration is a large and complex systems-engineering effort that crosses multiple disciplines and requires great investment. Advanced technical means are therefore needed for the verification, appraisal, and optimization of geophysical prospecting equipment development. To reduce the risks of application and exploration, efficient and reliable management concepts and skills must be strengthened, in order to consolidate the management measures and workflows that benefit this ambitious project. Evidence, prediction, evaluation, and related decision strategies must therefore be considered simultaneously to meet practical scientific requirements, technical limits, and future extensions. Simulation is then proposed as a tool for carrying out dynamic tests on actual or imagined systems. In practice, simulation must be combined with the instruments and equipment to accomplish R&D tasks. In this paper, simulation is introduced into the R&D process for heavy-duty equipment and high-end engineering project technology. Based on information recently provided by a drilling group, a digital model is constructed by combining geographical data, 3D visualization, database management, and virtual reality technologies. The result is an R&D strategy in which data processing, instrument application, expected results and uncertainties, and even the operational workflow environment are simulated systematically and simultaneously, in order to obtain an optimal outcome as well as an equipment updating strategy. The simulation technology is able to adjust, verify, appraise and optimize the primary plan due to changing in

  1. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal

    Directory of Open Access Journals (Sweden)

    Scaramuzzo RT

    2013-08-01

    Rosa T Scaramuzzo,1,2 Massimiliano Ciantelli,1 Ilaria Baldoli,3 Lisa Bellanti,3 Marzia Gentile,1 Francesca Cecchi,3 Emilio Sigali,1 Selene Tognarelli,3 Paolo Ghirri,1–4 Stefano Mazzoleni,3 Arianna Menciassi,3 Armando Cuttano,1 Antonio Boldrini,1–4 Cecilia Laschi,3 Paolo Dario3 1Centro di Formazione e Simulazione Neonatale "NINA," UO Neonatologia, Azienda Ospedaliera Universitaria Pisana, Pisa, Italy; 2Istituto di Scienze della Vita, 3The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy; 4Università di Pisa, Pisa, Italy Abstract: Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still widely reported. Therefore, continuing medical education is mandatory to correctly manage devices for assistance. Commercially available breathing function simulators rarely reproduce the anatomical and physiological realities faithfully. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three different phases: (1) a review study on respiratory physiology and pathophysiology and on already available single- and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation. Keywords: simulation, lung, newborn, continuous medical education, respiratory system

  2. Three dimensional numeric quench simulation of Super-FRS dipole test coil for FAIR project

    International Nuclear Information System (INIS)

    Wu Wei; Ma Lizhen; He Yuan; Yuan Ping

    2013-01-01

    The prototype of the superferric dipoles for the Super-FRS of the Facility for Antiproton and Ion Research (FAIR) project was designed, fabricated, and tested in China. To investigate the performance of the superconducting coil, a so-called test coil was fabricated and tested in advance. A 3D model based on ANSYS and OPERA 3D was developed in parallel, not only to check whether the design matches the numerical simulation, but also to study the quench phenomena in more detail. The model simplifies the epoxy-impregnated coil into an anisotropic continuum medium. The simulation combines ANSYS solver routines for nonlinear transient thermal analysis, OPERA 3D for magnetic field evaluation, and the ANSYS script language for calculations of Joule heat and the differential equations of the protection circuits. The time evolution of temperature, voltage, and current decay, and the quench propagation during the quench process, were analyzed and illustrated. Finally, the test results of the test coil were presented and compared with the simulation results. (authors)
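
    The current decay during a quench can be caricatured by a lumped-parameter circuit: the coil inductance discharging into a dump resistor plus a growing normal-zone resistance. All parameter values below are hypothetical except the 278 A operating current quoted above; this is a toy model, not the coupled ANSYS/OPERA analysis:

      L = 2.0          # coil inductance, H (hypothetical)
      R_DUMP = 0.5     # protection dump resistor, ohm (hypothetical)
      I, t, dt = 278.0, 0.0, 1e-3    # start from the 278 A operating current

      while I > 1.0:
          r_quench = 0.02 * min(t / 0.1, 1.0)      # toy growing normal-zone resistance
          I += -I * (R_DUMP + r_quench) / L * dt   # Euler step of L dI/dt = -I R(t)
          t += dt

      print(f"current decays below 1 A after {t:.2f} s")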

  3. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    Science.gov (United States)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow, subvertical, tabular array, which usually traverse the entire reservoir vertically and extend for several hundreds of meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox that enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export the results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual porosity simulation, or used for field development and well planning.

  4. NPV risk simulation of an open pit gold mine project under the O'Hara cost model by using GAs

    Institute of Scientific and Technical Information of China (English)

    Franco-Sepulveda Giovanni; Campuzano Carlos; Pineda Cindy

    2017-01-01

    This paper analyzes an open pit gold mine project based on the O'Hara cost model. Hypothetical data are proposed based on different authors who have studied open pit gold projects, and variations are proposed according to the probability distributions associated with key variables affecting the NPV, such as production level, ore grade, and ore price, in order to carry out what-if analysis for a gold open pit mine project processing 3000 metric tons of ore per day. Two case scenarios were analyzed to simulate the NPV: one where only low-certainty data are available, and another where the available information is of high certainty. Results are reported from genetic algorithm metaheuristic simulations, which essentially combine Monte Carlo simulations provided by the Palisade Risk software, the O'Hara cost model, net smelter return, and the financial analysis tools offered by Excel, in order to determine to which variables of the project the NPV is most sensitive.
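
    The Monte Carlo core of such an NPV risk analysis can be sketched as follows; the distributions and cost figures are hypothetical, and plain random sampling stands in for the paper's genetic-algorithm-driven workflow:

      import numpy as np

      rng = np.random.default_rng(42)
      n, years, rate = 10_000, 10, 0.10

      # Uncertain inputs (hypothetical distributions, not the paper's data).
      grade = rng.normal(1.2, 0.2, n).clip(min=0.3)   # ore grade, g/t
      price = rng.normal(1400.0, 150.0, n)            # gold price, USD/oz
      tonnes_per_year = 3000 * 350                    # 3000 t/d, 350 d/yr
      recovery, capex, opex_per_t = 0.90, 1.0e8, 25.0

      ounces = tonnes_per_year * grade / 31.1035 * recovery
      cash_flow = ounces * price - tonnes_per_year * opex_per_t   # per year, USD
      npv = cash_flow * ((1 + rate) ** -np.arange(1, years + 1)).sum() - capex

      print(f"mean NPV {npv.mean()/1e6:.0f} MUSD, P(NPV < 0) = {(npv < 0).mean():.2%}")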

  5. DIRT: The Dust InfraRed Toolbox

    Science.gov (United States)

    Pound, M. W.; Wolfire, M. G.; Mundy, L. G.; Teuben, P. J.; Lord, S.

    We present DIRT, a Java applet geared toward modeling a variety of processes in envelopes of young and evolved stars. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. The computing cluster for the database is described in the accompanying paper by Teuben et al. (2000). A typical user query will return about 50-100 models, which the user can then interactively filter as a function of 8 model parameters (e.g., extinction, size, flux, luminosity). A flexible, multi-dimensional plotter (Figure 1) allows users to view the models, rotate them, tag specific parameters with color or symbol size, and probe individual model points. For any given model, auxiliary plots such as dust grain properties, radial intensity profiles, and the flux as a function of wavelength and beamsize can be viewed. The user can fit observed data to several models simultaneously and see the results of the fit; the best fit is automatically selected for plotting. The URL for this project is http://dustem.astro.umd.edu.

  6. The Expanded Capabilities Of The Cementitious Barriers Partnership Software Toolbox Version 2.0 - 14331

    Energy Technology Data Exchange (ETDEWEB)

    Burns, Heather; Flach, Greg; Smith, Frank; Langton, Christine; Brown, Kevin; Kosson, David; Samson, Eric; Mallick, Pramod

    2014-01-10

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The CBP Software Toolbox – “Version 1.0” was released early in FY2013 and was used to support DOE-EM performance assessments in evaluating various degradation mechanisms that included sulfate attack, carbonation and constituent leaching. The sulfate attack analysis predicted the extent and damage that sulfate ingress will have on concrete vaults over extended time (i.e., > 1000 years) and the carbonation analysis provided concrete degradation predictions from rebar corrosion. The new release “Version 2.0” includes upgraded carbonation software and a new software module to evaluate degradation due to chloride attack. Also included in the newer version are a dual regime module allowing evaluation of contaminant release in two regimes – both fractured and un-fractured. The integrated software package has also been upgraded with new plotting capabilities and many other features that increase the “user-friendliness” of the package. Experimental work has been generated to provide data to calibrate the models to improve the credibility of the analysis and reduce the uncertainty. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox is and will continue to produce tangible benefits to the working DOE

  7. Evaluation Toolbox: Ex-Ante Impact Assessment and Value Network Analysis for SI

    NARCIS (Netherlands)

    Dhondt, S.; Ven, H. van de; Cressey, P.; Kaderabkova, A.; Luna, Á.; Moghadam Saman, S.; Castro Spila, J.; Ziauberyte, R.; Torre, W. van der; Terstriep, J.

    2016-01-01

    This report contains a toolbox for use with the Ex-Ante Impact Assessment for social innovations as developed in report D7.1. This toolbox proposes a series of convenient and useful tools to apply in an ex-ante assessment of social innovation within SIMPACT's policy areas unemployment,

  8. FIT3D toolbox: multiple view geometry and 3D reconstruction for Matlab

    NARCIS (Netherlands)

    Esteban, I.; Dijk, J.; Groen, F.

    2010-01-01

    FIT3D is a toolbox built for Matlab that aims at unifying and distributing a set of tools that allow the researcher to obtain a complete 3D model from a set of calibrated images. In this paper we motivate and present the structure of the toolbox in a tutorial- and example-based approach. Given
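
    A representative multiple-view-geometry operation such a toolbox performs is linear (DLT) triangulation of a 3D point from two calibrated views; the sketch below is an independent numpy illustration with hypothetical camera matrices, not FIT3D's Matlab code:

      import numpy as np

      def triangulate(P1, P2, x1, x2):
          """Linear DLT triangulation of one 3D point from two views.
          P1, P2: 3x4 camera projection matrices; x1, x2: pixel coords."""
          A = np.vstack([
              x1[0] * P1[2] - P1[0],
              x1[1] * P1[2] - P1[1],
              x2[0] * P2[2] - P2[0],
              x2[1] * P2[2] - P2[1],
          ])
          _, _, vt = np.linalg.svd(A)
          X = vt[-1]
          return X[:3] / X[3]    # dehomogenize

      # Two hypothetical calibrated cameras: identity and a 1-unit baseline.
      P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
      P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
      X_true = np.array([0.3, -0.2, 4.0, 1.0])
      x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
      x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
      print(triangulate(P1, P2, x1, x2))   # recovers ~[0.3, -0.2, 4.0]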

  9. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial databases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  10. Simulation Study of Structure and Properties of Plasma Liners for the PLX- α Project

    Science.gov (United States)

    Samulyak, Roman; Shih, Wen; Hsu, Scott; PLX-Alpha Team

    2017-10-01

    Detailed numerical studies of the propagation and merger of high-Mach-number plasma jets and the formation and implosion of plasma liners have been performed using the FronTier code in support of the Plasma Liner Experiment-ALPHA (PLX-α) project. Physics models include radiation, physical diffusion, plasma EOS models, and an anisotropic diffusion model that mimics deviations from fully collisional hydrodynamics in the outer layers of plasma jets. The detailed structure and non-uniformity of plasma liners due to primary and secondary shock waves have been studied, as well as averaged quantities such as ram pressure and Mach number. Synthetic data from the simulations have been compared with available experimental data from a multi-chord interferometer and from survey and high-resolution spectrometers. Numerical studies of the sensitivity of liner properties to experimental errors in the initial masses of the jets and in the synchronization of the plasma gun valves have also been performed. Supported by the ARPA-E ALPHA program.

  11. Scientific computing and algorithms in industrial simulations projects and products of Fraunhofer SCAI

    CERN Document Server

    Schüller, Anton; Schweitzer, Marc

    2017-01-01

    The contributions gathered here provide an overview of current research projects and selected software products of the Fraunhofer Institute for Algorithms and Scientific Computing SCAI. They show the wide range of challenges that scientific computing currently faces, the solutions it offers, and its important role in developing applications for industry. Given the exciting field of applied collaborative research and development it discusses, the book will appeal to scientists, practitioners, and students alike. The Fraunhofer Institute for Algorithms and Scientific Computing SCAI combines excellent research and application-oriented development to provide added value for our partners. SCAI develops numerical techniques, parallel algorithms and specialized software tools to support and optimize industrial simulations. Moreover, it implements custom software solutions for production and logistics, and offers calculations on high-performance computers. Its services and products are based on state-of-the-art metho...

  12. SPATIAL DATA MINING TOOLBOX FOR MAPPING SUITABILITY OF LANDFILL SITES USING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    S. K. M. Abujayyab

    2016-09-01

    Mapping the suitability of landfill sites is a complex, multidisciplinary task. The purpose of this research is to create an ArcGIS spatial data mining toolbox for mapping the suitability of landfill sites at a regional scale using neural networks. The toolbox is constructed from six sub-tools to prepare, train, and process data, and its use is straightforward. A multilayer perceptron (MLP) neural network with a backpropagation learning algorithm is used. The dataset is mined from the northern states of Malaysia. A total of 14 criteria are utilized to build the training dataset. The toolbox provides a platform for decision makers to implement neural networks for mapping the suitability of landfill sites in the ArcGIS environment. The results show the ability of the toolbox to produce suitability maps for landfill sites.
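
    The underlying step, training an MLP with backpropagation on a table of siting criteria to predict suitability, can be sketched with scikit-learn; the 14 features and labels below are synthetic stand-ins for the Malaysian dataset, and the network size is an assumption:

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      X = rng.normal(size=(500, 14))               # 14 siting criteria (synthetic)
      y = (X[:, :3].sum(axis=1) > 0).astype(int)   # toy suitability label

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
      clf.fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")

      # Predicted probabilities for raster cells would then be mapped back
      # onto the grid to produce the suitability map.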

  13. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(SzGeCERN)377840; Fressard-Batraneanu, Silvia Maria; Ballestrero, Sergio; Contescu, Alexandru Cristian; Fazio, Daniel; Di Girolamo, Alessandro; Lee, Christopher Jon; Pozo Astigarraga, Mikel Eukeni; Scannicchio, Diana; Sedov, Alexey; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2015-01-01

    During the LHC Long Shutdown 1 period (LS1), which started in 2013, the Simulation at Point1 (Sim@P1) Project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 virtual machines (VMs) provided with 8 CPU cores each, for a total of up to 22000 parallel running jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 Project, operating a large-scale OpenStack-based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 50 million CPU-hours and it generated more than 1.7 billion Monte Carlo events to various analysis communities. The design aspects a...

  14. Design, Results, Evolution and Status of the ATLAS simulation in Point1 project.

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Brasolin, Franco; Contescu, Alexandru Cristian; Fazio, Daniel; Di Girolamo, Alessandro; Lee, Christopher Jon; Pozo Astigarraga, Mikel Eukeni; Scannicchio, Diana; Sedov, Alexey; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2015-01-01

    During the LHC long shutdown period (LS1), which started in 2013, the Simulation in Point1 (Sim@P1) project takes advantage in an opportunistic way of the trigger and data acquisition (TDAQ) farm of the ATLAS experiment. The farm provides more than 1500 computer nodes, which are particularly suitable for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2500 virtual machines (VMs) provided with 8 CPU cores each, for a total of up to 20000 parallel running jobs. This contribution gives a thorough review of the design, the results and the evolution of the Sim@P1 project, operating a large-scale OpenStack-based virtualized platform deployed on top of the ATLAS TDAQ farm computing resources. During LS1, Sim@P1 was one of the most productive GRID sites: it delivered more than 50 million CPU-hours and it generated more than 1.7 billion Monte Carlo events to various analysis communities within the ATLAS collaboration. The particular design ...

  15. Effects of Simulated Forest Cover Change on Projected Climate Change – a Case Study of Hungary

    Directory of Open Access Journals (Sweden)

    GÁLOS, Borbála

    2011-01-01

    Climatic effects of forest cover change have been investigated for Hungary applying the regional climate model REMO. For the end of the 21st century (2071–2100), case studies have been analyzed assuming maximal afforestation (forests covering all vegetated area) and complete deforestation (forests replaced by grasslands) of the country. For 2021–2025, the climatic influence of the potential afforestation based on a detailed national survey has been assessed. The simulation results indicate that maximal afforestation may reduce the projected climate change through cooler and moister conditions for the entire summer period. The magnitude of the simulated climate change mitigating effect of the forest cover increase differs among regions. The smallest climatic benefit was calculated in the southwestern region, in the area with the potentially strongest climate change. The strongest effects of maximal afforestation are expected in the northeastern part of the country. Here, half of the projected precipitation decrease could be relieved and the probability of summer droughts could be reduced. The potential afforestation has a very slight feedback on the regional climate compared to the maximal afforestation scenario.

  16. QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.

    Science.gov (United States)

    Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M

    2009-09-30

    QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

  17. Container cargo simulation modeling for measuring impacts of infrastructure investment projects in Pearl River Delta

    Science.gov (United States)

    Li, Jia-Qi; Shibasaki, Ryuichi; Li, Bo-Wei

    2010-03-01

    In the Pearl River Delta (PRD), there is severe competition between container ports, particularly those in Hong Kong, Shenzhen, and Guangzhou, for collecting international maritime container cargo. In addition, the second phase of the Nansha terminal in Guangzhou’s port and the first phase of the Da Chang Bay container terminal in Shenzhen opened last year. Under these circumstances, there is an increasing need to quantitatively measure the impact these infrastructure investments have on regional cargo flows. The analysis should include the effects of container terminal construction, berth deepening, and access road construction. The authors have been developing a model for international cargo simulation (MICS) which can simulate the movement of cargo. The volume of origin-destination (OD) container cargo in the East Asian region was used as an input, in order to evaluate the effects of international freight transportation policies. This paper focuses on the PRD area and, by incorporating a more detailed network, evaluates the impact of several infrastructure investment projects on freight movement.

  18. Collaborative Simulation and Testing of the Superconducting Dipole Prototype Magnet for the FAIR Project

    International Nuclear Information System (INIS)

    Zhu Yinfeng; Zhu Zhe; Wu Weiyue; Xu Houchang

    2012-01-01

    The superconducting dipole prototype magnet of the collector ring for the Facility for Antiproton and Ion Research (FAIR) is an international cooperation project. The collaborative simulation and testing of the developed prototype magnet are presented in this paper. To evaluate the mechanical strength of the coil case during a quench, a 3-dimensional (3D) electromagnetic (EM) model, which includes the air region, coil, and yoke, was developed based on the solid97 magnetic vector element in the ANSYS commercial software. EM analysis was carried out with a peak operating current of 278 A. Then, the solid97 element was converted into the solid185 element, the coupled analysis was switched from electromagnetic to structural, and the finite element model for the coil case and glass-fiber reinforced composite (G10) spacers was established with the ANSYS Parametric Design Language, based on the 3D model from the CATIA V5 software. To simulate the friction characteristics inside the coil case, the conta173 surface-to-surface contact element was employed. The results for the coil case and G10 spacers show that they are safe and have sufficient strength, on the basis of testing in discharge and quench scenarios. (fusion engineering)

  19. Integrated Approach to Drilling Project in Unconventional Reservoir Using Reservoir Simulation

    Science.gov (United States)

    Stopa, Jerzy; Wiśniowski, Rafał; Wojnarowski, Paweł; Janiga, Damian; Skrzypaszek, Krzysztof

    2018-03-01

    Accumulation and flow mechanisms in unconventional reservoirs differ from those in conventional ones. This requires a special approach to field management, with drilling and stimulation treatments as the major factors for further production. An integrated approach to unconventional reservoir production optimization couples the drilling project with full-scale reservoir simulation to determine the best well placement, well length, fracturing treatment design, and spacing between wells. The full-scale reservoir simulation model emulates a part of a Polish shale gas field. The aim of this paper is to establish the influence of technical factors on gas production from a shale gas field. Due to low reservoir permeability, the stimulation treatment should be directed towards maximizing the hydraulic contact. On the basis of production scenarios, 15-stage hydraulic fracturing boosts gas production more than 1.5-fold compared to 8 stages. Due to the possible interference of the wells, it is necessary to determine the distance between the horizontal parts of the well trajectories. In order to determine the well spacing that maximizes the recovery factor of resources in the stimulated zone, a numerical algorithm based on a dynamic model was developed and implemented. Numerical testing and a comparative study show that the most favourable arrangement assumes the minimum allowable distance between the wells. This is related to the volume ratio of the drainage zone to the total volume of the stimulated zone.

  20. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    Science.gov (United States)

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of in total 440 references, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215
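
    A typical simulation run on such a genome-scale model is flux balance analysis: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. A toy three-reaction sketch in Python (not RAVEN's Matlab interface; the network is hypothetical):

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: uptake -> A, A -> B, B -> biomass. Columns are
      # reactions, rows are metabolites A and B; steady state: S @ v = 0.
      S = np.array([[1, -1,  0],
                    [0,  1, -1]])
      bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds (hypothetical)
      c = np.array([0, 0, -1])               # minimize -v3 = maximize biomass

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print("optimal fluxes:", res.x)        # -> [10, 10, 10]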

  1. THE AGORA HIGH-RESOLUTION GALAXY SIMULATIONS COMPARISON PROJECT. II. ISOLATED DISK TEST

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji-hoon [Kavli Institute for Particle Astrophysics and Cosmology, SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Agertz, Oscar [Department of Physics, University of Surrey, Guildford, Surrey, GU2 7XH (United Kingdom); Teyssier, Romain; Feldmann, Robert [Centre for Theoretical Astrophysics and Cosmology, Institute for Computational Science, University of Zurich, Zurich, 8057 (Switzerland); Butler, Michael J. [Max-Planck-Institut für Astronomie, D-69117 Heidelberg (Germany); Ceverino, Daniel [Zentrum für Astronomie der Universität Heidelberg, Institut für Theoretische Astrophysik, D-69120 Heidelberg (Germany); Choi, Jun-Hwan [Department of Astronomy, University of Texas, Austin, TX 78712 (United States); Keller, Ben W. [Department of Physics and Astronomy, McMaster University, Hamilton, ON L8S 4M1 (Canada); Lupi, Alessandro [Institut d’Astrophysique de Paris, Sorbonne Universites, UPMC Univ Paris 6 et CNRS, F-75014 Paris (France); Quinn, Thomas; Wallace, Spencer [Department of Astronomy, University of Washington, Seattle, WA 98195 (United States); Revaz, Yves [Institute of Physics, Laboratoire d’Astrophysique, École Polytechnique Fédérale de Lausanne, CH-1015 Lausanne (Switzerland); Gnedin, Nickolay Y. [Particle Astrophysics Center, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Leitner, Samuel N. [Department of Astronomy, University of Maryland, College Park, MD 20742 (United States); Shen, Sijing [Kavli Institute for Cosmology, University of Cambridge, Cambridge, CB3 0HA (United Kingdom); Smith, Britton D., E-mail: me@jihoonkim.org [Institute for Astronomy, University of Edinburgh, Royal Observatory, Edinburgh EH9 3HJ (United Kingdom); Collaboration: AGORA Collaboration; and others

    2016-12-20

    Using an isolated Milky Way-mass galaxy simulation, we compare results from nine state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt–Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly formed stellar clump mass functions show more significant variation (difference by up to a factor of ∼3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low-density region, and between more diffusive and less diffusive schemes in the high-density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Our experiment reassures us that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  2. Greenland ice sheet surface mass balance: evaluating simulations and making projections with regional climate models

    Directory of Open Access Journals (Sweden)

    J. G. L. Rae

    2012-11-01

    Four high-resolution regional climate models (RCMs) have been set up for the area of Greenland, with the aim of providing future projections of Greenland ice sheet surface mass balance (SMB), and its contribution to sea level rise, with greater accuracy than is possible from coarser-resolution general circulation models (GCMs). This is the first time an intercomparison has been carried out of RCM results for Greenland climate and SMB. Output from RCM simulations for the recent past with the four RCMs is evaluated against available observations. The evaluation highlights the importance of using a detailed snow physics scheme, especially regarding the representations of albedo and meltwater refreezing. Simulations with three of the RCMs for the 21st century using SRES scenario A1B from two GCMs produce trends of between −5.5 and −1.1 Gt yr⁻² in SMB (equivalent to +0.015 and +0.003 mm sea level equivalent yr⁻²), with trends of smaller magnitude for scenario E1, in which emissions are mitigated. Results from one of the RCMs whose present-day simulation is most realistic indicate that an annual mean near-surface air temperature increase over Greenland of ~2°C would be required for the mass loss to increase such that it exceeds accumulation, thereby causing the SMB to become negative, which has been suggested as a threshold beyond which the ice sheet would eventually be eliminated.

  3. A part toolbox to tune genetic expression in Bacillus subtilis

    Science.gov (United States)

    Guiziou, Sarah; Sauveplane, Vincent; Chang, Hung-Ju; Clerté, Caroline; Declerck, Nathalie; Jules, Matthieu; Bonnet, Jerome

    2016-01-01

    Libraries of well-characterised components regulating gene expression levels are essential to many synthetic biology applications. While widely available for the Gram-negative model bacterium Escherichia coli, such libraries are lacking for the Gram-positive model Bacillus subtilis, a key organism for basic research and biotechnological applications. Here, we engineered a genetic toolbox comprising libraries of promoters, Ribosome Binding Sites (RBS), and protein degradation tags to precisely tune gene expression in B. subtilis. We first designed a modular Expression Operating Unit (EOU) facilitating parts assembly and modifications and providing a standard genetic context for gene circuits implementation. We then selected native, constitutive promoters of B. subtilis and efficient RBS sequences from which we engineered three promoters and three RBS sequence libraries exhibiting ∼14 000-fold dynamic range in gene expression levels. We also designed a collection of SsrA proteolysis tags of variable strength. Finally, by using fluorescence fluctuation methods coupled with two-photon microscopy, we quantified the absolute concentration of GFP in a subset of strains from the library. Our complete promoters and RBS sequences library comprising over 135 constructs enables tuning of GFP concentration over five orders of magnitude, from 0.05 to 700 μM. This toolbox of regulatory components will support many research and engineering applications in B. subtilis. PMID:27402159

  4. Semi-automated Robust Quantification of Lesions (SRQL) Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori L Ito

    2017-05-01

    Quantifying lesions in a reliable manner is fundamental for studying the effects of neuroanatomical changes related to recovery in the post-stroke brain. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This often makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. Thus, we developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.557114) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space. Here, we describe the methods implemented in the toolbox.
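
    Step (1) can be sketched as a simple intensity screen on the lesion mask; the threshold, intensities, and mask below are synthetic stand-ins, not SRQL's actual rule:

      import numpy as np

      rng = np.random.default_rng(3)
      t1 = rng.normal(70, 10, size=(64, 64, 40))   # synthetic T1 intensities
      lesion_mask = np.zeros_like(t1, dtype=bool)
      lesion_mask[20:30, 20:30, 10:20] = True      # hand-drawn lesion (toy)

      # Treat voxels whose intensity falls in the healthy white-matter range
      # as mislabeled and drop them from the mask (threshold illustrative).
      wm_mean, wm_sd = 85.0, 5.0
      healthy_like = np.abs(t1 - wm_mean) < 2 * wm_sd
      corrected = lesion_mask & ~healthy_like
      print(lesion_mask.sum(), "->", corrected.sum(), "lesion voxels")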

  5. Improving Nursing Communication Skills in an Intensive Care Unit Using Simulation and Nursing Crew Resource Management Strategies: An Implementation Project.

    Science.gov (United States)

    Turkelson, Carman; Aebersold, Michelle; Redman, Richard; Tschannen, Dana

    Effective interprofessional communication is critical to patient safety. This pre-/postimplementation project used a multifaceted educational strategy with high-fidelity simulation to introduce evidence-based communication tools, adapted from Nursing Crew Resource Management, to intensive care unit nurses. Results indicated that participants were satisfied with the education, and their perceptions of interprofessional communication and knowledge improved. Teams (n = 16) that used the communication tools during simulation were more likely to identify the problem, initiate key interventions, and have positive outcomes.

  6. Final Technical Report for Project 'Improving the Simulation of Arctic Clouds in CCSM3 (SGER Award)'

    International Nuclear Information System (INIS)

    Vavrus, Stephen J.

    2008-01-01

    This project has focused on the simulation of Arctic clouds in CCSM3 and how the modeled cloud amount (and climate) can be improved substantially by altering the parameterized low cloud fraction. The new formula, dubbed 'freezedry', alleviates the bias of excessive low clouds during polar winter by reducing the cloud amount under very dry conditions. During winter, freezedry decreases the low cloud amount over the coldest regions in high latitudes by over 50% locally and more than 30% averaged across the Arctic (Fig. 1). The cloud reduction causes an Arctic-wide drop of 15 W m⁻² in surface cloud radiative forcing (CRF) during winter and about a 50% decrease in mean annual Arctic CRF. Consequently, wintertime surface temperatures fall by up to 4 K on land and 2-8 K over the Arctic Ocean, thus significantly reducing the model's pronounced warm bias (Fig. 1). While improving the polar climate simulation in CCSM3, freezedry has virtually no influence outside of very cold regions (Fig. 2) or during summer (Fig. 3), which are space and time domains that were not targeted. Furthermore, the simplicity of this parameterization allows it to be readily incorporated into other GCMs, many of which also suffer from excessive wintertime polar cloudiness, based on the results from the CMIP3 archive (Vavrus et al., 2008). Freezedry also affects CCSM3's sensitivity to greenhouse forcing. In a transient-CO2 experiment, the model version with freezedry warms up to 20% less in the North Polar and South Polar regions (1.5 K and 0.5 K smaller warming, respectively) (Fig. 4). Paradoxically, the muted high-latitude response occurs despite a much larger increase in cloud amount with freezedry during non-summer months (when clouds warm the surface), apparently because of the colder modern reference climate. These results of the freezedry parameterization have recently been published (Vavrus and D. Waliser, 2008: An improved parameterization for simulating Arctic cloud amount in the CCSM3

  7. Earth simulator project. Seeking a guideline for the symbiosis between the earth (Gaia) and human beings

    International Nuclear Information System (INIS)

    Tani, Keiji; Yokokawa, Mitsuo

    2000-01-01

    In 1997, the Science and Technology Agency started a project to promote forward-looking studies of changes in the global environment, under which three lines of study have begun: a basic process study, observations, and computer simulation. This report outlines the development of the 'Earth Simulator', an ultra-high-speed computer, as a part of the project. Global warming due to human activities has been progressing on the earth. It is estimated that the mean temperature increase and the sea level rise would reach 2°C and 50 cm, respectively, before 2100 if the present environmental trends continue. There are two unavoidable problems in global-scale environmental simulation: the mesh size of the simulation and errors in the observational data used as initial values. To accurately simulate the behavior of thunderclouds several kilometers in size, which are closely related to weather disasters, it is necessary to raise the resolution by decreasing the mesh size from 20-30 km to about 1 km, which increases the number of mesh points several hundredfold. Compared with the CRAY C90, the representative computer in the climate and weather field, it is thought necessary to increase memory capacity and calculation speed by more than one thousand times. Therefore, a new computer incorporating parallel processing must be constructed, because an effective speed of 5 TFLOPS cannot be achieved with a single processor. (M.N.)

  8. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    Science.gov (United States)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large-scale OpenStack-based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interference with TDAQ operations and it guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from the Folsom to the Icehouse release, including the scalability issues addressed.

  9. Monte Carlo simulations for the space radiation superconducting shield project (SR2S).

    Science.gov (United States)

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCR) and solar particle events (SPE). Exposure to space radiation could lead to both acute and late effects in crew members, and no well-defined countermeasures exist today. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; other solutions, such as active shielding employing superconducting magnetic fields, are therefore under study. In the framework of the EU FP7 SR2S Project (Space Radiation Superconducting Shield), a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4-based tool GRAS. The spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, in which only the magnetic field around the crew was modeled. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.

  10. Consequences of simulating terrestrial N dynamics for projecting future terrestrial C storage

    Science.gov (United States)

    Zaehle, S.; Friend, A. D.; Friedlingstein, P.

    2009-04-01

    We present results of a new land surface model, O-CN, which includes a process-based coupling between the terrestrial cycling of energy, water, carbon, and nitrogen. The model represents the controls of terrestrial nitrogen (N) cycling on carbon (C) pools and fluxes through photosynthesis, respiration, changes in allocation patterns, and soil organic matter decomposition, and explicitly accounts for N leaching and gaseous losses. O-CN has been shown to give realistic results in comparison to observations at a wide range of scales, including in situ flux measurements, productivity databases, and atmospheric CO2 concentration data. Notably, O-CN simulates realistic responses of net primary productivity, foliage area, and foliage N content to elevated atmospheric [CO2], as evidenced at free air carbon dioxide enrichment (FACE) sites (Duke, Oak Ridge). We re-examine earlier model-based assessments of the terrestrial C sequestration potential using a global transient O-CN simulation driven by increases in atmospheric [CO2], N deposition, and climatic changes over the 21st century. We find that accounting for terrestrial N cycling roughly halves the potential to store C in response to increases in atmospheric CO2 concentrations, mainly due to a reduction of net C uptake in temperate and boreal forests. Nitrogen deposition partially alleviates the effect of N limitation but is far from sufficient to compensate for it completely. These findings underline the importance of an accurate representation of nutrient limitations in future projections of terrestrial net CO2 exchange and therefore in land-climate feedback studies.

  11. A simulation of 'schedule-cost' progress monitoring system in nuclear power project management

    International Nuclear Information System (INIS)

    Song Haitao; Huang Zhongping; Zhang Zemin; Wang Zikai

    2010-01-01

    The objective of project management is to find the optimal balance between progress and cost according to the project requirements. Traditional methods manage progress and cost separately. However, domestic and international experience indicates that the interactions between these two factors are crucial during project implementation, so modern project managers have to maintain a joint 'progress-cost' control framework. In this paper, such a model is applied to a sub-project of a nuclear power project using Simulink, helping to identify and correct deviations of the project. Earned Value Management is used by the project manager to quantify the cost of the project and the progress of implementation; the planned value, actual cost, and earned value are the three key parameters used to measure them. The experimental results illustrate that the method gives a more comprehensive performance evaluation of the project. (authors)
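
    To make the quantities concrete: given the planned value (PV), earned value (EV), and actual cost (AC) described above, the standard Earned Value Management indicators follow from simple arithmetic. The Python sketch below uses invented figures for illustration only; the paper's Simulink model is not reproduced here.

```python
# Minimal sketch of Earned Value Management (EVM) indicators.
# The input figures are invented for illustration only.

def evm_indicators(pv: float, ev: float, ac: float) -> dict:
    """Compute standard EVM metrics from planned value (PV),
    earned value (EV), and actual cost (AC)."""
    return {
        "schedule_variance": ev - pv,  # SV > 0: ahead of schedule
        "cost_variance": ev - ac,      # CV > 0: under budget
        "spi": ev / pv,                # schedule performance index
        "cpi": ev / ac,                # cost performance index
    }

if __name__ == "__main__":
    # e.g. a work package budgeted at 100 units, 80 units of work done,
    # 90 units actually spent
    print(evm_indicators(pv=100.0, ev=80.0, ac=90.0))
```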

  12. Interannual Tropical Rainfall Variability in General Circulation Model Simulations Associated with the Atmospheric Model Intercomparison Project.

    Science.gov (United States)

    Sperber, K. R.; Palmer, T. N.

    1996-11-01

    The interannual variability of rainfall over the Indian subcontinent, the African Sahel, and the Nordeste region of Brazil has been evaluated in 32 models for the period 1979-88 as part of the Atmospheric Model Intercomparison Project (AMIP). The interannual variations of Nordeste rainfall are the most readily captured, owing to the intimate link with Pacific and Atlantic sea surface temperatures. The precipitation variations over India and the Sahel are less well simulated. Additionally, an Indian monsoon wind shear index was calculated for each model. Evaluation of the interannual variability of a wind shear index over the summer monsoon region indicates that the models exhibit greater fidelity in capturing the large-scale dynamic fluctuations than the regional-scale rainfall variations. A rainfall/SST teleconnection quality control was used to objectively stratify model performance. Skill scores improved for those models that qualitatively simulated the observed rainfall/El Niño-Southern Oscillation SST correlation pattern. This subset of models also had a rainfall climatology that was in better agreement with observations, indicating a link between systematic model error and the ability to simulate interannual variations. A suite of six European Centre for Medium-Range Weather Forecasts (ECMWF) AMIP runs (differing only in their initial conditions) has also been examined. As observed, all-India rainfall was enhanced in 1988 relative to 1987 in each of these realizations. All-India rainfall variability during other years showed little or no predictability, possibly due to internal chaotic dynamics associated with intraseasonal monsoon fluctuations and/or unpredictable land surface process interactions. The interannual variations of Nordeste rainfall were best represented. The State University of New York at Albany/National Center for Atmospheric Research Genesis model was run in five initial condition realizations. In this model, the Nordeste rainfall…

  13. Interval Coded Scoring: a toolbox for interpretable scoring systems

    Directory of Open Access Journals (Sweden)

    Lieven Billiet

    2018-04-01

    Over the last decades, clinical decision support systems have been gaining importance. They help clinicians make effective use of an overload of available information to obtain correct diagnoses and appropriate treatments. However, their power often comes at the cost of a black-box model which cannot be interpreted easily. This interpretability is of paramount importance in a medical setting with regard to trust and (legal) responsibility. In contrast, existing medical scoring systems are easy to understand and use, but they are often a simplified rule-of-thumb summary of previous medical experience rather than a well-founded system based on available data. Interval Coded Scoring (ICS) connects these two approaches, exploiting the power of sparse optimization to derive scoring systems from training data. The presented toolbox interface makes this theory easily applicable to both small and large datasets. It contains two possible problem formulations, based on linear programming or the elastic net. Both allow the construction of a model for a binary classification problem and establish risk profiles that can be used for future diagnosis. All of this requires only a few lines of code. ICS differs from standard machine learning through its model, which consists of interpretable main effects and interactions. Furthermore, insertion of expert knowledge is possible because the training can be semi-automatic. This allows end users to make a trade-off between complexity and performance based on cross-validation results and expert knowledge. Additionally, the toolbox offers an accessible way to assess classification performance via accuracy and the ROC curve, whereas the calibration of the risk profile can be evaluated via a calibration curve. Finally, the colour-coded model visualization has particular appeal if one wants to apply ICS manually to new observations, as well as for validation by experts in the specific application domains. The validity and applicability…
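
    The ICS toolbox itself is not reproduced here, but the general recipe it builds on (fit a sparse linear model, then turn the surviving weights into small integer points) can be sketched as follows. The synthetic dataset, penalty settings, and the 3-point scale are illustrative assumptions, not the toolbox's actual formulation.

```python
# Illustrative sketch (not the ICS toolbox itself): derive a crude integer
# scoring system from a sparse, elastic-net-penalized linear model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=8, n_informative=3,
                           random_state=0)

# Elastic-net-penalized logistic regression yields a sparse weight vector.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.7, C=0.5, max_iter=5000).fit(X, y)

# Rescale the surviving weights and round them to small integer points:
# the step that turns a black-box model into a score card.
w = clf.coef_.ravel()
scale = np.abs(w).max() or 1.0
points = np.round(3 * w / scale).astype(int)

print("points per feature:", points)
print("risk score of first subject:", int(X[0] @ points))
```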

  14. Future Projections of Precipitation Characteristics in East Asia Simulated by the MRI CGCM2

    Institute of Scientific and Technical Information of China (English)

    Akio KITOH; Masahiro HOSAKA; Yukimasa ADACHI; Kenji KAMIGUCHI

    2005-01-01

    Projected changes in precipitation characteristics around the mid-21st century and the end of the century are analyzed using the daily precipitation output of the 3-member ensemble Meteorological Research Institute global ocean-atmosphere coupled general circulation model (MRI-CGCM2) simulations under the Special Report on Emissions Scenarios (SRES) A2 and B2 scenarios. It is found that both the frequency and intensity of precipitation increase in about 40% of the globe, while both decrease in about 20% of the globe. These numbers differ by only a few percent from decade to decade of the 21st century and between the A2 and B2 scenarios. Over the rest of the globe (about one third), the precipitation frequency decreases but its intensity increases, suggesting a shift of the precipitation distribution toward more intense events under global warming. South China is such a region, where the summertime wet-day frequency decreases but the precipitation intensity increases. This is related to increased atmospheric moisture content due to global warming and an intensified and more westward-extended North Pacific subtropical anticyclone, which may be related to an El Niño-like mean sea surface temperature change. On the other hand, a decrease in summer precipitation is noted in North China, further strengthening the south-to-north precipitation contrast in the future.

  15. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    International Nuclear Information System (INIS)

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-01-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science, and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, to calculate the electromagnetic modes and wake fields of cavities, to model the cooling induced by comoving beams, and to simulate the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2, these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  16. Ocean response to volcanic eruptions in Coupled Model Intercomparison Project 5 simulations

    KAUST Repository

    Ding, Yanni

    2014-09-01

    We examine the oceanic impact of large tropical volcanic eruptions as they appear in ensembles of historical simulations from eight Coupled Model Intercomparison Project Phase 5 models. These models show a response that includes a lowering of global average sea surface temperature by 0.1-0.3 K, comparable to the observations. They show enhancement of Arctic ice cover in the years following major volcanic eruptions, with long-lived temperature anomalies extending to the mid-depth and deep ocean on decadal to centennial timescales. Regional ocean responses vary, although there is some consistent hemispheric asymmetry associated with the hemisphere in which the eruption occurs. Temperature decreases and salinity increases contribute to an increase in the density of surface water and an enhancement of the overturning circulation of the North Atlantic Ocean following these eruptions. The strength of this overturning increase varies considerably from model to model and is correlated with the background variability of overturning in each model. Any cause/effect relationship between eruptions and the phase of El Niño is weak.

  17. Authoring Tools for Simulation-Based CBT--an Interim Project Report. Occasional Paper InTER/13/89.

    Science.gov (United States)

    Lewis, R., Ed.; Mace, Terry, Ed.

    This paper describes a project which aims to create an authoring environment for simulation-based training in order to illustrate the features necessary for: (1) an environment which can be used by teachers/trainers with minimal technical skills; (2) rapid prototyping by all courseware producers; and (3) tools for model building. The paper begins with…

  18. An exploration of the option space in student design projects for uncertainty and sensitivity analysis with performance simulation

    NARCIS (Netherlands)

    Struck, C.; Wilde, de P.J.C.J.; Hopfe, C.J.; Hensen, J.L.M.

    2008-01-01

    This paper describes research conducted to gather empirical evidence on the extent, character, and content of the option space in building design projects, from the perspective of a climate engineer using building performance simulation for concept evaluation. The goal is to support uncertainty analysis…

  19. Combining integrated river modelling and agent based social simulation for river management; The case study of the Grensmaas project

    NARCIS (Netherlands)

    Valkering, P.; Krywkow, Jorg; Rotmans, J.; van der Veen, A.; Douben, N.; van Os, A.G.

    2003-01-01

    In this paper we present a coupled Integrated River Model – Agent Based Social Simulation model (IRM-ABSS) for river management. The models represent the case of the ongoing river engineering project "Grensmaas". In the ABSS model, stakeholders are represented as computer agents negotiating a river…

  20. SRV: an open-source toolbox to accelerate the recovery of metabolic biomarkers and correlations from metabolic phenotyping datasets.

    Science.gov (United States)

    Navratil, Vincent; Pontoizeau, Clément; Billoir, Elise; Blaise, Benjamin J

    2013-05-15

    Supervised multivariate statistical analyses are often required to analyze the high-density spectral information in metabolic datasets acquired from complex mixtures in metabolic phenotyping studies. Here we present an implementation of the SRV (Statistical Recoupling of Variables) algorithm as an open-source MATLAB and GNU Octave toolbox. SRV identifies similarity between consecutive variables resulting from high-resolution bucketing; similar variables are gathered to restore the spectral dependency within the datasets and identify metabolic NMR signals. The correlation and significance of these new NMR variables for a given effect under study can then be measured and represented on a loading plot, allowing visual and efficient identification of candidate biomarkers. Correlations between these candidate biomarkers can then be visualized on a two-dimensional pseudospectrum representing a correlation map, helping to understand the modifications of the underlying metabolic network. The SRV toolbox is encoded in MATLAB R2008A (Mathworks, Natick, MA) and in GNU Octave. It is available free of charge at http://www.prabi.fr/redmine/projects/srv/repository with a tutorial. Contact: benjamin.blaise@chu-lyon.fr or vincent.navratil@univ-lyon1.fr.
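
    A simplified sketch of the recoupling idea (merge consecutive buckets whose intensity profiles are strongly correlated across samples) is shown below. This illustrates the principle only, not the published SRV algorithm; the 0.9 threshold and the toy data are assumptions.

```python
# Simplified illustration of the SRV idea: merge consecutive spectral
# buckets whose profiles are strongly correlated across samples.
# Not the published algorithm; the 0.9 threshold is an assumption.
import numpy as np

def recouple_variables(spectra: np.ndarray, r_min: float = 0.9):
    """spectra: (n_samples, n_buckets) matrix of bucketed intensities.
    Returns (start, stop) index ranges forming recoupled variables."""
    clusters, start = [], 0
    for j in range(1, spectra.shape[1]):
        r = np.corrcoef(spectra[:, j - 1], spectra[:, j])[0, 1]
        if r < r_min:                 # dependency broken: close the cluster
            clusters.append((start, j))
            start = j
    clusters.append((start, spectra.shape[1]))
    return clusters

rng = np.random.default_rng(0)
demo = np.repeat(rng.normal(size=(20, 5)), 4, axis=1)  # blocks of 4 near-identical buckets
demo += 0.01 * rng.normal(size=demo.shape)
print(recouple_variables(demo))   # expected: [(0, 4), (4, 8), ...]
```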

  1. Rangeland Brush Estimation Toolbox (RaBET): An Approach for Evaluating Brush Management Conservation Efforts in Western Grazing Lands

    Science.gov (United States)

    Holifield Collins, C.; Kautz, M. A.; Skirvin, S. M.; Metz, L. J.

    2016-12-01

    There are over 180 million hectares of rangelands and grazed forests in the central and western United States. Due to the loss of perennial grasses and the subsequent increased runoff and erosion that can degrade the system, woody cover species cannot be allowed to proliferate unchecked. The USDA Natural Resources Conservation Service (NRCS) has allocated extensive resources to employ brush management (removal) as a conservation practice to control woody species encroachment. The Rangeland Conservation Effects Assessment Project (CEAP) has been tasked with determining how effective the practice has been; however, land managers lack a cost-effective means to conduct these assessments at the necessary scale. An ArcGIS toolbox for generating large-scale, Landsat-based spatial maps of woody cover on grazing lands in the western United States was developed through a collaboration with NRCS Rangeland-CEAP. The toolbox contains two main components of operation, image generation and temporal analysis, and utilizes simple interfaces requiring minimal user inputs. The image generation tool utilizes geographically specific algorithms, developed by combining moderate-resolution (30-m) Landsat imagery and high-resolution (1-m) National Agricultural Imagery Program (NAIP) aerial photography, to produce woody cover scenes at the Major Land Resource Area (MLRA) scale. The temporal analysis tool can be used on these scenes to assess treatment effectiveness and monitor woody cover reemergence. RaBET provides rangeland managers with an operational, inexpensive decision-support tool to aid in applying brush removal treatments and assessing their effectiveness.

  2. Using an ensemble of climate projections for simulating recent and near-future hydrological change to Lake Vänern in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Olsson, Jonas; Yang, Wei; Graham, L. Phil; Rosberg, Joergen; Andreasson, Johan (Swedish Meteorological and Hydrological Inst., Norrkoeping (Sweden)), e-mail: jonas.olsson@smhi.se

    2011-01-15

    Lake Vänern and the River Göta älv in southern Sweden constitute a large and complex hydrological system that is highly vulnerable to climate change. In this study, an ensemble of 12 regional climate projections is used to simulate the inflow to Lake Vänern with the HBV hydrological model. By using distribution-based scaling of the climate model output, all projections can accurately reproduce the annual cycle of mean monthly inflows for the period 1961-1990 as simulated using HBV with observed temperature and precipitation ('HBVobs'). Significant changes towards higher winter inflow and a reduced spring flood were found when comparing the period 1991-2008 to 1961-1990 in the HBVobs simulations, and the ability of the regional projections to reproduce these changes varied. The main uncertainties in the projections for 1991-2008 were found to originate from the global climate model used, including its initialization, and in one case the emissions scenario, whereas the regional climate model used and its resolution showed a smaller influence. The projections that most accurately reproduce the recent changes suggest that the current trends in winter and spring inflows will continue over the period 2009-2030.
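
    Distribution-based scaling is a quantile-mapping style bias correction: model values are mapped through the observed distribution of a reference period. A minimal empirical version is sketched below; it illustrates the general approach only and is not necessarily the exact variant used in the study.

```python
# Minimal empirical quantile-mapping sketch of distribution-based scaling.
# One common variant of the approach; not necessarily the study's method.
import numpy as np

def quantile_map(model_ref, obs_ref, model_fut):
    """Map model values through the observed distribution of the
    reference period, then apply the same mapping to future values."""
    q = np.linspace(0.0, 1.0, 101)
    mod_q = np.quantile(model_ref, q)
    obs_q = np.quantile(obs_ref, q)
    # For each future value, find where it sits in the reference model
    # distribution and replace it by the corresponding observed quantile.
    return np.interp(model_fut, mod_q, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 2.0, 5000)   # "observed" precipitation (toy data)
mod = rng.gamma(2.0, 3.0, 5000)   # biased model precipitation (toy data)
print(quantile_map(mod, obs, mod[:5]))
```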

  3. Evaluating IPCC AR4 cool-season precipitation simulations and projections for impacts assessment over North America

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, Stephanie A. [The University of Arizona, Department of Geosciences, Tucson, AZ (United States); The Wilderness Society, Anchorage, AK (United States); Russell, Joellen L.; Goodman, Paul J. [The University of Arizona, Department of Geosciences, Tucson, AZ (United States)

    2011-12-15

    General circulation models (GCMs) have demonstrated success in simulating global climate, and they are critical tools for producing regional climate projections consistent with global changes in radiative forcing. GCM output is currently being used in a variety of ways for regional impacts projection. However, more work is required to assess model bias and to evaluate whether assumptions about the independence of model projections and errors are valid. This is particularly important where models do not display offsetting errors. Comparing simulated 300-hPa zonal winds and precipitation for the late 20th century with reanalysis and gridded precipitation data shows statistically significant and physically plausible associations between positive precipitation biases across all models and a marked increase in zonal wind speed around 30°N, as well as distortions in rain-shadow patterns. Over the western United States, GCMs project drier conditions to the south and increasing precipitation to the north. There is a high degree of agreement between models, and many studies have made strong statements about implications for water resources and ecosystem change on that basis. However, since one of the mechanisms driving changes in winter precipitation patterns appears to be associated with a source of error in simulating mean precipitation in the present, greater caution should be used in interpreting impacts related to precipitation projections in this region, and the standard assumptions underlying bias-correction methods should be scrutinized. (orig.)

  4. ESTEEM - managing societal acceptance in new energy projects : a toolbox method for project managers

    NARCIS (Netherlands)

    Raven, R.P.J.M.; Jolivet, E.; Mourik, R.M.; Feenstra, C.F.J.

    2009-01-01

    There is now a large literature dealing with the policy question of public participation in technical choice and technology assessment (TA). Issues such as the mad cow crisis, genetically modified food, and the emerging nanotechnologies have been turned into public problems and have given rise to…

  5. DYNAMIC SYSTEM ANALYSIS WITH pplane8.m (MATLAB® toolbox)

    Directory of Open Access Journals (Sweden)

    Alejandro Regalado-Méndez

    2013-12-01

    In this work, four dynamic systems (physical, chemical, ecological, and economic), each represented by an autonomous system of two ordinary differential equations, were analyzed. The main objective is to show that pplane8.m is an adequate and efficient computational tool. The analysis of the autonomous systems was carried out by characterizing their critical points according to Coughanowr & LeBlanc (2009), using the MATLAB® toolbox pplane8.m. The main results are that pplane8.m (Polking, 2009) can quickly and precisely draw the trajectories of each phase plane, easily computes each critical point, and correctly characterizes each equilibrium point of all the autonomous systems studied. Finally, we can say that pplane8.m is a powerful tool to support the teaching-learning process for engineering students.
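
    For readers without MATLAB, an analogous phase portrait can be produced in a few lines of Python. The sketch below is an illustrative analogue of what pplane8.m draws (here for a damped oscillator, an assumed example system), not part of the toolbox itself.

```python
# Illustrative Python analogue of a pplane8.m phase portrait (not part of
# the MATLAB toolbox): direction field and trajectories of an autonomous
# two-equation system, here a damped oscillator x' = y, y' = -x - 0.5 y.
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp

f = lambda t, s: [s[1], -s[0] - 0.5 * s[1]]

x, y = np.meshgrid(np.linspace(-3, 3, 20), np.linspace(-3, 3, 20))
plt.quiver(x, y, y, -x - 0.5 * y, angles="xy")   # direction field

for x0 in [(-2.5, 2.5), (2.5, -2.5), (1.0, 2.0)]:
    sol = solve_ivp(f, (0.0, 20.0), x0, max_step=0.05)
    plt.plot(sol.y[0], sol.y[1])                 # spiral toward the focus at (0, 0)

plt.xlabel("x"); plt.ylabel("y"); plt.title("Phase plane")
plt.show()
```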

  6. The influence of industrial applications on a control system toolbox

    International Nuclear Information System (INIS)

    Clout, P.

    1992-01-01

    Vsystem is an open, advanced software application toolbox for rapidly creating fast, efficient, and cost-effective control and data-acquisition systems. Vsystem's modular architecture is designed for single computers, networked computers, and workstations running under VAX/VMS or VAX/ELN. At the heart of Vsystem lies Vaccess, a user-extendible real-time database and library of access routines. The application database provides the link to the hardware of the application and can be organized as one database or as separate databases installed in different computers on the network. Vsystem has found application in charged-particle accelerator control, tokamak control, and industrial research, as well as in its more recent industrial applications. This paper describes the broad features of Vsystem and the influence that recent industrial applications have had on the software. (author)

  7. Matlab Stability and Control Toolbox: Trim and Static Stability Module

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.

    2006-01-01

    This paper presents the technical background of the Trim and Static Stability module of the Matlab Stability and Control Toolbox. This module performs a low-fidelity stability and control assessment of an aircraft model for a set of flight-critical conditions. This is attained by determining whether the control authority available for trim is sufficient and whether the static stability characteristics are adequate. These conditions can be selected from a prescribed set or can be specified to meet particular requirements. The prescribed set of conditions includes horizontal flight, take-off rotation, landing flare, steady roll, steady turn, and pull-up/push-over flight, for which several operating conditions can be specified. A mathematical model was developed allowing for six-dimensional trim, adjustable inertial properties, asymmetric vehicle layouts, an arbitrary number of engines, multi-axial thrust vectoring, engine(s)-out conditions, crosswind, and gyroscopic effects.

  8. Peptide chemistry toolbox - Transforming natural peptides into peptide therapeutics.

    Science.gov (United States)

    Erak, Miloš; Bellmann-Sickert, Kathrin; Els-Heindl, Sylvia; Beck-Sickinger, Annette G

    2018-06-01

    The development of solid-phase peptide synthesis has opened up tremendous opportunities for using synthetic peptides in medicinal applications. In recent decades, peptide therapeutics have become an emerging market in the pharmaceutical industry. The need for synthetic strategies to improve peptide properties, such as longer half-life, higher bioavailability, and increased potency and efficiency, is rising accordingly. In this mini-review, we present a toolbox of modifications in peptide chemistry for overcoming the main drawbacks in the transition from natural peptides to peptide therapeutics. Modifications at the level of the peptide backbone, amino acid side chains, and higher-order structures are described. Furthermore, we discuss the future of peptide therapeutics development and its impact on the pharmaceutical market.

  9. Improving image quality in Electrical Impedance Tomography (EIT) using Projection Error Propagation-based Regularization (PEPR) technique: A simulation study

    Directory of Open Access Journals (Sweden)

    Tushar Kanti Bera

    2011-03-01

    A Projection Error Propagation-based Regularization (PEPR) method is proposed to improve reconstructed image quality in Electrical Impedance Tomography (EIT). A projection error is produced by the misfit between the calculated and measured data in the reconstruction process. The variation of the projection error is integrated with the response matrix in each iteration, and the reconstruction is carried out in EIDORS. The PEPR method is studied with simulated boundary data for different inhomogeneity geometries. Simulated results demonstrate that the PEPR technique improves image reconstruction precision in EIDORS and hence can be successfully implemented to increase reconstruction accuracy in EIT. doi:10.5617/jeb.158, J Electr Bioimp, vol. 2, pp. 2-12, 2011
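
    The abstract's central idea, scaling the regularization by the current projection (data-misfit) error, can be illustrated with a schematic regularized Gauss-Newton step. The error-scaled damping form below is an assumption made for illustration; it is not the EIDORS/PEPR implementation, and the Jacobian and forward model are placeholders.

```python
# Schematic regularized Gauss-Newton update in the spirit of PEPR: the
# damping term is scaled by the current projection (data-misfit) error.
# Illustrative only; J and the linear "forward model" are placeholders.
import numpy as np

def pepr_like_step(x, J, v_meas, v_calc, lam0=1e-2):
    residual = v_meas - v_calc                   # projection error
    lam = lam0 * np.linalg.norm(residual) ** 2   # error-scaled damping (assumed form)
    H = J.T @ J + lam * np.eye(J.shape[1])
    return x + np.linalg.solve(H, J.T @ residual)

rng = np.random.default_rng(3)
J = rng.normal(size=(30, 10))
x_true = rng.normal(size=10)
x = np.zeros(10)
x_new = pepr_like_step(x, J, J @ x_true, J @ x)
# One damped step moves the estimate closer to the true image:
print(np.linalg.norm(x_true - x_new) < np.linalg.norm(x_true - x))
```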

  10. SHynergie: Development of a virtual project laboratory for monitoring hydraulic stimulations

    Science.gov (United States)

    Renner, Jörg; Friederich, Wolfgang; Meschke, Günther; Müller, Thomas; Steeb, Holger

    2016-04-01

    Hydraulic stimulations are the primary means of developing subsurface reservoirs with respect to the extent of fluid transport in them. The associated creation or conditioning of a system of hydraulic conduits involves a range of hydraulic and mechanical processes, but chemical reactions, such as dissolution and precipitation, may also affect the stimulation result on time scales as short as hours. In the light of the extent and complexity of these processes, the steering potential available to the operator of a stimulation critically depends on the ability to integrate the maximum amount of site-specific information with profound process understanding and a large spectrum of experience. We report on the development of a virtual project laboratory for monitoring hydraulic stimulations within the project SHynergie (http://www.ruhr-uni-bochum.de/shynergie/). The laboratory is conceived as a preparatory and accompanying instrument rather than a post-processing one, ultimately accessible via a web repository to the persons responsible for a project. The virtual laboratory consists of a database, a toolbox, and a model-building environment. Entries in the database fall into two categories: selected mineral and rock properties taken from the literature, and project-specific entries of any format, which are assigned attributes regarding their use in the stimulation problem at hand. The toolbox is interactive and allows the user to perform calculations of effective properties and simulations of different types (e.g., wave propagation in a reservoir, or a hydraulic test). The model component is also hybrid: the laboratory provides a library of models reflecting a range of scenarios, but also allows the user to develop a site-specific model as the basis for simulations. The laboratory offers the option to use its components following the typical workflow of a stimulation project. The toolbox incorporates simulation…

  11. Comparison of projection skills of deterministic ensemble methods using pseudo-simulation data generated from multivariate Gaussian distribution

    Science.gov (United States)

    Oh, Seok-Geun; Suh, Myoung-Seok

    2017-07-01

    The projection skills of five ensemble methods were analyzed with respect to simulation skill, training period, and number of ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation assuming the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were: equal weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square error and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods generally improved compared with the best member of each category. However, they were significantly affected by the simulation skills of the ensemble members. The weighted ensemble methods showed better projection skills than the non-weighted methods, in particular for the PSD categories having systematic biases and various correlation coefficients. EWA_NBC showed considerably lower projection skills than the other methods, in particular for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skill, it was strongly sensitive to the PSD categories, training periods, and number of members. On the other hand, WEA_Tay and WEA_RAC showed relatively superior skills in both accuracy and reliability in all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
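
    Two of the simpler methods above can be sketched directly. The snippet below implements bias-corrected equal weighting (EWA_WBC) and an inverse-RMSE weighting as a simplified stand-in for WEA_RAC; the weight formulas and toy data are illustrative assumptions, and the paper's exact definitions (e.g., the Taylor score) are not reproduced.

```python
# Illustrative sketch of two ensemble methods: equal weighting with bias
# correction (EWA_WBC) and inverse-RMSE weighting (simplified stand-in
# for WEA_RAC). Weight formulas are assumptions, not the paper's.
import numpy as np

def ewa_wbc(members, truth_train):
    """Remove each member's training-period bias, then average equally.
    members: (n_members, n_time); truth_train: (n_train,) observations."""
    n_train = truth_train.size
    bias = members[:, :n_train].mean(axis=1) - truth_train.mean()
    return (members - bias[:, None]).mean(axis=0)

def wea_rmse(members, truth_train):
    """Weight members by normalized inverse training-period RMSE."""
    n_train = truth_train.size
    rmse = np.sqrt(((members[:, :n_train] - truth_train) ** 2).mean(axis=1))
    w = (1.0 / rmse) / (1.0 / rmse).sum()
    return w @ members

rng = np.random.default_rng(2)
truth = np.sin(np.linspace(0, 6, 120))
members = truth + rng.normal(0.5, 0.3, (5, 120))   # biased, noisy members
print(np.abs(ewa_wbc(members, truth[:60]) - truth).mean())
print(np.abs(wea_rmse(members, truth[:60]) - truth).mean())
```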

  12. Expanding access to rheumatology care: the rheumatology general practice toolbox.

    LENUS (Irish Health Repository)

    Conway, R

    2015-02-01

    Management guidelines for many rheumatic diseases are published in the specialty rheumatology literature but rarely in general medical journals. Musculoskeletal disorders comprise 14% of all consultations in primary care, yet formal post-graduate training in rheumatology is limited or absent for many primary care practitioners. Primary care practitioners can be trained to effectively treat complex diseases and have expressed a preference for interactive educational courses. The Rheumatology General Practice (GP) Toolbox is an intensive one-day course designed to offer primary care practitioners up-to-date information on the latest diagnostic and treatment guidelines for seven common rheumatic diseases. The course structure involves a short lecture on each topic and workshops on arthrocentesis, joint injection, and DXA interpretation. Participants evaluated their knowledge and educational experience before, during, and after the course. Thirty-two primary care practitioners attended, with a median of 13 (IQR 6.5, 20) years' experience in their specialty. The median number of educational symposia attended in the previous 5 years was 10 (IQR 5, 22.5), with a median of 0 (IQR 0, 1) in rheumatology. All respondents agreed that the course format was appropriate. Numerical improvements were demonstrated in participants' confidence in diagnosing and managing all seven common rheumatologic conditions, with statistically significant improvements (p < 0.05) in 11 of the 14 aspects assessed. The Rheumatology Toolbox is an effective educational method for disseminating current knowledge in rheumatology to primary care physicians, and it improved participants' self-assessed competence in the diagnosis and management of common rheumatic diseases.

  13. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides a knowledge portal on the one hand and a system to store their current work on the other. The knowledge portal helps the designer find the most appropriate sites, experts, tools, etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionality to support design engineering work; one such functionality could be to help the designer reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering, and design engineers themselves, to find each other and their work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet reached a commercial level could be linked to by C-DET; in this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of the future functionalities of C-DET such as balanced comprehension.

  14. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects

    Directory of Open Access Journals (Sweden)

    Zhenyu Shi

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com. Keywords: BioCAD, Genetic engineering software, Molecular cloning software, Synthetic biology, Workflow simulation and management

  15. SIROCCO project: 15 advanced instructor desk and 4 simulated control room for 900MW and 1300MW EDF power plant simulators

    International Nuclear Information System (INIS)

    Alphonse, J.; Roth, P.; Sicard, Y.; Rudelli, P.

    2006-01-01

    This presentation describes the fifteen advanced instructor stations and four simulated control rooms delivered to EDF in the frame of the SIROCCO project by the consortium formed by ATOS Origin and CORYS Tess for Électricité de France (EDF). The instructor stations are installed on fifteen replica training simulators located at different sites throughout France for the purpose of improving the job-related training of the EDF PWR nuclear power plant operating teams; this covers all 900 MW and 1300 MW nuclear power plants of EDF. The simulated control rooms are installed on maintenance platforms located at EDF and consortium facilities. The consortium uses them to maintain and upgrade the simulators; EDF uses them to validate the upgrades delivered by the consortium before on-site installation and to perform engineering analyses. This presentation sets out successively: - the major advantages of the generic and configurable connected module concept for flexible and quick adaptation to different simulators; - the innovative functionalities of the advanced Instructor Desk (IS), which make the instructor's tasks of preparation, monitoring, and post-analysis of a training session easier and more homogeneous; - the use of the Simulated Control Room (SCR) for training purposes, but also for maintenance and for design studies for upgrades of existing control rooms

  16. SU-E-T-58: A Novel Monte Carlo Photon Transport Simulation Scheme and Its Application in Cone Beam CT Projection Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Y [UT Southwestern Medical Center, Dallas, TX (United States); Southern Medical University, Guangzhou (China); Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Zhou, L [Southern Medical University, Guangzhou (China)

    2015-06-15

    Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion, and the lack of control over particle trajectories is a main cause of low efficiency in some applications. Taking cone beam CT (CBCT) projection simulation as an example, a significant amount of computation is wasted on transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path starting at the x-ray source; after going through a set of interactions, it ends at the detector. In the proposed scheme, we sample an entire photon path each time. The Metropolis-Hastings algorithm is employed to accept or reject a sampled path based on a calculated acceptance probability, in order to maintain the correct relative probabilities among different paths, which are governed by photon transport physics. We developed a GPU package, gMMC, implementing this new scheme. The performance of gMMC was tested on a sample problem of CBCT projection simulation for a homogeneous object, and the results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered-photon signals in gMMC agreed with those from gMCDRR to within a relative difference of 3%. It took 3.1 h for gMCDRR to simulate 7.8e11 photons and 246.5 s for gMMC to simulate 1.4e10 paths; both results attained the same ~2% statistical uncertainty. Hence, a speed-up factor of ~45.3 was achieved by the new path-by-path simulation scheme, in which all the computation is spent on photons contributing to the detector signal. Conclusion: We proposed a novel path-by-path simulation scheme that enables a significant efficiency enhancement for MC particle…
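
    The acceptance step at the heart of the path-by-path scheme is the standard Metropolis-Hastings rule. A generic sketch is shown below with a placeholder one-dimensional target density; in the actual application, the "state" is an entire photon path scored by transport physics, which is not reproduced here.

```python
# Generic Metropolis-Hastings accept/reject loop, as used conceptually by
# the path-by-path scheme above. The target density is a placeholder 1-D
# function; the real application scores entire photon paths.
import math
import random

def target(x):                       # placeholder un-normalized density
    return math.exp(-0.5 * x * x)

def metropolis_hastings(n_steps, step=1.0, x0=0.0):
    x, samples = x0, []
    for _ in range(n_steps):
        proposal = x + random.uniform(-step, step)       # symmetric proposal
        accept_prob = min(1.0, target(proposal) / target(x))
        if random.random() < accept_prob:
            x = proposal                                  # accept the new "path"
        samples.append(x)                                 # else keep the old one
    return samples

s = metropolis_hastings(100_000)
print(sum(s) / len(s))   # should be near 0 for the placeholder target
```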

  17. The ASTRA Toolbox: A platform for advanced algorithm development in electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Aarle, Wim van, E-mail: wim.vanaarle@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Palenstijn, Willem Jan, E-mail: willemjan.palenstijn@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde & Informatica, Science Park 123, NL-1098 XG Amsterdam (Netherlands); De Beenhouwer, Jan, E-mail: jan.debeenhouwer@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Altantzis, Thomas, E-mail: thomas.altantzis@uantwerpen.be [Electron Microscopy for Materials Science, University of Antwerp, Groenenborgerlaan 171, B-2020 Wilrijk (Belgium); Bals, Sara, E-mail: sara.bals@uantwerpen.be [Electron Microscopy for Materials Science, University of Antwerp, Groenenborgerlaan 171, B-2020 Wilrijk (Belgium); Batenburg, K. Joost, E-mail: joost.batenburg@cwi.nl [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde & Informatica, Science Park 123, NL-1098 XG Amsterdam (Netherlands); Mathematical Institute, Leiden University, P.O. Box 9512, NL-2300 RA Leiden (Netherlands); Sijbers, Jan, E-mail: jan.sijbers@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-10-15

    We present the ASTRA Toolbox as an open platform for 3D image reconstruction in tomography. Most of the software tools that are currently used in electron tomography offer limited flexibility with respect to the geometrical parameters of the acquisition model and the algorithms used for reconstruction. The ASTRA Toolbox provides an extensive set of fast and flexible building blocks that can be used to develop advanced reconstruction algorithms, effectively removing these limitations. We demonstrate this flexibility, the resulting reconstruction quality, and the computational efficiency of this toolbox by a series of experiments, based on experimental dual-axis tilt series. - Highlights: • The ASTRA Toolbox is an open platform for 3D image reconstruction in tomography. • Advanced reconstruction algorithms can be prototyped using the fast and flexible building blocks. • This flexibility is demonstrated on a common use case: dual-axis tilt series reconstruction with prior knowledge. • The computational efficiency is validated on an experimentally measured tilt series.

  18. Reinforcement Toolbox, a Parametric Reinforcement Modelling Tool for Curved Surface Structures

    NARCIS (Netherlands)

    Lauppe, J.; Rolvink, A.; Coenders, J.L.

    2013-01-01

    This paper presents a computational strategy and parametric modelling toolbox which aim to enhance the design and production process of reinforcement in freeform curved surface structures. The computational strategy encompasses the necessary steps of raising an architectural curved surface model…

  19. A microfluidic toolbox for the development of in-situ product removal strategies in biocatalysis

    DEFF Research Database (Denmark)

    Heintz, Søren; Mitic, Aleksandar; Ringborg, Rolf Hoffmeyer

    2016-01-01

    A microfluidic toolbox for accelerated development of biocatalytic processes has great potential. This is especially the case for the development of advanced biocatalytic process concepts, where reactors and product separation methods are closely linked together to intensify the process performance…

  1. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    Science.gov (United States)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

    A safety instrumented system (SIS) is designed to restore a plant to a safe condition when a pre-hazardous event occurs. It has a vital role, especially in the process industries. A SIS must meet its safety requirement specification, and to confirm this, the SIS must be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation, developed on the basis of an improved continuous-time Markov chain, which supports a detailed evaluation approach. This paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan its maintenance strategy.
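
    The continuous-time Markov chain idea can be illustrated on the simplest possible case: a single channel with constant failure and repair rates, whose time-dependent unavailability follows from the matrix exponential of the generator. The rates below are invented for illustration, and the toolbox's improved CTMC approach is considerably more detailed.

```python
# Minimal sketch of a CTMC evaluation of a one-channel SIS:
# state 0 = working, state 1 = failed, with failure rate lam and
# repair rate mu. Rates are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-5, 1e-2           # per-hour failure and repair rates
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])   # generator matrix (rows sum to zero)

p0 = np.array([1.0, 0.0])      # start in the working state
hours = np.linspace(0.0, 8760.0, 50)
# Probability of being in the failed state at each time: p(t) = p0 expm(Q t)
unavail = [(p0 @ expm(Q * t))[1] for t in hours]

print("steady-state unavailability ~", lam / (lam + mu))
print("mean unavailability over one year ~", np.mean(unavail))
```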

  2. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    Directory of Open Access Journals (Sweden)

    David M. Zalk

    2011-06-01

    Conclusion: The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  3. Fusion Simulation Project. Workshop Sponsored by the U.S. Department of Energy, Rockville, MD, May 16-18, 2007

    Energy Technology Data Exchange (ETDEWEB)

    Kritz, A.; Keyes, D.

    2007-05-18

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  4. Computer simulating observations of the Lunar physical libration for the Japanese Lunar project ILOM

    Science.gov (United States)

    Petrova, Natalia; Hanada, Hideo

    2010-05-01

Within the second stage of the Japanese space mission SELENE-2 (Hanada et al. 2009), the project ILOM (In-situ Lunar Orientation Measurement), planned for after 2017, is an instrument for positioning on the Moon. It will be set near a lunar pole and will determine the parameters of the lunar physical libration by positioning several tens of stars in its field of view regularly for longer than one year. The presented work is dedicated to the analysis of computer simulations of the future observations. It is proposed that, for every star crossing the lunar prime meridian, its polar distance will be measured. Methods of optimal star observation are being developed for the future experiment. Equations are constructed to determine the libration angles τ(t), ρ(t), σ(t) on the basis of the observed polar distances p_obs: f1(τ, ρ, Iσ, p_obs) = 0, f2(τ, ρ, Iσ, p_obs) = 0, f3(τ, ρ, Iσ, p_obs) = 0, or compactly f(X) = 0, where f = (f1, f2, f3)^T and X = (τ, ρ, Iσ)^T (1). At the present stage we have developed the software for the selection of stars for these future polar observations. Stars were taken from various stellar catalogues, such as UCAC2-BSS, Hipparcos, Tycho and FK6. The software reduces the ICRS coordinates of a star to the selenographic system at the epoch of observation (Petrova et al., 2009). For example, for the epochs 2017-2018 more than 50 stars brighter than m = 12 were selected for the northern pole. In total, these stars give about 600 crossings of the prime meridian during one year. Nevertheless, only a few stars (2-5) may be observed in the vicinity of any one moment, which is not a sufficient sample to exclude various kinds of errors. The software includes programmes which can determine the moment of transit of a star across the meridian and the theoretical values of the libration angles at these moments. A serious problem arises when we try to solve equations (1) to determine the libration angles on the basis of simulated p_obs. Polar distances
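Solving a nonlinear 3x3 system of the form f(X) = 0 for X = (τ, ρ, Iσ), as in Eq. (1), is a standard root-finding task; the sketch below does this for an invented toy system with SciPy. The three functions are placeholders for illustration only, not the actual libration equations.

```python
# Sketch: solve a 3x3 nonlinear system f(X) = 0 with X = (tau, rho, I*sigma),
# as in Eq. (1). The three functions are invented stand-ins, not the real
# libration equations, which couple the angles to the observed polar distances.
import numpy as np
from scipy.optimize import root

def f(X):
    tau, rho, isig = X
    return [np.cos(tau) - rho - 0.9,     # placeholder f1
            np.sin(rho) + isig - 0.1,    # placeholder f2
            tau + rho * isig - 0.05]     # placeholder f3

sol = root(f, x0=[0.0, 0.0, 0.0], method="hybr")
print("converged:", sol.success, "X =", sol.x)
```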

  9. Non-stationary Condition Monitoring of large diesel engines with the AEWATT toolbox

    DEFF Research Database (Denmark)

    Pontoppidan, Niels Henrik; Larsen, Jan; Sigurdsson, Sigurdur

    2005-01-01

We are developing a specialized toolbox for non-stationary condition monitoring of large 2-stroke diesel engines based on acoustic emission measurements. The main contribution of this toolbox has so far been the utilization of adaptive linear models such as Principal and Independent Component Analysis... The inversion of those angular timing changes, called "event alignment", has allowed for condition monitoring across operational load settings, successfully enabling a single model to be used with realistic data under varying operational conditions...

  10. Intelligent power plant simulator for educational purposes

    International Nuclear Information System (INIS)

    Seifi, A.; Seifi, H.; Ansari, M. R.; Parsa Moghaddam, M.

    2001-01-01

An Intelligent Tutoring System can be effectively employed in a power plant simulator so that the need for an instructor is minimized. In this paper, using this concept together with object-oriented programming and the SIMULINK Toolbox of MATLAB, an intelligent tutoring power plant simulator is proposed. Its successful application to a typical 11 MW power plant is demonstrated

  11. Projectables

    DEFF Research Database (Denmark)

    Rasmussen, Troels A.; Merritt, Timothy R.

    2017-01-01

CNC cutting machines have become essential tools for designers and architects, enabling rapid prototyping, model-building and production of high-quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would previously have been thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts, reducing material waste. Contextual interviews and ideation sessions led to a deeper...

  12. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  13. On the influence of simulated SST warming on rainfall projections in the Indo-Pacific domain: an AGCM study

    Science.gov (United States)

    Zhang, Huqiang; Zhao, Y.; Moise, A.; Ye, H.; Colman, R.; Roff, G.; Zhao, M.

    2018-02-01

Significant uncertainty exists in regional climate change projections, particularly for rainfall and other hydro-climate variables. In this study, we conduct a series of Atmospheric General Circulation Model (AGCM) experiments with the different future sea surface temperature (SST) warming simulated by a range of coupled climate models. These allow us to assess the extent to which uncertainty in current coupled climate model rainfall projections can be attributed to their simulated SST warming. Nine CMIP5 model-simulated global SST warming anomalies have been superimposed onto the current SSTs simulated by the Australian climate model ACCESS1.3. The ACCESS1.3 SST-forced experiments closely reproduce the rainfall means and interannual variations of its own fully coupled experiments. Although different global SST warming intensities explain well the inter-model difference in global mean precipitation changes, at regional scales the SST influence varies significantly. SST warming explains about 20-25% of the pattern of precipitation changes over the oceans in the Indo-Pacific domain in four or five of the models, but there are also a couple of models in which the different SST warming explains little of their precipitation pattern changes. The influence is weaker again for rainfall changes over land. Roughly similar levels of contribution can be attributed to the different atmospheric responses to SST warming in these models. The weak SST influence in our study could be due to the experimental setup applied: superimposing different SST warming anomalies onto the same SSTs simulated for the current climate by ACCESS1.3, rather than directly using model-simulated past and future SSTs. Similar modelling and analysis from other modelling groups, with more carefully designed experiments, are needed to tease out uncertainties caused by different SST warming patterns, different SST mean biases and different model physical/dynamical responses to the same underlying

  14. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    Directory of Open Access Journals (Sweden)

    Rasmus Magnusson

    2017-06-01

Full Text Available Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODEs) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady-state and dynamic-response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high-performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps: first, it infers a non-linear ODE system for the pre-specified core gene expression; second, LASSIM optimizes in parallel the parameters that model the regulation of peripheral genes by the core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2-specific bindings and time-series data together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings. In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models
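As a schematic picture of the core-gene step, the sketch below fits the parameters of a toy two-gene regulatory ODE to noisy expression time-series. It is an illustration only: the model, parameter names and data are invented, and it uses SciPy least-squares rather than LASSIM's PyGMO-based optimization.

```python
# Toy sketch of mechanistic ODE inference for a two-gene core system.
# All names, equations and data are hypothetical illustrations, not LASSIM code.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def core_odes(x, t, k1, k2, d1, d2):
    """dx/dt for two core genes: gene 1 activates gene 2 via a Hill-type term."""
    g1, g2 = x
    dg1 = k1 - d1 * g1                    # constitutive production, linear decay
    dg2 = k2 * g1 / (1.0 + g1) - d2 * g2  # saturating activation by gene 1
    return [dg1, dg2]

t = np.linspace(0, 10, 25)
true = (1.0, 2.0, 0.5, 0.8)               # "unknown" parameters to recover
data = odeint(core_odes, [0.1, 0.1], t, args=true)
data += np.random.normal(0, 0.02, data.shape)  # measurement noise

def residuals(p):
    sim = odeint(core_odes, [0.1, 0.1], t, args=tuple(p))
    return (sim - data).ravel()

fit = least_squares(residuals, x0=[0.5, 0.5, 0.5, 0.5], bounds=(0, 10))
print("estimated parameters:", fit.x)
```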

  15. Dem Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox

    Science.gov (United States)

    Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.

    2017-10-01

Digital elevation models (DEMs) are widely used raster data for applications concerning terrain, such as flood modelling, viewshed analysis, mining, land development, and engineering design projects, to name a few. DEMs can be obtained through various methods, including topographic surveys, LiDAR, photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one alternative method to produce DEMs, through the processing of images with photogrammetry software. Powerful commercial photogrammetry software can produce high-accuracy DEMs, but this entails a corresponding cost; although some of these packages offer free or demo trials, the trials limit the usable features and usage time. One alternative is the use of free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas, such as in mining or construction excavation, a relatively inexpensive, fast and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open-pit excavation. PPT was extended with an algorithm converting the generated point cloud data into a usable DEM.
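The point-cloud-to-DEM step can be pictured with a short gridding sketch. This is a generic rasterization, not the actual PPT extension; the stand-in point cloud and the cell size are assumptions.

```python
# Minimal sketch: rasterize an (x, y, z) point cloud into a DEM grid.
# Generic gridding, not the actual PPT extension; all values are assumed.
import numpy as np
from scipy.interpolate import griddata

points = np.random.rand(5000, 3) * [100.0, 100.0, 20.0]  # stand-in point cloud

res = 1.0  # cell size in the same units as x and y
xi = np.arange(points[:, 0].min(), points[:, 0].max(), res)
yi = np.arange(points[:, 1].min(), points[:, 1].max(), res)
gx, gy = np.meshgrid(xi, yi)

# Interpolate scattered elevations onto the regular grid (linear inside the
# convex hull, NaN outside); nearest-neighbour could fill the remaining gaps.
dem = griddata(points[:, :2], points[:, 2], (gx, gy), method="linear")
print("DEM shape:", dem.shape)
```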

  16. The Triticeae Toolbox: Combining Phenotype and Genotype Data to Advance Small-Grains Breeding

    Directory of Open Access Journals (Sweden)

    Victoria C. Blake

    2016-07-01

Full Text Available The Triticeae Toolbox (T3) is the database schema enabling plant breeders and researchers to combine, visualize, and interrogate the wealth of phenotype and genotype data generated by the Triticeae Coordinated Agricultural Project (TCAP). T3 enables users to define specific data sets for download in formats compatible with the external tools TASSEL, Flapjack, and R, or for use by software residing on the T3 server for operations such as Genome-Wide Association and Genomic Prediction. New T3 tools to assist plant breeders include a Selection Index Generator, analytical tools to compare phenotype trials using common or user-defined indices, and a histogram generator for nursery reports, with applications using the Android OS and a Field Plot Layout Designer in development. Researchers using T3 will soon enjoy the ability to design training sets, define core germplasm sets, and perform multivariate analysis. Increased collaboration with GrainGenes and integration with the small-grains reference sequence resources will place T3 in a pivotal role for on-the-fly data analysis, with instant access to the knowledge databases for wheat and barley. T3 software is available under the GNU General Public License and is freely downloadable.

  17. Preliminary simulation of degassing of natural gases dissolved in groundwater during shaft excavation in Horonobe underground research project

    International Nuclear Information System (INIS)

    Yamamoto, Hajime; Shimo, Michito; Kunimaru, Takanori; Kurikami, Hiroshi

    2007-01-01

In Neogene-Quaternary sedimentary basins, natural gases such as methane are often significantly dissolved in groundwater. In this paper, two-phase flow simulations incorporating the degassing of methane and carbon dioxide were performed for the shaft excavation in the Horonobe underground research project. The results drawn from the simulations are summarized as follows: 1) as depth increases, degassing and gas inflow occur significantly; 2) degassing increases the compressibility of the pore fluids, resulting in slow changes in groundwater pressures; 3) although the occurrence of a gas phase decreases water mobility, the influence of the dissolved gas on the groundwater inflow rate to the shaft was small. (author)

  18. Toolbox for Research and Exploration (TREX): Investigations of Fine-Grained Materials on Small Bodies

    Science.gov (United States)

Domingue, D. L.; Allain, J.-P.; Banks, M.; Christoffersen, R.; Cintala, M.; Clark, R.; Cloutis, E.; Graps, A.; Hendrix, A. R.; Hsieh, H.

    2018-01-01

The Toolbox for Research and Exploration (TREX) is a NASA SSERVI (Solar System Exploration Research Virtual Institute) node. TREX (trex.psi.edu) aims to decrease risk to future missions, specifically to the Moon, the Martian moons, and near-Earth asteroids, by improving mission success and assuring the safety of astronauts, their instruments, and spacecraft. TREX studies will focus on characteristics of the fine grains that cover the surfaces of these target bodies: their spectral characteristics and the potential resources (such as H2O) they may harbor. TREX studies are organized into four Themes (Laboratory Studies, Moon Studies, Small-Bodies Studies, and Field Work). In this presentation, we focus on the work targeted by the Small-Bodies Theme, which delves into several topics, many of which overlap or are synergistic with the other TREX Themes. The main topics include photometry, spectral modeling, laboratory simulations of space weathering processes relevant to asteroids, the assembly of an asteroid regolith database, the dichotomy between nuclear and reflectance spectroscopy, and the dynamical evolution of asteroids and the implications for the retention of volatiles.

  19. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    Science.gov (United States)

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  20. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this report concern mainly two issues: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations

  1. Simulation and Prediction of Groundwater Pollution from Planned Feed Additive Project in Nanning City Based on GMS Model

    Science.gov (United States)

    Liang, Yimin; Lan, Junkang; Wen, Zhixiong

    2018-01-01

In order to predict the pollution of underground aquifers and rivers by the proposed project, a specialized hydrogeological investigation was carried out. Through hydrogeological surveying and mapping, drilling, and groundwater level monitoring, the extent of the hydrogeological unit and the regional hydrogeological conditions were determined. The permeability coefficients of the aquifers were also obtained by borehole water-injection tests. To predict the impact of the project on the groundwater environment, the GMS software was used for numerical simulation. The simulation results show that if an unexpected sewage leakage accident happens, the pollutants will be gradually diluted by groundwater, and the diluted contaminants will slowly spread to the southeast with the groundwater flow, eventually discharging into the Gantang River. However, the process of the pollutants discharging into the river is very long, and the long-term dilution of the river water will keep the Gantang River from being polluted.
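The transport behaviour described here (gradual dilution and slow down-gradient spreading) can be illustrated with a minimal 1D advection-dispersion finite-difference sketch. This is a generic textbook scheme, not the GMS model used in the study, and all parameter values are invented.

```python
# Minimal explicit finite-difference sketch of 1D advection-dispersion:
# dC/dt = D * d2C/dx2 - v * dC/dx. Generic illustration, not the GMS model.
import numpy as np

L, nx = 1000.0, 201          # domain length [m], grid points
dx = L / (nx - 1)
v, D = 0.5, 2.0              # velocity [m/d], dispersion [m2/d] (assumed)
dt = 0.4 * min(dx / v, dx**2 / (2 * D))  # respect advection/diffusion limits

C = np.zeros(nx)
C[0] = 1.0                   # constant-concentration source at the site

for _ in range(int(365 / dt)):            # one year of transport
    adv = -v * (C[1:-1] - C[:-2]) / dx    # upwind advection
    dsp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
    C[1:-1] += dt * (adv + dsp)
    C[0], C[-1] = 1.0, 0.0   # fixed boundaries (river far downstream)

print("plume front (C < 0.01) at x =", dx * np.argmax(C < 0.01), "m")
```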

  2. Versatile Cas9-Driven Subpopulation Selection Toolbox for Lactococcus lactis.

    Science.gov (United States)

    van der Els, Simon; James, Jennelle K; Kleerebezem, Michiel; Bron, Peter A

    2018-04-15

CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector pLABTarget, encoding the Streptococcus pyogenes Cas9 under the control of a constitutive promoter, was constructed, allowing plug-and-play introduction of short guide RNA (sgRNA) sequences to target specific genetic loci. Introduction of a pepN-targeting derivative of pLABTarget into L. lactis strain MG1363 led to a strong reduction in the number of transformants obtained, which did not occur in a pepN deletion derivative of the same strain, demonstrating the specificity and lethality of the Cas9-mediated double-strand breaks in the lactococcal chromosome. Moreover, the same pLABTarget derivative allowed the selection of a pepN deletion subpopulation from its corresponding single-crossover plasmid integrant precursor, accelerating the construction and selection of gene-specific deletion derivatives in L. lactis. Finally, pLABTarget, which contained sgRNAs designed to target mobile genetic elements, allowed the effective curing of plasmids, prophages, and integrative conjugative elements (ICEs). These results establish that pLABTarget enables the effective exploitation of Cas9 targeting in L. lactis, while the broad-host-range vector used suggests that this toolbox could readily be expanded to other Gram-positive bacteria. IMPORTANCE Mobile genetic elements in Lactococcus lactis and other lactic acid bacteria (LAB) play an important role in dairy fermentation, having both positive and detrimental effects during the production of fermented dairy products. The pLABTarget vector offers an efficient cloning platform for Cas9 application in lactic acid bacteria. Targeting Cas9 toward mobile genetic elements enabled their effective curing, which is of particular interest in the
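Choosing an sgRNA target such as the pepN protospacer amounts to scanning the locus for 20-nt sequences followed by an S. pyogenes NGG PAM. The sketch below shows that scan on an invented sequence; it is a generic illustration and has nothing to do with pLABTarget's cloning workflow.

```python
# Sketch: find candidate S. pyogenes Cas9 protospacers (20 nt + NGG PAM)
# on both strands of a target locus. The sequence here is invented.
import re

def revcomp(seq):
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def find_protospacers(seq):
    """Return (offset, strand, 20-nt protospacer) for every NGG PAM site."""
    hits = []
    for strand, s in (("+", seq), ("-", revcomp(seq))):
        # lookahead so that overlapping PAM sites are all reported
        for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", s):
            hits.append((m.start(), strand, m.group(1)))
    return hits

locus = "ATGAAACTGGTTGACGAAATCGGTCGTAAAGCTGGTTTAGGTCCTAAATGA"
for pos, strand, proto in find_protospacers(locus):
    print(f"{strand} strand, offset {pos}: {proto}")
```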

  3. European Studies and Public Engagement: A Conceptual Toolbox

    Directory of Open Access Journals (Sweden)

    Andreas Müllerleile

    2014-11-01

Full Text Available This article examines public engagement strategies for academics working in the field of European Studies. Should academics engage with the public? What are the most effective outreach strategies? And what are the implications for universities and departments? The article argues that engaging with the public should be considered an integral part of the work of academics working on topics that relate to the European Union or European politics. The article has a theoretical and a practical dimension. The first part of the paper deals with the nature of public engagement, explaining why it is an important issue and how it differs from the mainstream understanding of public engagement. The practical part of the paper presents the idea of building an online presence through which academics can engage with the public debate both during periods of low issue salience and high issue salience. The final section includes a toolbox

  4. The Cea multi-scale and multi-physics simulation project for nuclear applications

    International Nuclear Information System (INIS)

    Ledermann, P.; Chauliac, C.; Thomas, J.B.

    2005-01-01

Full text of publication follows. Today, numerical modelling is universally recognized as an essential tool for the capitalization, integration and sharing of knowledge. For this reason, it has become the central tool of research. Until now, the CEA has developed a set of scientific software allowing the modelling, in each situation, of the operation of all or part of a nuclear installation, and these codes are widely used in the nuclear industry. For the future, however, it is essential to aim for better accuracy, better control of uncertainties and better performance in computing times. The objective is to obtain validated models allowing accurate predictive calculations for actual complex nuclear problems, such as fuel behaviour in accidental situations. This requires mastering a large and interactive set of phenomena ranging from nuclear reactions to heat transfer. To this end, the CEA, together with industrial partners (EDF, Framatome-ANP, ANDRA), has designed an integrated calculation platform devoted to the study of nuclear systems and intended both for industry and for scientists. The development of this platform is under way with the start in 2005 of the integrated project NURESIM, with 18 European partners. Improvement comes not only through a multi-scale description of all phenomena but also through an innovative design approach requiring deep functional analysis upstream of the development of the simulation platform itself. In addition, the studies of future nuclear systems are increasingly multidisciplinary (simultaneous modelling of core physics, thermal-hydraulics and fuel behaviour). These multi-physics and multi-scale aspects make it mandatory to pay very careful attention to software architecture issues. A global platform is thus being developed integrating dedicated specialized platforms: DESCARTES for core physics, NEPTUNE for thermal-hydraulics, PLEIADES for fuel behaviour, SINERGY for materials behaviour under irradiation, ALLIANCES for the performance

  5. Outline of ATE Novoronezh Training Centre simulator V-187 upgrade project

    International Nuclear Information System (INIS)

    2006-01-01

Full-scope simulator V-187 was put into operation in 1987. Novovoronezh NPP Unit 5 is the reference unit for the simulator. The simulator computer engineering includes: - a central computer, IBM 4381, to calculate the simulator mathematical models and to maintain the Instructor's Workstation; - 2 CM-2M computers to control the graphical information display subsystem; - 3 computers (object link computing terminals, OLCT) to control the analog and discrete information input-output; - specially designed discrete-information compression devices to connect the OLCT computers with the control panels; - the alpha-numeric display of the IBM 4381 computer, used as the Instructor's Workstation. Computer languages: FORTRAN, Assembler

  6. Toolbox for the design of LiNbO3-based passive and active integrated quantum circuits

    Science.gov (United States)

    Sharapova, P. R.; Luo, K. H.; Herrmann, H.; Reichelt, M.; Meier, T.; Silberhorn, C.

    2017-12-01

We present and discuss perspectives of current developments on advanced quantum optical circuits monolithically integrated in the lithium niobate platform. A set of basic components, comprising photon-pair sources based on parametric down-conversion (PDC), passive routing elements, and active electro-optically controllable switches and polarisation converters, are the building blocks of a toolbox which is the basis for a broad range of diverse quantum circuits. We review the state of the art of these components and provide models that properly describe their performance in quantum circuits. As an example of the application of these models, we discuss design issues for a circuit providing on-chip two-photon interference. The circuit comprises a PDC section for photon-pair generation followed by an actively controllable modified Mach-Zehnder structure for observing Hong-Ou-Mandel interference. The performance of such a chip is simulated theoretically, taking even imperfections of the properties of the individual components into account.
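As a back-of-the-envelope picture of the Hong-Ou-Mandel measurement such a chip performs, the coincidence probability at a balanced beam splitter is often modelled as P(τ) = ½[1 − V·exp(−(τ/τc)²)] for relative delay τ, visibility V and coherence time τc. This simple Gaussian-dip model and its parameter values are illustrative assumptions, not the paper's component-level simulation.

```python
# Sketch: textbook Gaussian Hong-Ou-Mandel dip for a 50:50 beam splitter.
# P(tau) = 0.5 * (1 - V * exp(-(tau/tau_c)**2)); all values are assumptions.
import numpy as np

V = 0.95          # interference visibility, limited by component imperfections
tau_c = 1.0       # photon coherence time (arbitrary units)
tau = np.linspace(-3, 3, 13)

P = 0.5 * (1 - V * np.exp(-(tau / tau_c) ** 2))
for t, p in zip(tau, P):
    print(f"delay {t:+.1f}: coincidence probability {p:.3f}")
```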

  7. Solar cooling. Dynamic computer simulations and parameter variations; Solare Kuehlung. Dynamische Rechnersimulationen und Parametervariationen

    Energy Technology Data Exchange (ETDEWEB)

    Adam, Mario; Lohmann, Sandra [Fachhochschule Duesseldorf (Germany). E2 - Erneuerbare Energien und Energieeffizienz

    2011-05-15

The research project 'Solar cooling in the Hardware-in-the-Loop Test' is funded by the BMBF and deals with the modelling of a pilot plant for solar cooling, built around the 17.5 kW absorption chiller of Yazaki, in the simulation environment MATLAB/Simulink with the toolboxes Stateflow and CARNOT. Dynamic simulations and parameter variations according to the work-efficient methodology of design of experiments are used to select meaningful system configurations, control strategies and the dimensioning of the components. The results of these simulations are presented, and an outlook is given on the use of the acquired knowledge for the planned laboratory tests on a hardware-in-the-loop test stand. (orig.)

  8. Historical simulations and climate change projections over India by NCAR CCSM4: CMIP5 vs. NEX-GDDP

    Science.gov (United States)

    Sahany, Sandeep; Mishra, Saroj Kanta; Salunke, Popat

    2018-03-01

A new bias-corrected statistically downscaled product, namely the NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP), has recently been developed by NASA to help the scientific community in climate change impact studies at local to regional scales. In this work, the product is validated over India and its added value relative to its CMIP5 counterpart for the NCAR CCSM4 model is analyzed, followed by climate change projections under the RCP8.5 global warming scenario using the two datasets for the variables daily maximum 2-m air temperature (Tmax), daily minimum 2-m air temperature (Tmin), and rainfall. It is found that, overall, CCSM4-NEX-GDDP significantly reduces many of the biases in CCSM4-CMIP5 for the historical simulations; however, some biases, such as the significant overestimation of the frequency of occurrence in the lower tails of Tmax and Tmin, still remain. In regard to rainfall, an important value addition in CCSM4-NEX-GDDP is the alleviation of the significant underestimation of rainfall extremes found in CCSM4-CMIP5. The projected Tmax from CCSM4-NEX-GDDP is in general higher than that projected by CCSM4-CMIP5, suggesting that the risks of heat waves and very hot days could be higher than projected by the latter. CCSM4-NEX-GDDP projects the frequency of occurrence of the upper extreme values of historical Tmax to increase by a factor of 100 towards the end of the century (as opposed to the factor of 10 increase projected by CCSM4-CMIP5). In regard to rainfall, both CCSM4-CMIP5 and CCSM4-NEX-GDDP project an increase in annual rainfall over India under the RCP8.5 global warming scenario, progressively from the near term through the far term. However, CCSM4-NEX-GDDP consistently projects a higher magnitude of increase, and over a larger area, than CCSM4-CMIP5. Projected daily rainfall distributions from CCSM4-CMIP5 and CCSM4-NEX-GDDP suggest the occurrence of events that have no historical precedents

  9. Integrating Portfolio Management and Simulation Concepts in the ERP Project Estimation Practice

    NARCIS (Netherlands)

    Daneva, Maia; Paech, B.; Rolland, C

    2008-01-01

    This paper presents a two-site case study on requirements-based effort estimation practices in enterprise resource planning projects. Specifically, the case study investigated the question of how to handle qualitative data and highly volatile values of project context characteristics. We counterpart

  10. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm

    2014-01-01

The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective...

  11. Incorporating Reflective Practice into Team Simulation Projects for Improved Learning Outcomes

    Science.gov (United States)

    Wills, Katherine V.; Clerkin, Thomas A.

    2009-01-01

    The use of simulation games in business courses is a popular method for providing undergraduate students with experiences similar to those they might encounter in the business world. As such, in 2003 the authors were pleased to find a classroom simulation tool that combined the decision-making and team experiences of a senior management group with…

  12. Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.

    Science.gov (United States)

    Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2017-11-01

Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion-weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation, where treatment currents can function as imaging currents. To avoid the adverse effects of nerve and muscle stimulation due to injected currents, CTI utilizes B1 mapping and multi-b diffusion-weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstruction in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 functions which can be used in the MATLAB environment, and it is available at http://iirc.khu.ac.kr/software.html. Its functions were tested using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.

  13. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    Science.gov (United States)

    Silva, Ikaro; Moody, George B

The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals, including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.
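For readers working in Python rather than MATLAB/Octave, the separate wfdb Python package offers similar access to PhysioNet records. A minimal sketch, assuming a recent version of that package is installed and the machine has network access to PhysioNet, might look like this (it is not the MATLAB/Octave toolbox described above):

```python
# Minimal sketch using the Python wfdb package (a sibling of the
# MATLAB/Octave toolbox described above, not the toolbox itself).
import wfdb

# Read the first 3600 samples of record 100 of the MIT-BIH Arrhythmia
# Database directly from PhysioNet, plus its beat annotations.
record = wfdb.rdrecord("100", pn_dir="mitdb", sampto=3600)
annotation = wfdb.rdann("100", "atr", pn_dir="mitdb", sampto=3600)

print("signals:", record.sig_name)         # e.g. ['MLII', 'V5']
print("sampling frequency:", record.fs)    # Hz
print("first beat annotations:", annotation.symbol[:10])
```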

  14. Rapid generation of control parameters of Multi-Infeed system through online simulation

    Directory of Open Access Journals (Sweden)

    Rasool Aazim

    2017-05-01

Full Text Available A Simulated Self-Generated Particle Swarm Optimization (SSG-PSO) toolbox that automatically and quickly generates PI control parameters in PSCAD has been designed. The toolbox operates by using transient simulation to evaluate the objective function and converges the fitness values of the objective function through the PSO algorithm during run-time simulation of multi-infeed HVDC systems. An Integral Square Error objective function (ISE-OF) is used to accomplish the task. To make the toolbox faster, ranges are set for the PSO-generated values that limit the data-acquisition time for the objective function by considering only the transition time of the system. The toolbox is capable of optimizing multiple controllers at the same time. The PI values are generated quickly, and the results show markedly improved system performance during startup and under fault conditions. The experimental results are presented in this paper.
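A minimal self-contained sketch of the underlying idea (PSO minimizing the ISE, ∫e(t)²dt, of a closed-loop step response) is given below. The first-order plant, the PSO constants and the gain ranges are invented for illustration and have nothing to do with PSCAD or the actual toolbox internals.

```python
# Sketch: tune PI gains (Kp, Ki) by particle swarm optimization, scoring each
# candidate with the Integral Square Error of a simulated step response.
# Plant, PSO constants and gain ranges are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.001, 3.0                      # simulation step and horizon [s]

def ise(gains):
    """Euler-simulate a first-order plant (tau=0.5, gain=2) under PI control."""
    kp, ki = gains
    y = integ = cost = 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - y                     # unit step reference
        integ += e * dt
        u = kp * e + ki * integ         # PI control law
        y += dt * (-y + 2.0 * u) / 0.5  # plant: tau*dy/dt = -y + K*u
        cost += e * e * dt              # ISE accumulation
    return cost

# Standard global-best PSO over the gain box [0, 10] x [0, 10].
n, iters, w, c1, c2 = 20, 40, 0.7, 1.5, 1.5
x = rng.uniform(0, 10, (n, 2))
v = np.zeros((n, 2))
pbest, pcost = x.copy(), np.array([ise(p) for p in x])
gbest = pbest[np.argmin(pcost)]

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0, 10)
    cost = np.array([ise(p) for p in x])
    better = cost < pcost
    pbest[better], pcost[better] = x[better], cost[better]
    gbest = pbest[np.argmin(pcost)]

print(f"tuned gains Kp={gbest[0]:.2f}, Ki={gbest[1]:.2f}, ISE={pcost.min():.4f}")
```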

  15. ALEA: a toolbox for allele-specific epigenomics analysis.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Heravi-Moussavi, Alireza; Cheng, Jeffrey B; Costello, Joseph F; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2014-04-15

The assessment of expression and epigenomic status using sequencing-based methods provides an unprecedented opportunity to identify and correlate allelic differences with epigenomic status. We present ALEA, a computational toolbox for allele-specific epigenomics analysis, which incorporates allelic variation data within existing resources, allowing for the identification of significant associations between epigenetic modifications and specific allelic variants in human and mouse cells. ALEA provides a customizable pipeline of command-line tools for allele-specific analysis of next-generation sequencing data (ChIP-seq, RNA-seq, etc.) that takes the raw sequencing data and produces separate allelic tracks ready to be viewed on genome browsers. The pipeline has been validated using human and hybrid mouse ChIP-seq and RNA-seq data. The package, test data and usage instructions are available online at http://www.bcgsc.ca/platform/bioinfo/software/alea. Contact: mkarimi1@interchange.ubc.ca or sjones@bcgsc.ca. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. MATLAB Stability and Control Toolbox Trim and Static Stability Module

    Science.gov (United States)

    Kenny, Sean P.; Crespo, Luis

    2012-01-01

MATLAB Stability and Control Toolbox (MASCOT) utilizes geometric, aerodynamic, and inertial inputs to calculate air vehicle stability in a variety of critical flight conditions. The code is based on fundamental, nonlinear equations of motion and is able to translate results into a qualitative, graphical scale useful to the non-expert. MASCOT was created to provide the conceptual aircraft designer with accurate predictions of air vehicle stability and control characteristics. The code takes as input mass property data in the form of an inertia tensor, aerodynamic loading data, and propulsion (i.e., thrust) loading data. Using the fundamental nonlinear equations of motion, MASCOT then calculates vehicle trim and static stability data for the desired flight condition(s). Available flight conditions include six horizontal and six landing rotation conditions, with varying options for engine-out, crosswind, and sideslip, plus three take-off rotation conditions. Results are displayed through a unique graphical interface developed to give the conceptual design engineer, who may not be a stability and control expert, a qualitative scale indicating whether the vehicle has acceptable, marginal, or unacceptable static stability characteristics. If desired, the user can also examine the detailed, quantitative results.
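One of the simplest static-stability checks behind tools of this kind is the longitudinal static margin, SM = (x_np − x_cg)/c̄, which is positive for a statically stable aircraft. The sketch below computes it for made-up numbers; it is a generic illustration of such a check, not MASCOT's algorithm or its qualitative scale.

```python
# Sketch: longitudinal static margin check. Generic formula with invented
# numbers and verdict bands; not MASCOT code. Stable when SM > 0.
def static_margin(x_np, x_cg, mac):
    """x_np: neutral point, x_cg: centre of gravity, mac: mean aerodynamic
    chord, all in consistent units measured from the same datum."""
    return (x_np - x_cg) / mac

sm = static_margin(x_np=4.2, x_cg=3.9, mac=1.5)  # metres (assumed)
# The 5% MAC band below is an illustrative threshold, not a MASCOT criterion.
verdict = "acceptable" if sm > 0.05 else "marginal" if sm > 0.0 else "unacceptable"
print(f"static margin = {sm:.1%} MAC -> {verdict}")
```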

  17. Dynamic simulation applied to the socio-environmental management in projects of concentrated infrastructure

    International Nuclear Information System (INIS)

    Diaz E, Mauricio; Pena Z, Gloria Elena

    2004-01-01

This work presents a theoretical and methodological approach to the use of system dynamics for understanding and managing the complex management challenges that appear during the design, construction and operation phases of concentrated infrastructure projects such as ports, large dams and industrial parks. The siting of this kind of project generates socio-environmental impacts in its zone of influence, which requires strategic management by the project owners, not only to comply with current environmental laws but also to ensure the social viability of their projects

  18. Three-dimensional numerical reservoir simulation of the EGS Demonstration Project at The Geysers geothermal field

    Science.gov (United States)

    Borgia, Andrea; Rutqvist, Jonny; Oldenburg, Curt M.; Hutchings, Lawrence; Garcia, Julio; Walters, Mark; Hartline, Craig; Jeanne, Pierre; Dobson, Patrick; Boyle, Katie

    2013-04-01

The Enhanced Geothermal System (EGS) Demonstration Project, currently underway at the Northwest Geysers, California, aims to demonstrate the feasibility of stimulating a deep high-temperature reservoir (up to 400 °C) through water injection over a 2-year period. On October 6, 2011, injection of 25 l/s started from the Prati 32 well at a depth interval of 1850-2699 m below sea level. After a period of almost 2 months, the injection rate was raised to 63 l/s. The flow rate was then decreased to 44 l/s after an additional 3.5 months and subsequently maintained at 25 l/s up to August 20, 2012. Significant wellhead pressure changes were recorded at the Prati State 31 well, which is separated from Prati 32 by about 500 m at reservoir level. More subdued pressure increases occur at greater distances. The water injection caused induced seismicity in the reservoir in the vicinity of the well. Microseismic monitoring and interpretation show that the cloud of seismic events is mainly located in the granitic intrusion below the injection zone, forming a cluster elongated SSE-NNW (azimuth 170°) that dips steeply to the west. In general, the magnitude of the events increases with depth, and the hypocenter depth increases with time. This seismic cloud is hypothesized to correlate with enhanced permeability in the high-temperature reservoir and its variation with time. Based on the existing borehole data, we use the GMS™ GUI to construct a realistic three-dimensional (3D) geologic model of the Northwest Geysers geothermal field. This model includes, from the top down, a low-permeability graywacke layer that forms the caprock for the reservoir, an isothermal steam zone (known as the normal-temperature reservoir) within metagraywacke, a hornfels zone (where the high-temperature reservoir is located), and a felsite layer that is assumed to extend downward to the magmatic heat source. We then map this model onto a rectangular grid for use with the TOUGH2 multiphase, multicomponent, non

  19. CISP: Simulation Platform for Collective Instabilities in the BRing of HIAF project

    Science.gov (United States)

    Liu, J.; Yang, J. C.; Xia, J. W.; Yin, D. Y.; Shen, G. D.; Li, P.; Zhao, H.; Ruan, S.; Wu, B.

    2018-02-01

To simulate collective instabilities during the complicated beam manipulations in the BRing (Booster Ring) of HIAF (High Intensity heavy-ion Accelerator Facility) and other high-intensity accelerators, a code named CISP (Simulation Platform for Collective Instabilities) has been designed and constructed at the Institute of Modern Physics (IMP) in China. CISP is a scalable multi-macroparticle simulation platform that can perform longitudinal and transverse tracking when chromaticity, space-charge effects, nonlinear magnets and wakes are included. Thanks to its object-oriented design, CISP is also a basic platform on which many other applications (like feedback) can be developed. Several simulations completed with CISP in this paper agree very well with analytical results, which shows that CISP is now fully functional and a powerful platform for further collective-instability research in the BRing and other accelerators. In the future, CISP can also be extended easily into a physics control system for HIAF or other facilities.
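The core of a multi-macroparticle tracker is a turn-by-turn map applied to each particle's longitudinal coordinates. The toy sketch below tracks synchrotron oscillations of an offset bunch with a linear drift and an RF kick; it is a generic textbook map with invented machine parameters, not CISP itself.

```python
# Toy multi-macroparticle tracker for longitudinal synchrotron motion.
# Kick-drift map with invented parameters; a generic illustration, not CISP.
import numpy as np

rng = np.random.default_rng(1)
n_macro, n_turns = 5000, 400
nu_s = 0.005                        # small-amplitude synchrotron tune (assumed)
a = 0.05                            # phase slip per unit momentum error (assumed)
b = (2 * np.pi * nu_s) ** 2 / a     # RF kick strength that yields tune nu_s

phi = 0.3 + rng.normal(0.0, 0.1, n_macro)   # phase deviation [rad], offset bunch
delta = rng.normal(0.0, 1e-3, n_macro)      # relative momentum error

centroid = np.empty(n_turns)
for turn in range(n_turns):
    phi = phi + a * delta            # drift: momentum error shifts phase
    delta = delta - b * np.sin(phi)  # kick: RF cavity restores toward phi = 0
    centroid[turn] = phi.mean()

# Centroid oscillates with period ~1/nu_s = 200 turns (minimum near turn 100).
print("first centroid minimum at turn", int(np.argmin(centroid[:250])))
```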

  20. The NINJA-2 project: detecting and characterizing gravitational waveforms modelled using numerical binary black hole simulations

    OpenAIRE

    Aasi, J.; Abbott, B. P.; Abbott, R.; Abernathy, M. R.; Adhikari, Rana X.; Anderson, R.; Anderson, S. B.; Arai, K.; Araya, M. C.; Austin, L.; Barayoga, J. C.; Barish, B. C.; Billingsley, G.; Black, E.; Blackburn, J. K.

    2014-01-01

    The Numerical INJection Analysis (NINJA) project is a collaborative effort between members of the numerical relativity and gravitational-wave (GW) astrophysics communities. The purpose of NINJA is to study the ability to detect GWs emitted from merging binary black holes (BBH) and recover their parameters with next-generation GW observatories. We report here on the results of the second NINJA project, NINJA-2, which employs 60 complete BBH hybrid waveforms consisting of a numerical portion mo...

  1. Evaluation of a Pilot Project to Introduce Simulation-Based Team Training to Pediatric Surgery Trauma Room Care

    Directory of Open Access Journals (Sweden)

    Markus Lehner

    2017-01-01

Full Text Available Introduction. Several studies in pediatric trauma care have demonstrated substantial deficits in both prehospital and emergency department management. Methods. In February 2015 the PAEDSIM collaborative conducted a one-and-a-half-day interdisciplinary, simulation-based team-training course in a simulated pediatric emergency department. 14 physicians from the medical fields of pediatric surgery, pediatric intensive care and emergency medicine, and anesthesia participated, as well as four pediatric nurses. After a theoretical introduction and familiarization with the simulator, course attendees alternately participated in six simulation scenarios and debriefings. Each scenario incorporated elements of pediatric trauma management as well as Crew Resource Management (CRM) educational objectives. Participants completed anonymous pre- and post-course questionnaires and rated the course itself as well as their own medical qualification and knowledge of CRM. Results. Participants found the course very realistic and the selected scenarios highly relevant to their daily work. They reported a feeling of improved medical and nontechnical skills, and no discomfort during scenarios or debriefings. Conclusion. To our knowledge this pilot project represents the first successful implementation of a simulation-based team-training course focused on pediatric trauma care in German-speaking countries, with good acceptance.

  2. Modeling and Simulation of Longitudinal Dynamics for Low Energy Ring-High Energy Ring at the Positron-Electron Project

    International Nuclear Information System (INIS)

    Rivetta, Claudio; Mastorides, T.; Fox, J.D.; Teytelman, D.; Van Winkle, D.

    2007-01-01

A time-domain dynamic modeling and simulation tool for beam-cavity interactions in the Low Energy Ring (LER) and High Energy Ring (HER) at the Positron-Electron Project (PEP-II) is presented. Dynamic simulation results for PEP-II are compared to measurements of the actual machine. The motivation for this tool is to explore the stability margins and performance limits of the PEP-II radio-frequency (RF) systems at future higher currents and upgraded RF configurations. It also serves as a test bed for new control algorithms and can define the ultimate limits of the low-level RF (LLRF) architecture. The time-domain program captures the dynamic behavior of the beam-cavity-LLRF interaction based on a reduced model. The ring current is represented by macrobunches. Multiple RF stations in the ring are represented via one or two macrocavities, each of which captures the overall behavior of the two- or four-cavity RF stations. Station models include nonlinear elements in the klystron and signal processing, which enables modeling the principal longitudinal impedance control loops interacting via the longitudinal beam model. The dynamics of the simulation model are validated by comparing the measured growth rates for the LER with simulation results. The simulated behavior of the LER at increased operating currents is presented via low-mode instability growth rates. Different control strategies are compared, and the effects of both imperfections in the LLRF signal processing and the nonlinear drivers and klystrons are explored

  3. The Drive-Wise Project: Driving Simulator Training increases real driving performance in healthy older drivers

    Directory of Open Access Journals (Sweden)

    Gianclaudio eCasutt

    2014-05-01

Full Text Available Background: Age-related cognitive decline is often associated with unsafe driving behavior. We hypothesized that 10 active training sessions in a driving simulator would increase cognitive and on-road driving performance, and that driving simulator training would outperform cognitive training. Methods: Ninety-one healthy active drivers (62-87 years) were randomly assigned to either (1) a driving simulator training group, (2) an attention training group (vigilance and selective attention), or (3) a control group. The main outcome variables were on-road driving and cognitive performance. Seventy-seven participants (85%) completed the training and were included in the analyses. Training gains were analyzed using a multiple regression analysis with planned comparisons. Results: The driving simulator training group showed an improvement in on-road driving performance compared to the attention training group. In addition, both training groups increased cognitive performance compared to the control group. Conclusion: Driving simulator training offers the potential to enhance driving skills in older drivers. Compared to the attention training, the simulator training seems to be a more powerful program for increasing older drivers' safety on the road.

  4. Economic, energy and environmental evaluations of biomass-based fuel ethanol projects based on life cycle assessment and simulation

    International Nuclear Information System (INIS)

    Yu Suiran; Tao Jing

    2009-01-01

This paper summarizes research on Monte Carlo simulation-based Economic, Energy and Environmental (3E) Life Cycle Assessment (LCA) of three Biomass-based Fuel Ethanol (BFE) projects in China. Our research includes both a theoretical study and case studies. In the theoretical part, 3E LCA models are structured, 3E index functions are defined, and Monte Carlo simulation is introduced to address uncertainties in the BFE life cycle analysis. In the case study part, projects of Wheat-based Fuel Ethanol (WFE) in Central China, Corn-based Fuel Ethanol (CFE) in Northeast China, and Cassava-based Fuel Ethanol (KFE) in Southwest China are evaluated from the aspects of economic viability and investment risks, energy efficiency, and airborne emissions. The life cycle economy assessment shows that the KFE project in Guangxi is viable, while the CFE and WFE projects are not without government subsidies. The energy efficiency assessment shows that the WFE, CFE and KFE projects all have positive Net Energy Values. Emission results show that corn-based E10 (a blend of 90% gasoline and 10% ethanol by volume), wheat-based E10 and cassava-based E10 have lower CO2 and VOC life cycle emissions than conventional gasoline, but wheat-based E10 and cassava-based E10 can generate more emissions of CO, CH4, N2O, NOx, SO2 and PM10, and corn-based E10 has more emissions of CH4, N2O, NOx, SO2 and PM10.

  5. moviEEG: An animation toolbox for visualization of intracranial electroencephalography synchronization dynamics.

    Science.gov (United States)

    Wong, Simeon M; Ibrahim, George M; Ochi, Ayako; Otsubo, Hiroshi; Rutka, James T; Snead, O Carter; Doesburg, Sam M

    2016-06-01

We introduce and describe the functions of moviEEG (Multiple Overlay Visualizations for Intracranial ElectroEncephaloGraphy), a novel MATLAB-based toolbox for spatiotemporal mapping of network synchronization dynamics in intracranial electroencephalography (iEEG) data. The toolbox integrates visualizations of inter-electrode phase-locking relationships in peri-ictal epileptogenic networks with signal spectral properties and graph-theoretical network measures, overlaid upon operating-room images of the electrode grid. Functional connectivity between every electrode pair is evaluated over a sliding window and indexed by phase synchrony. Two case studies are presented to provide preliminary evidence for the application of the toolbox to guide network-based mapping of epileptogenic cortex and to distinguish these regions from eloquent brain networks. In both cases, epileptogenic cortex was visually distinct. We thus introduce moviEEG, a novel toolbox for animation of oscillatory network dynamics in iEEG data, with two case studies showing preliminary evidence for its utility in delineating the epileptogenic zone. Despite evidence that network synchronization is altered in epileptogenic brain regions, network-based techniques have yet to be incorporated into clinical pre-surgical mapping; moviEEG provides a set of functions to enable easy visualization with network-based techniques. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
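The phase-locking between two electrodes that such toolboxes map is commonly computed as the phase-locking value PLV = |⟨exp(iΔφ(t))⟩|. A minimal sketch with synthetic signals follows; this is the generic measure, not moviEEG's code, and real pipelines would first band-pass filter into the rhythm of interest.

```python
# Sketch: phase-locking value (PLV) between two signals via the Hilbert
# transform. Generic connectivity measure on synthetic data; not moviEEG code.
import numpy as np
from scipy.signal import hilbert

fs, f = 1000, 10.0                       # sampling rate [Hz], shared rhythm [Hz]
t = np.arange(0, 5, 1 / fs)
x = np.sin(2 * np.pi * f * t) + 0.5 * np.random.randn(t.size)
y = np.sin(2 * np.pi * f * t + 0.8) + 0.5 * np.random.randn(t.size)  # lagged copy

# Instantaneous phases; a band-pass filter around f would precede this in practice.
phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"PLV = {plv:.2f}")                # near 1: locked, near 0: unrelated
```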

  6. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

Reference electrode standardization technique (REST) has been increasingly acknowledged and applied in recent years as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERP) community around the world. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version that is more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.
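For contrast, the average reference that the authors compare REST against is a one-line operation; a tiny sketch on synthetic data is shown below. REST itself additionally requires a lead-field matrix derived from a head model, which is beyond this generic illustration and is omitted.

```python
# Sketch: re-reference a (channels x samples) EEG array to the average
# reference, the baseline REST is compared against in the example above.
# REST proper also needs a head-model lead field, not shown here.
import numpy as np

eeg = np.random.randn(32, 1000)      # synthetic recording, original reference
eeg_avg = eeg - eeg.mean(axis=0)     # subtract the instantaneous channel mean

print("max channel-mean after re-referencing:",
      np.abs(eeg_avg.mean(axis=0)).max())   # ~0 by construction
```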

  7. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    Science.gov (United States)

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data streams generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphics processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing, such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts, depending on the algorithm. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing have focused only on a few rudimentary algorithms, are not well optimized, and often do not provide a user-friendly programming interface that fits into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedups in processing signals from large-scale recordings of up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
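The spike-detection stage that such a toolbox parallelizes can be illustrated on the CPU with the common median-based threshold rule, σ ≈ median(|x|)/0.6745. This is a textbook method on synthetic data, not NPE's EC-PC detector or its GPU kernels.

```python
# Sketch: threshold spike detection with a robust noise estimate.
# Textbook method on synthetic data; not NPE's EC-PC detector or GPU code.
import numpy as np

rng = np.random.default_rng(2)
fs = 30000
x = rng.normal(0, 1, fs)                 # 1 s of noise
spike_times = rng.choice(fs - 30, 40, replace=False)
for s in spike_times:                    # inject crude negative-going spikes
    x[s:s + 10] -= np.hanning(10) * 8

sigma = np.median(np.abs(x)) / 0.6745    # robust noise std (Quiroga-style)
thresh = -4.0 * sigma                    # negative-going threshold
crossings = np.flatnonzero((x[1:] < thresh) & (x[:-1] >= thresh))

# Enforce a 1 ms refractory gap so each spike is counted at most once.
detected = [c for i, c in enumerate(crossings)
            if i == 0 or c - crossings[i - 1] > fs // 1000]
print(f"injected {spike_times.size} spikes, detected {len(detected)}")
```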

  8. Simulating The Impact Of The Material Flow In The Jordanian Construction Supply Chain And Its Impact On Project Performance

    Directory of Open Access Journals (Sweden)

    Dr. Ghaith Al-Werikat

    2017-03-01

    Full Text Available With the new developments and challenges within the construction industry, improving the construction supply chain is becoming a major concern to both governments and industries. Improving the construction supply chain helps improve the quality of construction projects, reducing cost, waste, delays and other disruptions. This paper discusses the analysis of material flow in the construction supply chain. The methodology consisted of preliminary investigations, a survey, and simulation development to analyse the extent of the impact that material flow has on construction projects in Jordan. Both the main survey and the investigations revealed that material flow delays are caused mainly by three types of disruption: late delivery, wrong specification, and material damaged on site. The highest impact regarding late deliveries was scaffolding, with a 16% probability of occurrence and a 2-day delay in the activity's duration. Concrete ranked highest regarding wrong specification, with a 19% probability of occurrence and an 8-day delay in the activity's duration. Regarding materials damaged on site, bricks ranked highest, with a 9% probability of occurrence and a 3-day delay in the duration. The simulation results exhibited a delay of 50% in the project's duration, with a 9.2% probability of a delay occurring.
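
    The reported probabilities and delay magnitudes invite a simple Monte Carlo illustration. The sketch below is a toy model built only from the three figures quoted above; it is not a reconstruction of the paper's simulation, whose structure the abstract does not specify.

        import numpy as np

        rng = np.random.default_rng(42)

        # (probability of occurrence, delay in days) from the survey results
        disruptions = {
            "scaffolding late delivery": (0.16, 2),
            "concrete wrong specification": (0.19, 8),
            "bricks damaged on site": (0.09, 3),
        }

        def simulate_project(n_runs=100_000):
            """Toy Monte Carlo of additive material-flow delays on one activity chain."""
            total = np.zeros(n_runs)
            for p, days in disruptions.values():
                total += days * (rng.random(n_runs) < p)
            return total

        delays = simulate_project()
        print(f"mean delay {delays.mean():.2f} days; "
              f"P(any delay) = {(delays > 0).mean():.1%}")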

  9. Travel demand management : a toolbox of strategies to reduce single-occupant vehicle trips and increase alternate mode usage in Arizona.

    Science.gov (United States)

    2012-02-01

    The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The toolbox includes supporting research...

  10. Polyphilic Interactions as Structural Driving Force Investigated by Molecular Dynamics Simulation (Project 7)

    Directory of Open Access Journals (Sweden)

    Christopher Peschel

    2017-09-01

    Full Text Available We investigated the effect of fluorinated molecules on dipalmitoylphosphatidylcholine (DPPC) bilayers by force-field molecular dynamics simulations. In the first step, we developed all-atom force-field parameters for additive molecules in membranes to enable an accurate description of those systems. On the basis of this force field, we performed extensive simulations of various bilayer systems containing different additives. The additive molecules were chosen to be of different size and shape, and they included small molecules such as perfluorinated alcohols, but also more complex molecules. From these simulations, we investigated the structural and dynamic effects of the additives on the membrane properties, as well as the behavior of the additive molecules themselves. Our results are in good agreement with other theoretical and experimental studies, and they contribute to a microscopic understanding of interactions, which might be used to specifically tune membrane properties by additives in the future.

  11. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    Science.gov (United States)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  12. Toolboxes for cyanobacteria: Recent advances and future direction.

    Science.gov (United States)

    Sun, Tao; Li, Shubin; Song, Xinyu; Diao, Jinjin; Chen, Lei; Zhang, Weiwen

    2018-05-03

    Photosynthetic cyanobacteria are important primary producers and model organisms for studying photosynthesis and element cycling on Earth. Due to their ability to absorb sunlight and utilize carbon dioxide, cyanobacteria have also been proposed as renewable chassis for carbon-neutral "microbial cell factories". Recent progress in cyanobacterial synthetic biology has led to the successful production of more than two dozen fuels and fine chemicals directly from CO2, demonstrating their potential for scale-up application in the future. However, compared with popular heterotrophic chassis like Escherichia coli and Saccharomyces cerevisiae, where abundant genetic tools are available for manipulations at levels from single genes and pathways to whole genomes, only limited genetic tools are accessible for cyanobacteria. Consequently, this significant technical hurdle restricts both basic biological research and the further development and application of these renewable systems. Though still lagging behind the heterotrophic chassis, the vital roles of genetic tools in the tuning of gene expression, carbon flux redirection, and genome-wide manipulations have been increasingly recognized in cyanobacteria. In recent years, significant progress has been made in developing and introducing new and efficient genetic tools for cyanobacteria, including promoters, riboswitches, ribosome binding site engineering, clustered regularly interspaced short palindromic repeats/CRISPR-associated nuclease (CRISPR/Cas) systems, small RNA regulatory tools and genome-scale modeling strategies. In this review, we critically summarize recent advances in the development and applications, as well as the technical limitations and future directions, of genetic tools in cyanobacteria. In addition, toolboxes feasible for use in large-scale cultivation are also briefly discussed. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Hemostats, sealants, and adhesives: components of the surgical toolbox.

    Science.gov (United States)

    Spotnitz, William D; Burks, Sandra

    2008-07-01

    The surgical toolbox is expanding, and newer products are being developed to improve results. Reducing blood loss so that bloodless surgery can be performed may help minimize morbidity and length of stay. As patients, hospital administrators, and government regulators desire less invasive procedures, the surgical technical challenge is increasing. More operations are being performed through minimally invasive incisions with laparoscopic, endoscopic, and robotic approaches. In this setting, tools that can reduce bleeding by causing blood to clot, sealing vessels, or gluing tissues are gaining an increasing importance. Thus, hemostats, sealants, and adhesives are becoming a more important element of surgical practice. This review is designed to facilitate the reader's basic knowledge of these tools so that informed choices are made for controlling bleeding in specific clinical situations. Such information is useful for all members of the operative team. The team includes surgeons, anesthesiologists, residents, and nurses as well as hematologists and other medical specialists who may be involved in the perioperative care of surgical patients. An understanding of these therapeutic options may also be helpful to the transfusion service. In some cases, these materials may be stored in the blood bank, and their appropriate use may reduce demand for other transfusion components. The product classification used in this review includes hemostats as represented by product categories that include mechanical agents, active agents, flowables, and fibrin sealants; sealants as represented by fibrin sealants and polyethylene glycol hydrogels; and adhesives as represented by cyanoacrylates and albumin cross-linked with glutaraldehyde. Only those agents approved by the Food and Drug Administration (FDA) and presently available (February 2008) for sale in the United States are discussed in this review.

  14. Assessment of simulated and projected climate change in Pakistan using IPCC AR4-based AOGCMs

    Science.gov (United States)

    Saeed, F.; Athar, H.

    2017-11-01

    A detailed spatio-temporal assessment of two basic climatic parameters (temperature and precipitation) is carried out, for the first time, over the data-sparse and climatically vulnerable region of Pakistan (20°-37° N and 60°-78° E), using 22 Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4)-based atmosphere-ocean general circulation models (AOGCMs). The assessment covers the baseline period (1975-1999) as well as three projected periods during the twenty-first century, centered on 2025-2049, 2050-2074, and 2075-2099, on both seasonal and annual bases, under three Special Report on Emissions Scenarios (SRES) pathways: A2, A1B, and B1. An ensemble-based approach indicates that during the winter season (December to March), 66% of the models display a robust projected increase of winter precipitation by about 10% relative to the baseline period, irrespective of emission scenario and projection period, in the upper northern subregion of Pakistan (latitude > 35° N). The projected robust changes in temperature by the end of the twenty-first century are in the range of 3 to 4 °C during the winter season and on an annual basis, in the central and western regions of Punjab province, especially under the A2 and A1B emission scenarios. In particular, the IPCC AR4 models project a progressive increase in temperature throughout Pakistan; for precipitation, in contrast, the spatial distribution of projected change is less uniform and less robust in the sign of change across the projected periods. In general, changes in both precipitation and temperature are larger in the summer season (JAS) than in the winter season in the coming decades, relative to the baseline period. This may require comprehensive long-term strategic policies to adapt to and mitigate climate change in Pakistan, beyond what is currently envisaged.
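
    The robustness criterion used above, the fraction of ensemble members agreeing on a projected change, is straightforward to compute. The NumPy fragment below is schematic; the array layout and any agreement threshold are assumptions for illustration, not details from the study.

        import numpy as np

        def ensemble_agreement(change):
            """Fraction of models agreeing with the ensemble-mean sign of change.

            change: assumed array of shape (n_models, n_lat, n_lon) holding each
            AOGCM's projected change relative to the baseline period.
            """
            mean_sign = np.sign(change.mean(axis=0))
            agree = np.sign(change) == mean_sign
            # e.g. flag grid cells where at least 66% of models agree
            return agree.mean(axis=0)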

  15. The E-MOSAICS project: simulating the formation and co-evolution of galaxies and their star cluster populations

    Science.gov (United States)

    Pfeffer, Joel; Kruijssen, J. M. Diederik; Crain, Robert A.; Bastian, Nate

    2018-04-01

    We introduce the MOdelling Star cluster population Assembly In Cosmological Simulations within EAGLE (E-MOSAICS) project. E-MOSAICS incorporates models describing the formation, evolution, and disruption of star clusters into the EAGLE galaxy formation simulations, enabling the examination of the co-evolution of star clusters and their host galaxies in a fully cosmological context. A fraction of the star formation rate of dense gas is assumed to yield a cluster population; this fraction and the population's initial properties are governed by the physical properties of the natal gas. The subsequent evolution and disruption of the entire cluster population are followed accounting for two-body relaxation, stellar evolution, and gravitational shocks induced by the local tidal field. This introductory paper presents a detailed description of the model and initial results from a suite of 10 simulations of ~L⋆ galaxies with disc-like morphologies at z = 0. The simulations broadly reproduce key observed characteristics of young star clusters and globular clusters (GCs), without invoking separate formation mechanisms for each population. The simulated GCs are the surviving population of massive clusters formed at early epochs (z ≳ 1-2), when the characteristic pressures and surface densities of star-forming gas were significantly higher than observed in local galaxies. We examine the influence of the star formation and assembly histories of galaxies on their cluster populations, finding that (at similar present-day mass) earlier-forming galaxies foster a more massive and disruption-resilient cluster population, while galaxies with late mergers are capable of forming massive clusters even at late cosmic epochs. We find that the phenomenological treatment of interstellar gas in EAGLE precludes the accurate modelling of cluster disruption in low-density environments, but infer that simulations incorporating an explicitly modelled cold interstellar gas phase will overcome this limitation.

  16. Discrete event simulation of NASA's Remote Exploration and Experimentation Project (REE)

    Science.gov (United States)

    Dunphy, J.; Rogstad, S.

    2001-01-01

    The Remote Exploration and Experimentation Project (REE) is a new initiative at JPL to place a supercomputer on board a spacecraft, allowing large amounts of data reduction and compression to be performed before science results are returned to Earth.

  17. The 1993 timber assessment market model: structure, projections, and policy simulations.

    Science.gov (United States)

    Darius M. Adams; Richard W. Haynes

    1996-01-01

    The 1993 timber assessment market model (TAMM) is a spatial model of the solidwood and timber inventory elements of the U.S. forest products sector. The TAMM model provides annual projections of volumes and prices in the solidwood products and sawtimber stumpage markets and estimates of total timber harvest and inventory by geographic region for periods of up to 50 years.

  18. Greenland ice sheet surface mass balance: evaluating simulations and making projections with regional climate models

    NARCIS (Netherlands)

    Rae, J.G.L.; Aðalgeirsdóttir, G.; Edwards, T.L.; Fettweis, X.; Gregory, J.M.; Hewitt, H.T.; Lowe, J.A.; Lucas-Picher, P.; Mottram, R.H.; Payne, A.J.; Ridley, J.K.; Shannon, S.R.; van de Berg, W.J.; van de Wal, R.S.W.; van den Broeke, M.R.

    2012-01-01

    Four high-resolution regional climate models (RCMs) have been set up for the area of Greenland, with the aim of providing future projections of Greenland ice sheet surface mass balance (SMB), and its contribution to sea level rise, with greater accuracy than is possible from coarser-resolution global models.

  19. A Tire Gasification Senior Design Project That Integrates Laboratory Experiments and Computer Simulation

    Science.gov (United States)

    Weiss, Brian; Castaldi, Marco J.

    2006-01-01

    A reactor to convert waste rubber tires to useful products such as CO and H2 was investigated in a university undergraduate design project. The student worked individually, with mentorship from a faculty professor who aided the student with professional critique. The student was able to research the background of the field and conceive of a novel…

  20. Modeling and Simulation for Enterprise Decision-Making: Successful Projects and Approaches

    DEFF Research Database (Denmark)

    Ramadan, Noha; Ajami, Racha; Mohamed, Nader

    2015-01-01

    Decision-making in enterprises involves different possibilities for profit and risk. Due to the complexity of decision-making processes, modeling and simulation tools are being used to facilitate them and to minimize the risk of making wrong decisions in the various business process phases. In this p...

  1. Multimodel simulations of carbon monoxide: Comparison with observations and projected near-future changes

    NARCIS (Netherlands)

    Shindell, D.T.; Faluvegi, G.; Stevenson, D.S.; Krol, M.C.; Emmons, L.K.; Lamarque, J.F.; Petron, G.; Dentener, F.J.; Ellingsen, K.; Schultz, M.G.; Wild, O.; Amann, M.; Atherton, C.S.; Bergmann, D.J.; Bey, I.; Butler, T.; Cofala, J.; Collins, W.J.; Derwent, R.G.; Doherty, R.M.; Drevet, J.; Eskes, H.J.; Fiore, A.M.; Gauss, M.; Hauglustaine, D.A.; Horowitz, L.W.; Isaksen, I.S.A.; Lawrence, M.G.; Montanaro, V.; Muller, J.F.; Pitari, G.; Prather, M.J.; Pyle, J.A.; Rast, S.; Rodriguez, J.M.; Sanderson, M.G.; Savage, N.H.; Strahan, S.E.; Sudo, K.; Szopa, S.; Unger, N.; Noije, van T.P.C.; Zeng, G.

    2006-01-01

    We analyze present-day and future carbon monoxide (CO) simulations in 26 state-of-the-art atmospheric chemistry models run to study future air quality and climate change. In comparison with near-global satellite observations from the MOPITT instrument and local surface measurements, the models show

  2. Data Release of UV to Submillimeter Broadband Fluxes for Simulated Galaxies from the EAGLE Project

    Science.gov (United States)

    Camps, Peter; Trčka, Ana; Trayford, James; Baes, Maarten; Theuns, Tom; Crain, Robert A.; McAlpine, Stuart; Schaller, Matthieu; Schaye, Joop

    2018-02-01

    We present dust-attenuated and dust emission fluxes for sufficiently resolved galaxies in the EAGLE suite of cosmological hydrodynamical simulations, calculated with the SKIRT radiative transfer code. The post-processing procedure includes specific components for star formation regions, stellar sources, and diffuse dust and takes into account stochastic heating of dust grains to obtain realistic broadband fluxes in the wavelength range from ultraviolet to submillimeter. The mock survey includes nearly half a million simulated galaxies with stellar masses above 10^8.5 M⊙ across six EAGLE models. About two-thirds of these galaxies, residing in 23 redshift bins up to z = 6, have a sufficiently resolved metallic gas distribution to derive meaningful dust attenuation and emission, with the important caveat that the same dust properties were used at all redshifts. These newly released data complement the already publicly available information about the EAGLE galaxies, which includes intrinsic properties derived by aggregating the properties of the smoothed particles representing matter in the simulation. We further provide an open-source framework of Python procedures for post-processing simulated galaxies with the radiative transfer code SKIRT. The framework allows any third party to calculate synthetic images, spectral energy distributions, and broadband fluxes for EAGLE galaxies, taking into account the effects of dust attenuation and emission.
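
    For orientation, a broadband flux of this kind is essentially a transmission-weighted average of the spectral energy distribution over a filter curve. The sketch below is a generic illustration, not the released SKIRT/EAGLE post-processing framework; the photon-counting weighting by wavelength is an assumed convention, and the array names are placeholders.

        import numpy as np

        def broadband_flux(wav, f_lambda, filt_wav, filt_trans):
            """Filter-convolved mean flux density <F_lambda> for one band.

            wav, f_lambda: SED wavelength grid and flux density (e.g. from a
            radiative transfer run). filt_wav, filt_trans: the band's
            transmission curve. Uses a photon-counting (wavelength-weighted)
            average, a common convention for broadband photometry.
            """
            t = np.interp(wav, filt_wav, filt_trans, left=0.0, right=0.0)
            return np.trapz(t * f_lambda * wav, wav) / np.trapz(t * wav, wav)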

  3. A Simulation-Based LED Design Project in Photonics Instruction Based on Industry-University Collaboration

    Science.gov (United States)

    Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.

    2011-01-01

    In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…

  4. Multimodel simulations of carbon monoxide: comparison with observations and projected near-future changes

    NARCIS (Netherlands)

    Shindell, D.T.; Krol, M.C.

    2006-01-01

    We analyze present-day and future carbon monoxide (CO) simulations in 26 state-of-the-art atmospheric chemistry models run to study future air quality and climate change. In comparison with near-global satellite observations from the MOPITT instrument and local surface measurements, the models show

  5. Using simulation pedagogy to enhance teamwork and communication in the care of older adults: the ELDER project.

    Science.gov (United States)

    Mager, Diana R; Lange, Jean W; Greiner, Philip A; Saracino, Katherine H

    2012-08-01

    The Expanded Learning and Dedication to Elders in the Region (ELDER) project addressed the needs of under-served older adults by educating health care providers in home health and long-term care facilities. Four agencies in a health professional shortage/medically underserved area participated. Focus groups were held to determine agency-specific educational needs. Curricula from the John A. Hartford Foundation were adapted to design unique curricula for each agency and level of personnel during the first 2 years. The focus of this report is the case-based simulation learning approach used in year 3 to validate application of knowledge and facilitate teamwork and interprofessional communication. Three simulation sessions on varying topics were conducted at each site. Postsimulation surveys and qualitative interviews with hired evaluators showed that participants found simulations helpful to their practice. Tailored on-site education incorporating mid-fidelity simulation was an effective model for translating gerontological knowledge into practice and encouraging communication and teamwork in these settings. Copyright 2012, SLACK Incorporated.

  6. Designed Green Toolbox as built environment educating method-analytical comparison between two groups of students with different cultural background

    NARCIS (Netherlands)

    El Fiky, U.; Hamdy, I.; Fikry, M.

    2006-01-01

    This paper is concerned with evaluating and testing the application process of green architecture design strategies using a tool-box as a built-environment education method and a pre-design reminder. Understanding the suggested green design strategies, testing the tool-box effectiveness,

  7. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike-train and local-field-potential analysis, and behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided
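
    As an illustration of the trial-sorted analyses such toolboxes automate, the sketch below computes a peri-stimulus time histogram with plain NumPy. It is a generic example; it does not use NeoAnalysis's actual API.

        import numpy as np

        def psth(spike_times, event_times, window=(-0.5, 1.0), bin_s=0.02):
            """Peri-stimulus time histogram in spikes/s, averaged over trials.

            spike_times: sorted spike timestamps (s); event_times: one event
            marker per trial (s); window: analysis interval around each event.
            """
            edges = np.arange(window[0], window[1] + bin_s, bin_s)
            counts = np.zeros(len(edges) - 1)
            for t0 in event_times:
                rel = spike_times - t0  # align spikes to this trial's event
                counts += np.histogram(rel, bins=edges)[0]
            return edges[:-1], counts / (len(event_times) * bin_s)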

  8. ObsPy: A Python toolbox for seismology - Sustainability, New Features, and Applications

    Science.gov (United States)

    Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; MacCarthy, J.

    2016-12-01

    ObsPy (https://www.obspy.org) is a community-driven, open-source project dedicated to offering a bridge for seismology into the scientific Python ecosystem. Amongst other things, it provides: read and write support, through a unified interface, for essentially every commonly used data format in seismology, covering waveform data as well as station and event meta information; a signal processing toolbox tuned to the specific needs of seismologists; integrated access to the largest data centers, web services, and databases; and wrappers around third-party codes like libmseed and evalresp. Using ObsPy enables users to take advantage of the vast scientific ecosystem that has developed around Python. In contrast to many other programming languages and tools, Python is simple enough to enable the exploratory and interactive coding style desired by many scientists; at the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for seismology, where research code must often be translated to stable, production-ready environments, especially in the age of big data. ObsPy has seen constant development for more than six years and enjoys a large rate of adoption in the seismological community, with thousands of users. Successful applications include time-dependent and rotational seismology, big-data processing, event relocations, and synthetic studies of attenuation kernels and full-waveform inversions, to name a few examples. Additionally, it has sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. We will present a short overview of the capabilities of ObsPy, point out several representative use cases and more specialized software built around ObsPy, and discuss new and upcoming features as well as the sustainability of open-source scientific software.
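
    As a flavor of the unified interface described above, the short script below fetches an hour of waveform data from the IRIS FDSN web service and band-pass filters it. The calls follow ObsPy's documented interface, but the station codes, times, and corner frequencies are merely illustrative.

        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        # Fetch one hour of broadband vertical-component data from IRIS
        client = Client("IRIS")
        t0 = UTCDateTime("2011-03-11T05:46:23")  # illustrative start time
        st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)

        # The same Stream object feeds the signal processing toolbox
        st.detrend("demean")
        st.filter("bandpass", freqmin=0.05, freqmax=2.0)
        print(st)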

  9. iELVis: An open source MATLAB toolbox for localizing and visualizing human intracranial electrode data.

    Science.gov (United States)

    Groppe, David M; Bickel, Stephan; Dykstra, Andrew R; Wang, Xiuyuan; Mégevand, Pierre; Mercier, Manuel R; Lado, Fred A; Mehta, Ashesh D; Honey, Christopher J

    2017-04-01

    Intracranial electrical recordings (iEEG) and brain stimulation (iEBS) are invaluable human neuroscience methodologies. However, the value of such data is often unrealized, as many laboratories lack tools for localizing electrodes relative to anatomy. To remedy this, we have developed a MATLAB toolbox for intracranial electrode localization and visualization, iELVis. NEW METHOD: iELVis uses existing tools (BioImage Suite, FSL, and FreeSurfer) for preimplant magnetic resonance imaging (MRI) segmentation, neuroimaging coregistration, and manual identification of electrodes in postimplant neuroimaging. Subsequently, iELVis implements methods for correcting electrode locations for postimplant brain shift with millimeter-scale accuracy and provides interactive visualization on 3D surfaces or in 2D slices with optional functional neuroimaging overlays. iELVis also localizes electrodes relative to FreeSurfer-based atlases and can combine data across subjects via the FreeSurfer average brain. It takes 30-60 min of user time and 12-24 h of computer time to localize and visualize electrodes from one brain. We demonstrate iELVis's functionality by showing that three methods for mapping primary hand somatosensory cortex (iEEG, iEBS, and functional MRI) provide highly concordant results. COMPARISON WITH EXISTING METHODS: iELVis is the first public software for electrode localization that corrects for brain shift, maps electrodes to an average brain, and supports neuroimaging overlays. Moreover, its interactive visualizations are powerful and its tutorial material is extensive. iELVis promises to speed the progress and enhance the robustness of intracranial electrode research. The software and extensive tutorial materials are freely available as part of the EpiSurg software project: https://github.com/episurg/episurg. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. ObsPy: A Python Toolbox for Seismology - Recent Developments and Applications

    Science.gov (United States)

    Megies, T.; Krischer, L.; Barsch, R.; Sales de Andrade, E.; Beyreuther, M.

    2014-12-01

    ObsPy (http://www.obspy.org) is a community-driven, open-source project dedicated to building a bridge for seismology into the scientific Python ecosystem. It offers a) read and write support for essentially all commonly used waveform, station, and event metadata file formats with a unified interface, b) a comprehensive signal processing toolbox tuned to the needs of seismologists, c) integrated access to all large data centers, web services and databases, and d) convenient wrappers to legacy codes like libtau and evalresp. Python, currently the most popular language for teaching introductory computer science courses at top-ranked U.S. departments, is a full-blown programming language with the flexibility of an interactive scripting language. Its extensive standard library and large variety of freely available, high-quality scientific modules cover most needs in developing scientific processing workflows. Together with packages like NumPy, SciPy, Matplotlib, IPython, Pandas, lxml, and PyQt, ObsPy enables the construction of complete workflows in Python. These vary from reading locally stored data or requesting data from one or more data centers, through signal analysis and data processing, to visualizations in GUI and web applications, output of modified or derived data, and the creation of publication-quality figures. ObsPy enjoys a large worldwide rate of adoption in the community. Applications successfully using it include time-dependent and rotational seismology, big-data processing, event relocations, and synthetic studies of attenuation kernels and full-waveform inversions, to name a few examples. All functionality is extensively documented, and the ObsPy tutorial and gallery give a good impression of the wide range of possible use cases. We will present the basic features of ObsPy, new developments and applications, and a roadmap for the near future, and discuss the sustainability of our open-source development model.

  11. PEANO, a toolbox for real-time process signal validation and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals such as the readings of process monitoring sensors, computes their expected values, and alerts if measured values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: artificial neural networks and fuzzy logic models can be combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process has to be performed. The possibilistic approach (rather than a probabilistic one) allows a "don't know" classification that results in fast detection of unforeseen plant conditions or outliers. Specialised artificial neural networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability are increased compared to the case of a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)
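
    The clustering step described here can be pictured with the textbook fuzzy c-means membership update: samples far from all cluster centers receive diffuse memberships, which a possibilistic variant can turn into a "don't know" flag. The sketch below illustrates that standard formula only; it is not the PEANO implementation.

        import numpy as np

        def fcm_memberships(X, centers, m=2.0):
            """Fuzzy c-means membership of each sample in each cluster.

            Implements u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)). A possibilistic
            variant (as used in PEANO) relaxes the row-sum-to-one constraint so
            that unforeseen operating states can be flagged as "don't know".
            """
            # Pairwise distances: (n_samples, n_clusters)
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            inv = d ** (-2.0 / (m - 1.0))
            return inv / inv.sum(axis=1, keepdims=True)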

  12. PEANO, a toolbox for real-time process signal validation and estimation

    International Nuclear Information System (INIS)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals such as the readings of process monitoring sensors, computes their expected values, and alerts if measured values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: artificial neural networks and fuzzy logic models can be combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process has to be performed. The possibilistic approach (rather than a probabilistic one) allows a "don't know" classification that results in fast detection of unforeseen plant conditions or outliers. Specialised artificial neural networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability are increased compared to the case of a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)

  13. Process evaluation of a Toolbox-training program for construction foremen in Denmark

    DEFF Research Database (Denmark)

    Jeschke, Katharina Christiane; Kines, Pete; Rasmussen, Liselotte

    2017-01-01

    foremen's knowledge and communication skills in daily planning of work tasks and their related OSH risks on construction sites. The program builds on the popular 'toolbox meeting' concept; however, there is very little research evaluating these types of meetings. This article describes the development...... and revised until the final version after the fifth group. The evaluation utilized an action research strategy with a mixed-methods approach triangulating questionnaire, interview, and observation data. Process evaluation results showed that the eight Toolbox-training topics were relevant and useful...

  14. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    Science.gov (United States)

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data; moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. The latest features and future directions of the software development are presented in the final section.

  15. Imbalanced-learn: A Python Toolbox to Tackle the Curse of Imbalanced Datasets in Machine Learning

    OpenAIRE

    Lemaitre, Guillaume; Nogueira, Fernando; Aridas, Christos

    2017-01-01

    imbalanced-learn is an open-source Python toolbox aiming at providing a wide range of methods to cope with the problem of imbalanced datasets frequently encountered in machine learning and pattern recognition. The implemented state-of-the-art methods can be categorized into four groups: (i) under-sampling, (ii) over-sampling, (iii) combination of over- and under-sampling, and (iv) ensemble learning methods. The proposed toolbox depends only on numpy, scipy, and scikit-learn...
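
    A minimal usage example of the over-sampling group, assuming a current release of the package in which resamplers follow the scikit-learn fit/resample idiom:

        from collections import Counter
        from sklearn.datasets import make_classification
        from imblearn.over_sampling import SMOTE

        # A deliberately imbalanced two-class toy problem
        X, y = make_classification(n_samples=1000, weights=[0.95, 0.05],
                                   random_state=0)
        print("before:", Counter(y))

        # SMOTE synthesizes minority-class samples by interpolation
        X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
        print("after: ", Counter(y_res))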

  16. The simulation of man-machine interaction in NPPs: the system response analyser project

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    1990-01-01

    In this paper, the ongoing research at the Joint Research Centre-Ispra on the simulation of man-machine interaction is reviewed with reference to past experience of system modelling and to the advances of the technological world, which require the coalescence of mixed disciplines covering the fields of engineering, psychology and sociology. In particular, the complexity of man-machine systems with respect to safety analysis is depicted. The developments and issues in modelling humans and machines are discussed: the possibility of combining them through the System Response Analyser methodology is presented as a balanced approach to be applied when the objective is the study of the safety of systems during abnormal sequences. The three analytical tools which constitute the body of system response analysis, namely a quasi-classical simulation of the actual plant, a cognitive model of the operator activities, and a driver model, are described. (author)

  17. FANS Simulation of Propeller Wash at Navy Harbors (ESTEP Project ER-201031)

    Science.gov (United States)

    2016-08-01

    ...grid blocks such as those encountered in wave-current-body interactions and vortex-induced vibrations. The FANS code consists of the following main...Reynolds stress (second-moment) and two-layer k-ε turbulence models for turbulent boundary layer and wake flows; (4) large eddy simulation for

  18. The Dust Management Project: Characterizing Lunar Environments and Dust, Developing Regolith Mitigation Technology and Simulants

    Science.gov (United States)

    Hyatt, Mark J.; Straka, Sharon A.

    2010-01-01

    A return to the Moon to extend human presence, pursue scientific activities, use the Moon to prepare for future human missions to Mars, and expand Earth's economic sphere will require investment in developing new technologies and capabilities to achieve affordable and sustainable human exploration. The operational experience gained and lessons learned during the Apollo missions show that conducting long-term operations in the lunar environment will be a particular challenge, given the difficulties presented by the unique physical properties and other characteristics of lunar regolith, including dust. The Apollo missions and other lunar explorations have identified significant lunar dust-related problems that will challenge future mission success. Composed of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems and human explorers. The Dust Management Project (DMP) is tasked with the evaluation of lunar dust effects, assessment of the resulting risks, and development of mitigation and management strategies and technologies related to Exploration Systems architectures. To this end, the DMP supports the overall goal of the Exploration Technology Development Program (ETDP) of addressing the relevant high-priority technology needs of multiple elements within the Constellation Program (CxP) and sister ETDP projects. Project scope, plans, and accomplishments will be presented.

  19. Simulation of Plasma Jet Merger and Liner Formation within the PLX- α Project

    Science.gov (United States)

    Samulyak, Roman; Chen, Hsin-Chiang; Shih, Wen; Hsu, Scott

    2015-11-01

    Detailed numerical studies of the propagation and merger of high-Mach-number argon plasma jets and the formation of plasma liners have been performed using the newly developed method of Lagrangian particles (LP). The LP method significantly improves the accuracy and mathematical rigor of common particle-based numerical methods such as smoothed particle hydrodynamics while preserving their main advantages compared to grid-based methods. A brief overview of the LP method will be presented. The Lagrangian particle code implements the main relevant physics models, such as an equation of state for argon undergoing atomic physics transformations, radiation losses in the optically thin limit, and heat conduction. Simulations of the merger of two plasma jets are compared with experimental data from past PLX experiments. Simulations quantify the effect of oblique shock waves, ionization, and radiation processes on the jet merger process. Results of preliminary simulations of future PLX-α experiments involving the ~π/2-solid-angle plasma-liner configuration with 9 guns will also be presented. Partially supported by ARPA-E's ALPHA program.

  20. Community petascale project for accelerator science and simulation: Advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  1. Design and Simulation of an Air Conditioning Project in a Hospital Based on Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    Ding X. R.

    2017-06-01

    Full Text Available This study aims to design a novel air cleaning facility that conforms to the current situation in China and can satisfy the demand for air purification under conditions of poor air quality, and to discuss the development of a prototype product. Air conditions in the operating room of a hospital were measured as the research subject of this study. First, a suitable turbulence model and boundary conditions were selected, and computational fluid dynamics (CFD) software was used to simulate indoor air distribution. Analysis and comparison of the simulation results suggested that increasing the area of the air supply outlets and the number of return air inlets would not only enlarge the unidirectional flow region within the main flow region, but also avoid indoor vortices and reduce the turbulence intensity in the operating area. Based on a summary of heat and humidity management methods, the system operation mode and relevant parameter technologies, as well as the characteristics of the thermal-humidity load of the operating room, were analyzed and compiled. According to the load values and indoor design parameters obtained from our calculations, the airflow distribution of the air-purifying air-conditioning system in a clean operating room was designed and checked. The research results suggested that the application of a secondary return air system in the summer could reduce energy consumption, consistent with the concept of primary humidity control. This study analyzed the feasibility and energy conservation properties of air-cleaning air-conditioning technology in operating rooms, proposed some solutions to the problem, and performed a feasible simulation, which provides a reference for practical engineering.

  2. UCSD Performance in the Edge Plasma Simulation (EPSI) Project. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Tynan, George Robert [Univ. of California, San Diego, CA (United States). Center for Energy Research

    2017-12-12

    This report summarizes the activities of UC San Diego PI G.R. Tynan and his collaborators as part of the EPSI Project, which was led by Dr. C.S. Chang of PPPL. As part of our work, we carried out several experiments on the Alcator C-Mod tokamak aimed at unraveling the "trigger", or cause, of the spontaneous transition from the low-confinement mode (L-mode) to the high-confinement mode (H-mode) that is universally observed in tokamak devices and is planned for use in ITER.

  3. Electron-cloud simulation studies for the CERN-PS in the framework of the LHC Injectors Upgrade project

    CERN Document Server

    Rioja Fuentelsaz, Sergio

    The present study aims to provide a consistent picture of the electron cloud effect in the CERN Proton Synchrotron (PS) and to investigate possible future limitations due to the requirements foreseen by the LHC Injectors Upgrade (LIU) project. It consists of a complete simulation survey of the electron cloud build-up in the different beam pipe sections of the ring, depending on several controllable beam parameters and vacuum chamber surface properties, covering present and future operation parameters. As the combined-function magnets of the accelerator constitute almost 80% of the length of the ring, the implementation in the PyECLOUD code of a new feature for simulating arbitrary external magnetic fields made this study possible. All the results of the simulations are given as a function of the vacuum chamber surface properties in order to deduce these properties, both locally and globally, when compared with experimental data. In a first step, we characterize locally the maximum possible number of ...

  4. Response of permafrost to projected climate change: Results from global offline model simulations with JSBACH

    Science.gov (United States)

    Blome, Tanja; Ekici, Altug; Beer, Christian; Hagemann, Stefan

    2014-05-01

    Permafrost or perennially frozen ground is an important part of the terrestrial cryosphere; roughly one quarter of Earth's land surface is underlain by permafrost. As it is a thermal phenomenon, its characteristics are highly dependent on climatic factors. The currently observed warming, which is projected to persist during the coming decades due to anthropogenic CO2 input, certainly has effects on the vast permafrost areas of the high northern latitudes. The quantification of these effects, however, is scientifically still an open question. This is partly due to the complexity of the system, where several feedbacks interact between land and atmosphere, sometimes counterbalancing each other. Moreover, until recently, many general circulation models (GCMs) lacked a sufficient representation of permafrost physics in their land surface schemes. In order to assess the response of permafrost to projected climate change for the 21st century, the land surface scheme of the Max Planck Institute for Meteorology, JSBACH, has recently been equipped with the physical processes important for permafrost studies and was driven globally with bias-corrected climate data spanning the period from 1850 until 2100. The applied land surface scheme JSBACH now considers the effects of freezing and thawing of soil water for both energy and water cycles, thermal properties depending on soil water and ice contents, and soil moisture movement influenced by the presence of soil ice. To address the uncertainty range arising from different greenhouse gas concentrations as well as from different climate realisations when using various climate models, combinations of two Representative Concentration Pathways (RCPs) and two GCMs were used as driving data. In order to focus only on the climatic impact on permafrost, effects due to feedbacks between climate and permafrost (namely via carbon fluxes between land and atmosphere) are excluded in the experiments.

  5. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among the simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  6. The Ohio River Valley CO2 Storage Project AEP Mountaineer Plant, West Virginia Numerical Simulation and Risk Assessment Report

    Energy Technology Data Exchange (ETDEWEB)

    Neeraj Gupta

    2008-03-31

    A series of numerical simulations of carbon dioxide (CO2) injection were conducted as part of a program to assess the potential for geologic sequestration in deep geologic reservoirs (the Rose Run and Copper Ridge formations) at the American Electric Power (AEP) Mountaineer Power Plant outside of New Haven, West Virginia. The simulations were executed using the H2O-CO2-NaCl operational mode of the Subsurface Transport Over Multiple Phases (STOMP) simulator (White and Oostrom, 2006). The objective of the Rose Run formation modeling was to predict CO2 injection rates using data from the core analysis conducted on the samples. A systematic screening procedure was applied to the Ohio River Valley CO2 storage site utilizing the Features, Elements, and Processes (FEP) database for geological storage of CO2 (Savage et al., 2004). The objective of the screening was to identify potential risk categories for the long-term geological storage of CO2 at the Mountaineer Power Plant in New Haven, West Virginia. Over 130 FEPs in seven main classes were assessed for the project based on site characterization information gathered in a geological background study, testing in a deep well drilled on the site, and general site conditions. In evaluating the database, it was apparent that many of the items were not applicable to the Mountaineer site based on its geologic framework and environmental setting. Nine FEPs were identified for further consideration for the site. These FEPs generally fell into categories related to variations in subsurface geology, well completion materials, and the behavior of CO2 in the subsurface. Results from the screening were used to provide guidance on injection system design, developing a monitoring program, performing reservoir simulations, and other risk assessment efforts. Initial work indicates that the significant FEPs may be accounted for by focusing the storage program on these potential issues. The

  7. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    Science.gov (United States)

    Keane, Robert E.; Rollins, Matthew; Zhu, Zhi-Liang

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forests Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA. The primary index used to prioritize treatment areas across the nation is Fire Regime Condition Class (FRCC), computed as the departure of current conditions from historical fire and landscape conditions. This paper describes a process that uses an extensive set of ecological models to map FRCC from a departure statistic computed over simulated time series of historical landscape composition. The mapping process uses a data-driven, biophysical approach in which georeferenced field data, biogeochemical simulation models, and spatial data libraries are integrated using spatial statistical modeling to map environmental gradients, which are then used to predict vegetation and fuel characteristics over space. These characteristics are fed into a landscape fire and succession simulation model to simulate a time series of historical landscape compositions, which are compared to the composition of current landscapes to compute departure and, from it, the FRCC values. Intermediate products from this process are then used to create ancillary vegetation, fuels, and fire regime layers that are useful in the eventual planning and implementation of fuel and restoration treatments at local scales. The complex integration of varied ecological models at different scales is described, and problems encountered during the implementation of this process in the LANDFIRE prototype project are addressed.
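
    For illustration, departure from reference conditions is commonly scored with a similarity index over class-area proportions. The sketch below assumes that formulation; the prototype's exact statistic may differ and is detailed in the project documentation.

        def frcc_departure(current, reference):
            """Departure (0-100) of current landscape composition from reference.

            current, reference: dicts mapping vegetation/fuel class -> proportion
            of landscape area (each summing to 1). Similarity is the summed
            minimum shared proportion; departure = 100 * (1 - similarity).
            """
            classes = set(current) | set(reference)
            similarity = sum(min(current.get(c, 0.0), reference.get(c, 0.0))
                             for c in classes)
            return 100.0 * (1.0 - similarity)

        # e.g. frcc_departure({"early": 0.6, "late": 0.4},
        #                     {"early": 0.2, "late": 0.8}) -> 40.0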

  8. The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses

    Science.gov (United States)

    Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano

    2017-01-01

    The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…

  9. 40 CFR 141.717 - Pre-filtration treatment toolbox components.

    Science.gov (United States)

    2010-07-01

    ... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for... paragraph. Systems using bank filtration when they begin source water monitoring under § 141.701(a) must...

  10. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction

    Science.gov (United States)

    Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers with a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We show a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.
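
    The analysis pipeline described, PCA for dimension reduction followed by an LDA-derived discriminant direction, maps onto standard library calls. The scikit-learn sketch below is schematic: the toolbox itself is MATLAB-based, and the data shapes and labels here are placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # X: (n_subjects, n_landmarks * 3) flattened lobule shape representations
        # y: disease labels (e.g. ataxia subtype) -- placeholder random data
        rng = np.random.default_rng(0)
        X, y = rng.normal(size=(40, 300)), rng.integers(0, 2, size=40)

        pca = PCA(n_components=10).fit(X)      # dimension reduction
        scores = pca.transform(X)
        lda = LinearDiscriminantAnalysis().fit(scores, y)

        # Discriminant direction mapped back to shape space; sampling points
        # along it reconstructs synthetic lobule shapes, as in the toolbox
        direction = lda.coef_ @ pca.components_   # shape (1, 300)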

  12. Par@Graph - a parallel toolbox for the construction and analysis of large complex climate networks

    NARCIS (Netherlands)

    Tantet, A.J.J.

    2015-01-01

    In this paper, we present Par@Graph, a software toolbox to reconstruct and analyze complex climate networks having a large number of nodes (up to at least 10^6) and edges (up to at least 10^12). The key innovation is an efficient set of parallel software tools designed to leverage the inherent hybrid

  13. A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus

    Science.gov (United States)

    AghaKouchak, A.; Sadegh, M.; Mallakpour, I.

    2017-12-01

    Water, food and energy systems are highly interconnected. More than seventy percent of global water resources are used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Therefore, considering the interactions in the nexus of water, food and energy is crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy and water, and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluating the dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food and energy from different perspectives, including efficiency and diversity of resource use.
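
    The country-scale bookkeeping described can be illustrated with a toy aggregation of per-crop production against water and energy coefficients; every number below is made up for the example and is not a toolbox output.

```python
# Toy illustration of country-scale food-energy-water accounting.
# All figures are hypothetical placeholders, not toolbox data.
crops = {
    #            production (t)  water (m3/t)  energy (MJ/t)
    "wheat":    (1_000_000,      1_800,        2_500),
    "rice":     (  400_000,      2_500,        3_200),
    "maize":    (  750_000,      1_200,        2_000),
}

water = sum(p * w for p, w, _ in crops.values())    # total m3
energy = sum(p * e for p, _, e in crops.values())   # total MJ
print(f"food-related water footprint: {water / 1e9:.2f} km^3")
print(f"embodied energy:              {energy / 1e9:.2f} PJ")
```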

  14. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    Science.gov (United States)

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox, so it can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
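
    The core sensorimotor-synchronisation analysis such a toolbox automates is the computation of cue-to-response asynchronies. A minimal Python illustration with fabricated timestamps (MatTAP itself is a MATLAB package, so this only mirrors the idea):

```python
# Minimal sketch of a synchronisation analysis: asynchronies between
# metronome cues and tap responses. Timestamps are fabricated.
import statistics

cues = [0.500 * i for i in range(1, 11)]     # 2 Hz metronome onsets (s)
taps = [0.48, 0.97, 1.46, 1.99, 2.47, 2.96, 3.51, 3.98, 4.45, 4.97]

asynchronies = [t - c for t, c in zip(taps, cues)]   # negative = anticipation
print(f"mean asynchrony:  {statistics.mean(asynchronies) * 1000:.1f} ms")
print(f"variability (SD): {statistics.stdev(asynchronies) * 1000:.1f} ms")
```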

  15. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Caroline Looms, Majken

    2013-01-01

    on the solution. The combined states of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods...

  16. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline

    2013-01-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely...
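
    SIPPI itself is a MATLAB/Octave toolbox; as a language-neutral illustration of what sampling the a posteriori probability density means, here is a minimal random-walk Metropolis sampler for a toy linear forward problem. All quantities below are synthetic, and the Gaussian prior/likelihood choice is an assumption of the sketch.

```python
# Minimal Metropolis sampler for a toy linear inverse problem,
# illustrating the kind of a posteriori sampling SIPPI performs.
import numpy as np

rng = np.random.default_rng(1)
G = rng.normal(size=(20, 5))                      # linear forward operator (e.g. ray paths)
m_true = rng.normal(size=5)
d_obs = G @ m_true + 0.1 * rng.normal(size=20)    # noisy synthetic data

def log_posterior(m, sigma_d=0.1, sigma_m=1.0):
    # Gaussian prior + Gaussian likelihood (up to an additive constant).
    misfit = d_obs - G @ m
    return -0.5 * (misfit @ misfit) / sigma_d**2 - 0.5 * (m @ m) / sigma_m**2

m = np.zeros(5)
lp = log_posterior(m)
samples = []
for _ in range(5000):
    prop = m + 0.05 * rng.normal(size=5)          # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        m, lp = prop, lp_prop
    samples.append(m.copy())

post = np.array(samples[1000:])                   # discard burn-in
print("posterior mean:", post.mean(axis=0))
print("true model:   ", m_true)
```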

  17. An integrated GIS-MARKAL toolbox for designing a CO2 infrastructure network in the Netherlands

    NARCIS (Netherlands)

    van den Broek, M.A.; Brederode, E.; Ramirez, C.A.; Kramers, K.; van der Kuip, M.; Wildenborg, T.; Faaij, A.P.C.; Turkenburg, W.C.

    2009-01-01

    Large-scale implementation of carbon capture and storage needs a whole new infrastructure to transport and store CO2. Tools that can support planning and designing of such infrastructure require incorporation of both temporal and spatial aspects. Therefore, a toolbox that integrates ArcGIS, a

  18. The Italian Project S2 - Task 4:Near-fault earthquake ground motion simulation in the Sulmona alluvial basin

    Science.gov (United States)

    Stupazzini, M.; Smerzini, C.; Cauzzi, C.; Faccioli, E.; Galadini, F.; Gori, S.

    2009-04-01

    Recently the Italian Department of Civil Protection (DPC), in cooperation with the Istituto Nazionale di Geofisica e Vulcanologia (INGV), has promoted the 'S2' research project (http://nuovoprogettoesse2.stru.polimi.it/) aimed at the design, testing and application of an open-source code for seismic hazard assessment (SHA). The tool envisaged will likely differ in several important respects from an existing international initiative (OpenSHA, Field et al., 2003). In particular, while "the OpenSHA collaboration model envisions scientists developing their own attenuation relationships and earthquake rupture forecasts, which they will deploy and maintain in their own systems", the main purpose of the S2 project is to provide a flexible computational tool for SHA, primarily suited to the needs of DPC, which are not necessarily scientific needs. Within S2, a crucial issue is to make alternative approaches available to quantify the ground motion, with emphasis on the near-field region. The SHA architecture envisaged will allow for the use of ground motion descriptions other than those yielded by empirical attenuation equations, for instance user-generated motions provided by deterministic source and wave propagation simulations. In this contribution, after a brief presentation of Project S2, we intend to illustrate some preliminary 3D scenario simulations performed in the alluvial basin of Sulmona (Central Italy), as an example of the type of descriptions that can be handled in the future SHA architecture. In detail, we selected some seismogenic sources (from the DISS database), believed to be responsible for a number of destructive historical earthquakes, and derived from them a family of simplified geometrical and mechanical source models spanning a reasonable range of parameters, so that the extent of the main uncertainties can be covered. Then, purely deterministic (for frequencies…
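
    As a sketch of the probabilistic bookkeeping an SHA code of this kind must perform, the following combines per-source ground-motion exceedance probabilities, weighted by annual occurrence rates, into a hazard curve. The attenuation relation and source parameters are toy values, not Project S2 results.

```python
# Hedged sketch of hazard-curve assembly in probabilistic SHA:
# annual exceedance rate = sum over sources of rate_i * P(GM > x | source_i).
# The "attenuation" relation and sources below are illustrative only.
import math

sources = [  # (annual rate, magnitude, distance to site in km) - hypothetical
    (0.01, 6.5, 15.0),
    (0.002, 7.0, 30.0),
]

def log_median_pga(mag, dist_km):
    # Toy attenuation relation returning ln(PGA in g); coefficients are made up.
    return -3.5 + 0.9 * mag - 1.2 * math.log(dist_km + 10.0)

def prob_exceed(mag, dist_km, pga, sigma=0.6):
    # Lognormal ground-motion variability around the median.
    z = (math.log(pga) - log_median_pga(mag, dist_km)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

for pga in (0.05, 0.1, 0.2, 0.4):
    rate = sum(nu * prob_exceed(m, r, pga) for nu, m, r in sources)
    print(f"PGA > {pga:.2f} g: annual exceedance rate = {rate:.2e}")
```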

  19. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  20. Screening and assessment of chronic pain among children with cerebral palsy: a process evaluation of a pain toolbox.

    Science.gov (United States)

    Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna

    2018-06-08

    Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included client electronic medical record audit, document review and healthcare provider survey and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among the contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation: It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product. The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment

  1. The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.

    Science.gov (United States)

    Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A

    2016-04-15

    A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to crack dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox investigates strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, refugia and strategic drenching, mineral/vitamin supplementation, copper oxide wire particles (COWPs), breeding and selection of resistant and resilient individuals, biological control and anthelmintic drugs. Barbervax®, the ground-breaking Haemonchus vaccine developed and currently commercially available on a pilot scale for sheep, is prime for trialling in goats and would be an invaluable inclusion in this Toolbox. The specialised behaviours of goats, specifically their preferences to browse a variety of plants and the accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their value for inclusion in the 'Toolbox' currently, and the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages and identifying and treating individual animals for haemonchosis, in addition to continuous evaluation of strategy effectiveness, is conducted using a model farm scenario. Selecting strategies from the Toolbox, with regard to their current availability, feasibility, economic cost

  2. Teaching Sustainable Design Using BIM and Project-Based Energy Simulations

    Directory of Open Access Journals (Sweden)

    Zhigang Shen

    2012-08-01

    Full Text Available The cross-disciplinary nature of energy-efficient building design has created many challenges for architecture, engineering and construction instructors. One of the technical challenges in teaching sustainable building design is enabling students to quantitatively understand how different building designs affect a building's energy performance. Concept-based instructional methods fall short in evaluating the impact of different design choices on a building's energy consumption. Building Information Modeling (BIM) with energy performance software provides a feasible tool to evaluate building design parameters. One notable advantage of this tool is its ability to couple 3D visualization of the structure with energy performance analysis without requiring detailed mathematical and thermodynamic calculations. Project-based Learning (PBL) utilizing BIM tools coupled with energy analysis software was incorporated into a senior level undergraduate class. Student perceptions and feedback were analyzed to gauge the effectiveness of these techniques as instructional tools. The findings indicated that BIM-PBL can be used to effectively teach energy-efficient building design and construction.

  3. Characterization of projected DWPF glasses heat treated to simulate canister centerline cooling

    International Nuclear Information System (INIS)

    Marra, S.L.; Jantzen, C.M.

    1992-05-01

    Liquid high-level nuclear waste will be immobilized at the Savannah River Site (SRS) by vitrification in borosilicate glass. The glass will be produced and poured into stainless steel canisters in the Defense Waste Processing Facility (DWPF). Eventually these canistered waste forms will be sent to a geologic repository for final disposal. In order to assure acceptability by the repository, the Department of Energy has defined requirements which DWPF canistered waste forms must meet. These requirements are the Waste Acceptance Preliminary Specifications (WAPS). The WAPS require DWPF to identify the crystalline phases expected to be present in the final glass product. Knowledge of the thermal history of the borosilicate glass during filling and cooldown of the canister is necessary to determine the amount and type of crystalline phases present in the final glass product. Glass samples of seven projected DWPF compositions were cooled following the same temperature profile as that of glass at the centerline of the full-scale DWPF canister. The glasses were characterized by x-ray diffraction and scanning electron microscopy to identify the crystalline phases present. The volume percents of each crystalline phase present were determined by quantitative x-ray diffraction. The Product Consistency Test (PCT) was used to determine the durability of the heat-treated glasses.

  4. Ferrocyanide Safety Project: Comparison of actual and simulated ferrocyanide waste properties

    International Nuclear Information System (INIS)

    Scheele, R.D.; Burger, L.L.; Sell, R.L.; Bredt, P.R.; Barrington, R.J.

    1994-09-01

    In the 1950s, additional high-level radioactive waste storage capacity was needed to accommodate the wastes that would result from the recovery of additional nuclear defense materials. To provide this additional waste storage capacity, the Hanford Site operating contractor developed a process to decontaminate aqueous wastes by precipitating radiocesium as an alkali nickel ferrocyanide; this process allowed disposal of the aqueous waste. The radiocesium scavenging process as developed was used to decontaminate (1) first-cycle bismuth phosphate (BiPO4) wastes, (2) acidic wastes resulting from uranium recovery operations, and (3) the supernate from neutralized uranium recovery wastes. The radiocesium scavenging process was often coupled with other scavenging processes to remove radiostrontium and radiocobalt. Because all defense materials recovery processes used nitric acid solutions, all of the wastes contained nitrate, which is a strong oxidizer. The variety of wastes treated, and the occasional coupling of radiostrontium and radiocobalt scavenging processes with the radiocesium scavenging process, resulted in ferrocyanide-bearing wastes having many different compositions. In this report, we compare selected physical, chemical, and radiochemical properties measured for Tanks C-109 and C-112 wastes and selected physical and chemical properties of simulated ferrocyanide wastes to assess the representativeness of simulants prepared by WHC.

  5. Simulation of dambreak flood with erosion effects (CEA R and D Project 718-G-641)

    International Nuclear Information System (INIS)

    Ko, P.Y.

    1990-01-01

    Most existing mathematical models applicable to dambreak analysis assume the river channel to be rigid. In reality, during the passage of dambreak waves, the banks and the bed of the valley will be eroded by the flood waves, affecting flood levels. A study was carried out to produce a numerical model, suitable for use on a personal computer, for the simulation of the dambreak wave along erosion-prone channels. The following features were determined to be essential: nonuniform and non-equilibrium transport of graded sediment should be considered; the user should be able to use the sediment transport function of choice; channel roughness should reflect the change of the river channel; armoring of the channel bed should be included; and bank erosion should be considered. Details are given of the mathematical analysis of dam failure, dynamic flood routing, and sediment routing. Preliminary testing showed that the model can be used to route a dambreak wave along an erodible river channel. Additional options may be added, including various hydraulic structures and descriptions of debris flow. With the inclusion of a width adjustment algorithm, the model can estimate the vulnerability of river banks, information that will be important to civil protection agencies in the preparation of emergency preparedness plans. 23 refs., 7 figs
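
    The coupling the model implements (flood routing over a bed that erodes as the wave passes) can be caricatured with a kinematic-wave water update plus an Exner-type bed change; the real model solves the full dynamic equations with graded, non-equilibrium sediment transport, so everything below is illustrative only, with made-up parameters and a toy transport law.

```python
# Illustrative-only sketch: 1-D flood routing over an erodible bed.
# Kinematic-wave water balance + Exner bed update with a toy
# transport-capacity law. Not the model's actual numerics.
import numpy as np

nx, dx, dt = 100, 100.0, 2.0            # cells, spacing (m), time step (s)
h = np.full(nx, 0.5); h[:10] = 5.0      # depths: dam pool in the first cells
z = np.linspace(10.0, 0.0, nx)          # initial bed elevation (m)
z0 = z.copy()
n_manning, slope_min, porosity = 0.03, 1e-4, 0.4

for step in range(500):
    S = np.maximum(-np.gradient(z + h, dx), slope_min)  # energy slope
    v = h**(2 / 3) * np.sqrt(S) / n_manning             # Manning velocity
    q = v * h                                           # unit discharge
    h[1:] -= dt / dx * np.diff(q)                       # water mass balance
    qs = 1e-5 * v**3                                    # toy transport capacity
    z[1:] -= dt / dx * np.diff(qs) / (1 - porosity)     # Exner bed update
    h = np.maximum(h, 0.01)

mid = nx // 2
print(f"max depth {h.max():.2f} m; bed change at mid-channel {z[mid] - z0[mid]:.3f} m")
```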

  6. Use of Generalized Fluid System Simulation Program (GFSSP) for Teaching and Performing Senior Design Projects at the Educational Institutions

    Science.gov (United States)

    Majumdar, A. K.; Hedayat, A.

    2015-01-01

    This paper describes the experience of the authors in using the Generalized Fluid System Simulation Program (GFSSP) in teaching Design of Thermal Systems class at University of Alabama in Huntsville. GFSSP is a finite volume based thermo-fluid system network analysis code, developed at NASA/Marshall Space Flight Center, and is extensively used in NASA, Department of Defense, and aerospace industries for propulsion system design, analysis, and performance evaluation. The educational version of GFSSP is freely available to all US higher education institutions. The main purpose of the paper is to illustrate the utilization of this user-friendly code for the thermal systems design and fluid engineering courses and to encourage the instructors to utilize the code for the class assignments as well as senior design projects.

  7. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    Science.gov (United States)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with at least one million neurons. We consider a cellular automaton version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour and to compare it with the electroencephalographic signals measured in real brains.
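
    A minimal sketch of the model class described, a probabilistic integrate-and-fire cellular automaton on a random network with excitatory and inhibitory nodes, scaled far below the million-neuron target; all parameters here are illustrative, not the project's.

```python
# Hedged sketch: probabilistic integrate-and-fire cellular automaton
# on a random network. Parameters and rules are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n, k = 10_000, 10                        # neurons, inputs per neuron (toy scale)
links = rng.integers(0, n, size=(n, k))  # random presynaptic neighbours
weight = np.where(rng.random(n) < 0.8, 1, -1)   # 80% excitatory, 20% inhibitory
threshold, p_fire = 2, 0.9

state = (rng.random(n) < 0.05).astype(np.int8)  # sparse initial activity
for t in range(100):
    drive = (weight[links] * state[links]).sum(axis=1)  # summed synaptic input
    above = drive >= threshold
    # Probabilistic rule: a neuron above threshold fires with probability p_fire.
    state = (above & (rng.random(n) < p_fire)).astype(np.int8)
    if t % 20 == 0:
        print(t, "active fraction:", state.mean())
```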

  8. National Institutes of Health Toolbox Emotion Battery for English- and Spanish-speaking adults: normative data and factor-based summary scores.

    Science.gov (United States)

    Babakhanyan, Ida; McKenna, Benjamin S; Casaletto, Kaitlin B; Nowinski, Cindy J; Heaton, Robert K

    2018-01-01

    The National Institutes of Health Toolbox Emotion Battery (NIHTB-EB) is a "common currency", computerized assessment developed to measure the full spectrum of emotional health. Though comprehensive, the NIHTB-EB's 17 scales may be unwieldy for users aiming to capture more global indices of emotional functioning. NIHTB-EB was administered to 1,036 English-speaking and 408 Spanish-speaking adults as a part of the NIH Toolbox norming project. We examined the factor structure of the NIHTB-EB in English- and Spanish-speaking adults and developed factor analysis-based summary scores. Census-weighted norms were presented for English speakers, and sample-weighted norms were presented for Spanish speakers. Exploratory factor analysis for both English- and Spanish-speaking cohorts resulted in the same 3-factor solution: 1) negative affect, 2) social satisfaction, and 3) psychological well-being. Confirmatory factor analysis supported similar factor structures for English- and Spanish-speaking cohorts. Model fit indices fell within the acceptable/good range, and our final solution was optimal compared to other solutions. Summary scores based upon the normative samples appear to be psychometrically supported and should be applied to clinical samples to further validate the factor structures and investigate rates of problematic emotions in medical and psychiatric populations.
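
    The factor-analytic step is standard enough to sketch. The following uses scikit-learn's FactorAnalysis (rotation support requires scikit-learn 0.24 or later) on random synthetic data with 17 placeholder scales; it is not the norming project's analysis code, and the varimax-rotated 3-factor choice simply mirrors the solution the abstract reports.

```python
# Hedged illustration of an exploratory factor analysis with a
# 3-factor solution. The data are random; scale count (17) mirrors
# the NIHTB-EB, but nothing here is real battery data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 3))             # 3 latent traits
loadings = rng.normal(scale=0.8, size=(3, 17))
X = latent @ loadings + rng.normal(scale=0.5, size=(500, 17))

fa = FactorAnalysis(n_components=3, rotation="varimax")
scores = fa.fit_transform(X)                   # factor-based summary scores
print("loading matrix shape:", fa.components_.shape)   # (3, 17)
print("first subject's factor scores:", scores[0])
```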

  9. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist
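
    In the spirit of the text, a small example of using uniform random numbers to drive a stochastic model over time: exponential inter-arrival times generated by the inverse-transform method, simulating a Poisson arrival process.

```python
# Inverse-transform sampling: turn uniform random numbers into
# exponential inter-arrival times and count Poisson arrivals.
import math
import random

random.seed(7)
rate = 2.0                            # arrivals per unit time
t, horizon, arrivals = 0.0, 10.0, 0
while True:
    u = random.random()
    t += -math.log(1.0 - u) / rate    # inverse CDF of Exponential(rate)
    if t > horizon:
        break
    arrivals += 1

print("arrivals in [0, 10]:", arrivals, "(expected about", rate * horizon, ")")
```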

  10. EasyCloneYALI: CRISPR/Cas9-based synthetic toolbox for engineering of the yeast Yarrowia lipolytica

    DEFF Research Database (Denmark)

    Holkenbrink, Carina; Dam, Marie Inger; Kildegaard, Kanchana Rueksomtawin

    2018-01-01

    Here, we present the EasyCloneYALI genetic toolbox, which allows streamlined strain construction with high genome editing efficiencies in Y. lipolytica via the CRISPR/Cas9 technology. The toolbox allows marker-free integration of gene expression vectors into characterized genome sites as well as marker-free deletion of genes with the help of CRISPR/Cas9. Genome editing efficiencies above 80% were achieved with transformation protocols using non-replicating DNA repair fragments (such as DNA oligos). Furthermore, the toolbox includes a set of integrative gene expression vectors with prototrophic markers...

  11. Dynamics of epiretinal membrane removal off the retinal surface: a computer simulation project.

    Science.gov (United States)

    Dogramaci, Mahmut; Williamson, Tom H

    2013-09-01

    To use a computer simulation to discern the safest angle at which to peel epiretinal membranes. We used ANSYS V.14.1 software to analyse the dynamics involved in membrane removal off the retinal surface. The geometrical values were taken from optical coherence tomography of 30 eyes with epiretinal membranes. A range of Young's modulus values of 0.03, 0.01 and 0.09 MPa were assigned to the epiretinal membrane and to the retina separately. The ratio of maximum shear stress (MSS) recorded at the attachment pegs over that recorded at the membrane (P/E ratio) was determined at nine displacement angles (DA). Mean MSS values recorded at the attachment pegs, epiretinal membrane and retina were significantly different at 0.8668, 0.6091 and 0.0017 Pa, respectively (p<0.05). There was a significant negative linear correlation between DA and MSS recorded at the epiretinal membrane when the Young's modulus for the epiretinal membrane was higher than or equal to that for the attachment pegs and the retina. Nevertheless, there was a significant positive linear correlation between DA and P/E ratio when the Young's modulus for the epiretinal membrane was equal to or lower than that for the attachment pegs and the retina. Attachment pegs appear to be the most likely part to fail (tear) during removal procedures. Changing the direction in which the edge of the membrane is pulled can relocate the MSS within the tissue complex. Safer and more effective removal could be achieved by pulling epiretinal membranes onto themselves at a 165° DA.

  12. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP): Overview and Description of Models, Simulations and Climate Diagnostics

    Science.gov (United States)

    Lamarque, J.-F.; Shindell, D. T.; Naik, V.; Plummer, D.; Josse, B.; Righi, M.; Rumbold, S. T.; Schulz, M.; Skeie, R. B.; Strode, S.; et al.

    2013-01-01

    The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) consists of a series of time slice experiments targeting the long-term changes in atmospheric composition between 1850 and 2100, with the goal of documenting composition changes and the associated radiative forcing. In this overview paper, we introduce the ACCMIP activity, the various simulations performed (with a requested set of 14) and the associated model output. The 16 ACCMIP models have a wide range of horizontal and vertical resolutions, vertical extent, chemistry schemes and interaction with radiation and clouds. While anthropogenic and biomass burning emissions were specified for all time slices in the ACCMIP protocol, it is found that the natural emissions are responsible for a significant range across models, mostly in the case of ozone precursors. The analysis of selected present-day climate diagnostics (precipitation, temperature, specific humidity and zonal wind) reveals biases consistent with state-of-the-art climate models. The model-to-model comparison of changes in temperature, specific humidity and zonal wind between 1850 and 2000 and between 2000 and 2100 indicates mostly consistent results. However, models that are clear outliers are different enough from the other models to significantly affect their simulation of atmospheric chemistry.

  13. Simulation of the effect of an oil refining project on the water environment using the MIKE 21 model

    Science.gov (United States)

    Jia, Peng; Wang, Qinggai; Lu, Xuchuan; Zhang, Beibei; Li, Chen; Li, Sa; Li, Shibei; Wang, Yaping

    2018-02-01

    A case study of the Caofeidian oil refining project is conducted. A two-dimensional convective dispersion mathematical model is established to simulate the increase in the concentration of pollutants resulting from the wastewater discharge from the Caofeidian oil refining project and to analyze the characteristics of the dispersion of pollutants after wastewater is discharged and the effect of the wastewater discharge on the surrounding sea areas. The results demonstrate the following: (1) The Caofeidian sea area has strong tidal currents, which are significantly affected by the terrain. There are significant differences in the tidal current velocity and the direction between the deep-water areas and the shoals. The direction of the tidal currents in the deep-water areas is essentially parallel to the contour lines of the sea areas. Onshore currents and rip currents submerging the shoals are the dominant currents in the shoals. (2) The pollutant concentration field in the offshore areas changes periodically with the movement of the tidal current. The dilution and dispersion of pollutants are affected by the ocean currents in different tidal periods. The turbulent dispersion of pollutants is the most intense when a neap tide ebbs, followed by when a neap tide rises, when a spring tide ebbs and when a spring tide rises. (3) There are relatively good hydrodynamic conditions near the project's wastewater discharge outlet. Wastewater is well diluted after being discharged. Areas with high concentrations of pollutants are concentrated near the wastewater discharge outlet and the offshore areas. These pollutants migrate southwestward with the flood tidal current and northeastward with the ebb tidal current and have no significant impact on the protection targets in the open sea areas and nearby sea areas.
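
    MIKE 21 is commercial software, so the following is not its code; it is a toy explicit step of the model class the abstract names, a 2-D advection-diffusion (convective dispersion) equation with a continuous source at an outfall cell. All parameters are hypothetical.

```python
# Illustrative-only 2-D convective dispersion sketch: upwind advection
# plus central-difference diffusion of a pollutant from an outfall cell.
import numpy as np

ny, nx = 50, 80
dx = dy = 50.0                       # grid spacing (m)
u, v, D = 0.3, 0.1, 5.0              # current (m/s) and dispersion (m2/s)
dt = 20.0                            # time step (s), satisfies CFL here
c = np.zeros((ny, nx))               # pollutant concentration field

for step in range(500):
    c[25, 10] += 1.0                 # continuous source at the outfall cell
    lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) - 2 * c) / dy**2 \
        + (np.roll(c, 1, 1) + np.roll(c, -1, 1) - 2 * c) / dx**2
    # Upwind advection, valid for the positive u, v used here.
    adv = u * (c - np.roll(c, 1, 1)) / dx + v * (c - np.roll(c, 1, 0)) / dy
    c = c + dt * (D * lap - adv)
    c[0, :] = c[-1, :] = c[:, 0] = c[:, -1] = 0.0   # crude open boundaries

print("peak concentration:", c.max(), "at", np.unravel_index(c.argmax(), c.shape))
```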

  14. Development of a High-Resolution Climate Model for Future Climate Change Projection on the Earth Simulator

    Science.gov (United States)

    Kanzawa, H.; Emori, S.; Nishimura, T.; Suzuki, T.; Inoue, T.; Hasumi, H.; Saito, F.; Abe-Ouchi, A.; Kimoto, M.; Sumi, A.

    2002-12-01

    The fastest supercomputer in the world, the Earth Simulator (total peak performance 40 TFLOPS), has recently become available for climate research in Yokohama, Japan. We are planning to conduct a series of future climate change projection experiments on the Earth Simulator with a high-resolution coupled ocean-atmosphere climate model. The main scientific aims of the experiments are to investigate 1) the change in global ocean circulation with an eddy-permitting ocean model, 2) the regional details of the climate change, including the Asian monsoon rainfall pattern, tropical cyclones and so on, and 3) the change in natural climate variability with a high-resolution model of the coupled ocean-atmosphere system. To meet these aims, an atmospheric GCM, CCSR/NIES AGCM, with T106 (~1.1°) horizontal resolution and 56 vertical layers is to be coupled with an oceanic GCM, COCO, with ~0.28° x 0.19° horizontal resolution and 48 vertical layers. This coupled ocean-atmosphere climate model, named MIROC, also includes a land-surface model, a dynamic-thermodynamic sea-ice model, and a river routing model. The poles of the oceanic model grid system are rotated from the geographic poles so that they are placed in the Greenland and Antarctic land masses to avoid the singularity of the grid system. Each of the atmospheric and oceanic parts of the model is parallelized with the Message Passing Interface (MPI) technique. The coupling of the two is to be done in a Multiple Program Multiple Data (MPMD) fashion. A 100-model-year integration will be possible in one actual month with 720 vector processors (which is only 14% of the full resources of the Earth Simulator).

  15. A Study of Using Simulation to Overcome Obstacles That Block the Implementation of Critical Chain Project Management to Project Management Environment

    OpenAIRE

    Chia-Ling Huang; Rong-Kwei Li; Chih-Hung Tsai; Yi-Chan Chung; Yao-Wen Hsu

    2014-01-01

    Since 1997, the Critical Chain Project Management (CCPM) method has received considerable attention. Hundreds of successful CCPM cases have achieved highly reliable on-time delivery (OTD) with short project lead-times (PLT) in multi-project environments. However, two obstacles have remained that block the adoption of CCPM by the project management (PM) community. The first has been addressed by PM practitioners, who have been less than confident that OTD and PLT can be significantly improved by...

  16. Real-time "x-ray vision" for healthcare simulation: an interactive projective overlay system to enhance intubation training and other procedural training.

    Science.gov (United States)

    Samosky, Joseph T; Baillargeon, Emma; Bregman, Russell; Brown, Andrew; Chaya, Amy; Enders, Leah; Nelson, Douglas A; Robinson, Evan; Sukits, Alison L; Weaver, Robert A

    2011-01-01

    We have developed a prototype of a real-time, interactive projective overlay (IPO) system that creates an augmented reality display of a medical procedure directly on the surface of a full-body mannequin human simulator. These images approximate the appearance of both anatomic structures and instrument activity occurring within the body. The key innovation of the current work is sensing the position and motion of an actual device (such as an endotracheal tube) inserted into the mannequin and using the sensed position to control projected video images portraying the internal appearance of the same devices and relevant anatomic structures. The images are projected in correct registration onto the surface of the simulated body. As an initial practical prototype to test this technique, we have developed a system permitting real-time visualization of the intra-airway position of an endotracheal tube during simulated intubation training.

  17. Modelling Project Feasibility Robustness by Use of Scenarios

    DEFF Research Database (Denmark)

    Moshøj, Claus Rehfeld; Leleur, Steen

    1998-01-01

    , SEAM secures a consistent inclusion of actual scenario elements in the quantitative impact modelling and facilitates a transparent project feasibility robustness analysis. SEAM is implemented as part of a decision support system with a toolbox structure applicable to different types of transport...

  18. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave

    Directory of Open Access Journals (Sweden)

    Ikaro Silva

    2014-09-01

    Full Text Available The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox allows direct loading into MATLAB/Octave's workspace of over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.
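
    The toolbox above targets MATLAB/Octave; PhysioNet also distributes a companion Python package, wfdb, that supports the same workflow. A minimal usage example, assuming `pip install wfdb` and network access to PhysioNet:

```python
# Read a PhysioNet record and its annotations with the Python wfdb package
# (the MATLAB/Octave toolbox described above offers equivalent functions).
import wfdb

# Record 100 of the MIT-BIH Arrhythmia Database, fetched directly from PhysioNet.
record = wfdb.rdrecord("100", pn_dir="mitdb")
annotation = wfdb.rdann("100", "atr", pn_dir="mitdb")

print(record.sig_name, record.fs)          # signal names and sampling frequency
print("beat annotations:", len(annotation.sample))
```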

  19. The Cementitious Barriers Partnership (CBP) Software Toolbox Capabilities in Assessing the Degradation of Cementitious Barriers - 13487

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P.; Burns, H.H.; Langton, C.; Smith, F.G. III [Savannah River National Laboratory, Savannah River Site, Aiken SC 29808 (United States); Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S. [Vanderbilt University, Nashville, TN (United States); Van der Sloot, H. [Hans Van der Sloot Consultancy (Netherlands); Meeussen, J.C.L. [Nuclear Research and Consultancy Group, Petten (Netherlands); Samson, E. [SIMCO Technologies Inc., 1400, boul. du Parc-Technologique, Suite 203, Quebec (Canada); Mallick, P.; Suttora, L. [United States Department of Energy, 1000 Independence Ave. SW, Washington, DC (United States); Esh, D.W.; Fuhrmann, M.J.; Philip, J. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)

    2013-07-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste and Nuclear Materials Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to 100 years and longer for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox has produced tangible benefits to the DOE Performance Assessment (PA) community. A review of prior DOE PAs has provided a list of potential opportunities for improving cementitious barrier performance predictions through the use of the CBP software tools. These opportunities include: 1) impact of atmospheric exposure to concrete and grout before closure, such as accelerated slag and Tc-99 oxidation, 2) prediction of changes in Kd/mobility as a function of time that result from changing pH and redox conditions, 3) concrete degradation from rebar corrosion due to carbonation, 4) early age cracking from drying and/or thermal shrinkage and 5) degradation due to sulfate attack. The CBP has already had opportunity to provide near-term, tangible support to ongoing DOE-EM PAs such as the Savannah River Saltstone Disposal Facility (SDF) by providing a sulfate attack analysis that predicts the extent and damage that sulfate ingress will have on the concrete vaults over extended time (i.e., > 1000 years). This analysis is one of the many technical opportunities in cementitious barrier performance that can be addressed by the DOE-EM sponsored CBP
