WorldWideScience

Sample records for project simulation toolbox

  1. Object Oriented Toolbox for Modelling and Simulation of Dynamical Systems

    DEFF Research Database (Denmark)

    Poulsen, Mikael Zebbelin; Wagner, Falko Jens; Thomsen, Per Grove

    1998-01-01

    This paper presents the results of an ongoing project dealing with the design and implementation of a simulation toolbox based on object-oriented modelling techniques. The paper describes an experimental implementation of parts of such a toolbox in C++ and discusses the experiences drawn from that process. Essential to the work is the focus on simulation of complex dynamical systems, from modelling the single components/subsystems to building complete systems.

  2. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from impaired brain function. Although TMS is known to be painless and noninvasive, it can also harm the brain through incorrect focusing or excessive stimulation, which might result in seizure. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools for generating TMS models, owing to the difficulty of realistically modeling the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head, with isotropic and anisotropic electrical conductivities in five different tissues of the head, and of the coils in 3D. The generated TMS model can be imported into FE software packages such as ANSYS for further, efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  3. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    Science.gov (United States)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer models of a ladar system to predict its performance. This paper reviews developments in laser imaging radar simulation at home and abroad, and studies of computer simulation of ladar systems for different application needs. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research on imaging ladar simulation has so far been limited in scale and lacks a unified design, mostly achieving simple functional simulation based on ranging equations for ladar systems. A laser imaging radar simulation with an open and modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated inputs and outputs of the functions and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was carried out with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and predefined parameter settings simplify the simulation of imaging ladar detection. The open structure enables the toolbox to be modified for specialized needs, and the modularization gives simulations flexibility.
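    The abstract above mentions simple simulations based on ranging equations. A minimal sketch of such a link-budget calculation (a generic monostatic ladar range equation for an extended Lambertian target; all parameter values are illustrative, not from the paper) might look like:

```python
import math

def ladar_received_power(p_tx, rho, d_rx, r, eta_sys, atm_atten):
    """Received power [W] for transmit power p_tx [W], target reflectivity rho,
    receiver aperture diameter d_rx [m], range r [m], system efficiency eta_sys,
    and one-way atmospheric extinction coefficient atm_atten [1/m]."""
    a_rx = math.pi * (d_rx / 2.0) ** 2      # receiver aperture area
    atm = math.exp(-2.0 * atm_atten * r)    # round-trip atmospheric loss
    # Extended target: the reflected power spreads over pi steradians,
    # so received power falls off as 1/r^2 rather than 1/r^4.
    return p_tx * rho * a_rx / (math.pi * r ** 2) * eta_sys * atm

p_1km = ladar_received_power(1e3, 0.3, 0.1, 1000.0, 0.5, 1e-4)
p_2km = ladar_received_power(1e3, 0.3, 0.1, 2000.0, 0.5, 1e-4)
```

Doubling the range cuts the received power by more than a factor of four once atmospheric extinction is included.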

  4. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents a toolbox for space environment modelling in SIMULINK that facilitates the development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, the Earth gravity field, the Earth magnetic field and eclipse. The structure and facilities within the toolbox are described and exemplified using a student satellite case (AAUSAT-II). The validity of the developed models is confirmed by comparing the simulation results with realistic data obtained from the Danish...
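    One of the environment models mentioned above is the Earth magnetic field. A common starting point for such a model (a sketch of the centred-dipole approximation, not the toolbox's implementation) is:

```python
import numpy as np

def dipole_field(r_km, colat_rad):
    """|B| [T] of a centred-dipole geomagnetic field at geocentric distance
    r_km [km] and magnetic colatitude colat_rad [rad]."""
    b0 = 3.12e-5   # mean equatorial surface field [T] (IGRF dipole term)
    re = 6371.0    # mean Earth radius [km]
    return b0 * (re / r_km) ** 3 * np.sqrt(1.0 + 3.0 * np.cos(colat_rad) ** 2)

b_equator = dipole_field(6371.0, np.pi / 2)  # surface, magnetic equator
b_pole = dipole_field(6371.0, 0.0)           # surface, magnetic pole
```

At the magnetic pole the dipole field is exactly twice its equatorial value, a quick sanity check for any implementation.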

  5. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    Science.gov (United States)

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management solution with web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data from different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, the corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates the integration of research data as well as metadata by performing the necessary procedures automatically. Toolbox modules also allow the integration of device data. Moreover, the separation of personally identifiable information from medical data, using only pseudonyms for storing medical data, ensures compliance with data protection regulations. The pseudonymized data can then be exported in SPSS format to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and installed automatically thanks to Docker technology.
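    The separation of identifying and medical data described above can be sketched as follows. This is a toy illustration of the principle only, not the Toolbox's actual data model; all names and fields are hypothetical:

```python
import secrets

# Identifying data and medical data live in separate stores, linked only
# by a random pseudonym that cannot be derived from either side.
identity_store = {}   # pseudonym -> personally identifiable information
medical_store = {}    # pseudonym -> medical data (no direct identifiers)

def register_case(name, birth_date, medical_record):
    pseudonym = secrets.token_hex(8)   # random, non-derivable link
    identity_store[pseudonym] = {"name": name, "birth_date": birth_date}
    medical_store[pseudonym] = medical_record   # stored without identifiers
    return pseudonym

pid = register_case("Jane Doe", "1980-01-01", {"tbsa_percent": 12})
```

Analyses run against `medical_store` alone; re-identification requires access to the separately protected mapping.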

  6. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    Science.gov (United States)

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control the basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows porting of individual parts as well as entire paradigms between different types of hardware. In addition to the toolbox, the project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  7. OSSIM wave-optics toolbox and its use to simulate AEOS

    Science.gov (United States)

    Smith, Carey A.; Forgham, James L.; Jones, Bruce W.; Jones, Kenneth D.

    2001-12-01

    OSSim (Optical System Simulation) is a simulation toolbox of optical and processing components. By using full wave optics in the time domain, OSSim simulates diffractive effects and control-loop interactions missed by simpler analyses. OSSim also models the atmosphere, with user-customizable turbulence strength, wind, and slew. This paper first presents two introductory examples: a simple two-lens imaging system and a simple tilt-control system. It then presents a simulation of the 3.67-meter AEOS (Advanced Electro-Optics System) telescope on Maui. The OSSim simulation agrees well with the AEOS experimental results.

  8. System design through Matlab, control toolbox and Simulink

    CERN Document Server

    Singh, Krishna K

    2001-01-01

    MATLAB, a software package developed by MathWorks, Inc., is powerful, versatile and interactive software for scientific and technical computations, including simulations. Specialised toolboxes providing several built-in functions are a special feature of MATLAB. This book, titled System Design through MATLAB, Control Toolbox and SIMULINK, aims at getting the reader started with computations and simulations in system engineering quickly and easily, and then proceeds to build concepts for advanced computations and simulations, including the control and compensation of systems. Simulation through SIMULINK is also described, to give the reader a feel for real-world situations. This book is appropriate for undergraduate students in the final semester of their project work, postgraduate students who have MATLAB integrated in their course or who wish to take up a simulation problem in the area of system engineering for their dissertation work, and research scholars for whom MATLAB...

  9. An Open-Source Toolbox for PEM Fuel Cell Simulation

    Directory of Open Access Journals (Sweden)

    Jean-Paul Kone

    2018-05-01

    In this paper, an open-source toolbox that can be used to accurately predict the distribution of the major physical quantities transported within a proton exchange membrane (PEM) fuel cell is presented. The toolbox has been developed using the Open Source Field Operation and Manipulation (OpenFOAM) platform, an open-source computational fluid dynamics (CFD) code. The base-case results for the distribution of velocity, pressure, chemical species, Nernst potential, current density, and temperature are as expected. The plotted polarization curve was compared to results from a numerical model and experimental data taken from the literature. The conducted simulations have generated a significant amount of data and information about the transport processes involved in the operation of a PEM fuel cell. The key role played by the concentration constant in shaping the cell polarization curve has been explored. The development of the present toolbox is in line with the objectives outlined in the International Energy Agency (IEA, Paris, France) Advanced Fuel Cell Annex 37, which is devoted to developing open-source computational tools to facilitate fuel cell technologies. The work therefore serves as a basis for devising additional features that are not always feasible with a commercial code.
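    The polarization curve discussed above is commonly described by subtracting activation, ohmic and concentration losses from the Nernst potential. A textbook-style sketch of that relationship (all parameter values illustrative, not taken from the toolbox) is:

```python
import math

def cell_voltage(j, e_nernst=1.2, j0=1e-4, alpha=0.5,
                 r_ohm=0.1, j_lim=1.5, c=0.1):
    """Cell voltage [V] at current density j [A/cm^2] (illustrative values).

    Activation loss follows a Tafel form, ohmic loss is linear in j, and the
    concentration loss is logarithmic, governed by the concentration
    constant c mentioned in the abstract."""
    f = 96485.0   # Faraday constant [C/mol]
    r = 8.314     # gas constant [J/(mol K)]
    t = 353.0     # cell temperature [K]
    eta_act = (r * t / (alpha * f)) * math.log(j / j0)   # Tafel kinetics
    eta_ohm = j * r_ohm                                  # area-specific resistance
    eta_conc = -c * math.log(1.0 - j / j_lim)            # mass-transport loss
    return e_nernst - eta_act - eta_ohm - eta_conc

# Voltage decreases monotonically along the polarization curve.
v_low, v_high = cell_voltage(0.1), cell_voltage(1.0)
```

Sweeping `j` from near zero to `j_lim` traces out the familiar three-region polarization curve; larger `c` steepens the concentration-loss knee.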

  10. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to interfacing the standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  11. Towards a binaural modelling toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Culling, John F.; Dau, Torsten

    2011-01-01

    The Auditory Modelling Toolbox (AMToolbox) is a new Matlab / Octave toolbox for developing and applying auditory perceptual models and in particular binaural models. The philosophy behind the project is that the models should be implemented in a consistent manner, well documented and user...

  12. Aerospace Toolbox---a flight vehicle design, analysis, simulation ,and software development environment: I. An introduction and tutorial

    Science.gov (United States)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models using the MathWorks Real-Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment.

  13. User's manual for Ecolego Toolbox and the Discretization Block

    International Nuclear Information System (INIS)

    Broed, Robert; Shulan Xu

    2008-03-01

    The CLIMB modelling team (Catchment LInked Models of radiological effects in the Biosphere) was instituted in 2004 to provide SSI with an independent modelling capability for reviewing SKB's assessment of the long-term safety of a geological repository. Modelling in CLIMB covers all aspects of performance assessment (PA), from near-field releases to radiological consequences in the surface environment. The software used to implement the assessment models has been developed within the project. It comprises a toolbox based on the commercial packages Matlab and Simulink, used to solve compartment-based differential equation systems, with an added user-friendly graphical interface. This report documents the new simulation toolbox and a newly developed Discretisation Block, a powerful tool for solving problems involving a network of compartments in two dimensions.
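    The compartment-based differential equation systems such a toolbox solves can be illustrated with a two-compartment model with first-order transfer and radioactive decay. The transfer coefficients here are hypothetical, chosen only to show the structure:

```python
from scipy.integrate import solve_ivp

# Two linked compartments: first-order transfer between them plus a
# common radioactive decay term (rates in 1/yr, values illustrative).
k12, k21, lam = 0.5, 0.1, 0.01

def dydt(t, y):
    c1, c2 = y
    return [-(k12 + lam) * c1 + k21 * c2,
            k12 * c1 - (k21 + lam) * c2]

# Unit inventory starts in compartment 1; integrate for 50 years.
sol = solve_ivp(dydt, (0.0, 50.0), [1.0, 0.0], rtol=1e-8)
```

Because decay acts uniformly on both compartments, the total inventory decays exactly as exp(-lam*t), which makes a convenient correctness check for any compartment solver.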

  14. Channel Access Client Toolbox for Matlab

    International Nuclear Information System (INIS)

    2002-01-01

    This paper reports on the MATLAB Channel Access (MCA) Toolbox, a MATLAB [1] interface to the EPICS Channel Access (CA) client library. We are developing the toolbox for SPEAR3 accelerator controls, but it is of general use for accelerator and experimental physics applications programming. It is packaged as a MATLAB toolbox to allow easy development of complex CA client applications entirely in MATLAB. The benefits include: the ability to calculate and display parameters that use EPICS process variables as inputs, the availability of MATLAB graphics tools for user interface design, and integration with the MATLAB-based accelerator modeling software, Accelerator Toolbox [2-4]. Another purpose of this paper is to propose a feasible path to a synergy between accelerator control systems and accelerator simulation codes, the idea known as the on-line accelerator model.

  15. morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python

    Directory of Open Access Journals (Sweden)

    Michael James Hull

    2014-01-01

    The broad structure of a modelling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem, such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualisation and analysis of results, can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterisable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g. PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g. MODL files) and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development.
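    As a rough illustration of the kind of computation morphforge manages behind its high-level objects (this is a hand-rolled sketch, not morphforge's API, and all parameter values are illustrative), a passive single-compartment membrane responding to a current step can be integrated directly:

```python
import numpy as np

# Passive membrane: C dV/dt = I_inj - g_leak * (V - E_leak)
c_m, g_leak, e_leak = 1.0, 0.1, -60.0   # uF/cm^2, mS/cm^2, mV
dt, n_steps = 0.01, 20000               # ms step, 200 ms total
i_inj = 1.0                             # uA/cm^2, injected from t = 0

v = np.empty(n_steps)
v[0] = e_leak                           # start at rest
for i in range(1, n_steps):
    dvdt = (i_inj - g_leak * (v[i - 1] - e_leak)) / c_m
    v[i] = v[i - 1] + dt * dvdt         # forward-Euler step

# Steady state: V = E_leak + I/g_leak = -50 mV, reached with
# time constant tau = c_m / g_leak = 10 ms.
```

Real tools add morphology, active channels and unit handling on top of exactly this kind of integration loop.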

  16. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  17. The ABRAVIBE toolbox for teaching vibration analysis and structural dynamics

    DEFF Research Database (Denmark)

    Brandt, A.

    2013-01-01

    A MATLAB toolbox (the ABRAVIBE toolbox) has been developed as an accompanying toolbox for the recent book "Noise and Vibration Analysis" by the author. This free, open software, published under the GNU Public License, can be used with GNU Octave, with a few functional limitations, if an entirely free software platform is wanted. The toolbox includes functionality for simulation of mechanical models as well as advanced analysis such as time series analysis, spectral analysis, frequency response and correlation function estimation, modal parameter extraction, and rotating machinery analysis (order tracking)...
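    Frequency response estimation of the kind the toolbox provides can be sketched with SciPy's spectral estimators. This is a generic H1 estimator applied to a known test system, not ABRAVIBE's function names:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 1024.0
x = rng.standard_normal(1 << 16)      # broadband random excitation
b, a = signal.butter(1, 0.2)          # "structure" under test: 1st-order lowpass
y = signal.lfilter(b, a, x)           # measured response

f, pxy = signal.csd(x, y, fs=fs, nperseg=1024)   # cross-spectral density
_, pxx = signal.welch(x, fs=fs, nperseg=1024)    # input auto-spectrum
h1 = pxy / pxx                                    # H1 FRF estimator
```

The H1 estimator (cross-spectrum over input auto-spectrum) is the standard choice when noise sits on the output; at low frequency `abs(h1)` should approach the filter's unity DC gain, and at the Nyquist frequency it should be strongly attenuated.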

  18. User's manual for Ecolego Toolbox and the Discretization Block

    Energy Technology Data Exchange (ETDEWEB)

    Broed, Robert (Facilia consulting AB (Sweden)); Shulan Xu (Swedish Radiation Protection Authority, Stockholm (Sweden))

    2008-03-15

    The CLIMB modelling team (Catchment LInked Models of radiological effects in the Biosphere) was instituted in 2004 to provide SSI with an independent modelling capability for reviewing SKB's assessment of the long-term safety of a geological repository. Modelling in CLIMB covers all aspects of performance assessment (PA), from near-field releases to radiological consequences in the surface environment. The software used to implement the assessment models has been developed within the project. It comprises a toolbox based on the commercial packages Matlab and Simulink, used to solve compartment-based differential equation systems, with an added user-friendly graphical interface. This report documents the new simulation toolbox and a newly developed Discretisation Block, a powerful tool for solving problems involving a network of compartments in two dimensions.

  19. ARTS, the Atmospheric Radiative Transfer Simulator - version 2.2, the planetary toolbox edition

    Science.gov (United States)

    Buehler, Stefan A.; Mendrok, Jana; Eriksson, Patrick; Perrin, Agnès; Larsson, Richard; Lemke, Oliver

    2018-04-01

    This article describes the latest stable release (version 2.2) of the Atmospheric Radiative Transfer Simulator (ARTS), a public domain software for radiative transfer simulations in the thermal spectral range (microwave to infrared). The main feature of this release is a planetary toolbox that allows simulations for the planets Venus, Mars, and Jupiter, in addition to Earth. This required considerable model adaptations, most notably in the area of gaseous absorption calculations. Other new features are also described, notably radio link budgets (including the effect of Faraday rotation that changes the polarization state) and the treatment of Zeeman splitting for oxygen spectral lines. The latter is relevant, for example, for the various operational microwave satellite temperature sensors of the Advanced Microwave Sounding Unit (AMSU) family.

  20. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB

    KAUST Repository

    Klingbeil, G.

    2011-02-25

    Motivation: The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. Results: The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. © The Author 2011. Published by Oxford University Press. All rights reserved.
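    For reference, here is a plain-Python sketch of the Gillespie direct method that STOCHSIMGPU parallelizes. The birth-death reaction system and its rates are illustrative, not from the paper:

```python
import random

def gillespie_birth_death(k1, k2, x0, t_end, seed=1):
    """Direct-method SSA for the system: 0 -> X at rate k1, X -> 0 at rate k2*X."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        a1, a2 = k1, k2 * x          # reaction propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)     # exponentially distributed waiting time
        if t > t_end:
            return x
        # Pick the firing reaction with probability proportional to its propensity.
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 1

# The stationary distribution is Poisson with mean k1/k2 (= 100 here),
# so averaging many independent runs should recover that mean.
samples = [gillespie_birth_death(10.0, 0.1, 0, 100.0, seed=s) for s in range(200)]
mean_x = sum(samples) / len(samples)
```

The GPU versions in the paper run many such independent realizations in parallel, which is why the speedup over a sequential CPU loop is so large.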

  1. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    This paper presents a toolbox for space environment modelling in SIMULINK that facilitates the development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, Earth...

  2. Presentation of the International Building Physics Toolbox for Simulink

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Sasic Kalagasidis, Angela; Nielsen, Toke Rammer

    2003-01-01

    The international building physics toolbox (IBPT) is a software library specially constructed for HAM system analysis in building physics. The toolbox is constructed as a modular structure of the standard building elements using the graphical programming language Simulink. Two research groups have participated in this project. In order to enable the development of the toolbox, a common modelling platform was defined: a set of unique communication signals, a material database and a documentation protocol. The IBPT is open source and publicly available on the Internet. Any researcher and student can use...

  3. Fast modal simulation of paraxial optical systems: the MIST open source toolbox

    International Nuclear Information System (INIS)

    Vajente, Gabriele

    2013-01-01

    This paper presents a new approach to the simulation of optical laser systems in the paraxial approximation, with particular applications to interferometric gravitational wave detectors. The method presented here is based on a standard decomposition of the laser field in terms of Hermite–Gauss transverse modes. The innovative feature consists of a symbolic manipulation of the equations describing the field propagation. This approach allows a huge reduction in the computational time, especially when a large number of higher order modes is needed to properly simulate the system. The new algorithm has been implemented in an open source toolbox, called the MIST, based on the MATLAB® environment. The MIST has been developed and is being used in the framework of the design of advanced gravitational wave detectors. Examples from this field of application will be discussed to illustrate the capabilities and performance of the simulation tool. (paper)
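    The modal decomposition described above expands the field in Hermite-Gauss transverse modes. A minimal numerical sketch of such modes (one-dimensional, at the beam waist, with an illustrative waist size; not MIST's actual code) is:

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

def hg_mode(n, x, w0=1.0):
    """Normalized 1-D Hermite-Gauss mode u_n(x) at the beam waist w0."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    hn = hermval(np.sqrt(2.0) * x / w0, coeffs)   # physicists' Hermite H_n
    norm = (2.0 / np.pi) ** 0.25 / np.sqrt(w0 * 2.0 ** n * math.factorial(n))
    return norm * hn * np.exp(-(x / w0) ** 2)

x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]
u0, u1 = hg_mode(0, x), hg_mode(1, x)
# Orthonormality: <u0|u0> = 1 and <u0|u1> = 0 on this grid.
```

Because the modes are orthonormal, a paraxial field is represented by a vector of mode amplitudes, and propagation through the interferometer reduces to matrix operations on that vector, which is what makes the symbolic approach in the paper fast.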

  4. SELANSI: a toolbox for simulation of stochastic gene regulatory networks.

    Science.gov (United States)

    Pájaro, Manuel; Otero-Muras, Irene; Vázquez, Carlos; Alonso, Antonio A

    2018-03-01

    Gene regulation is inherently stochastic. In many applications in Systems and Synthetic Biology, such as the reverse engineering and the de novo design of genetic circuits, stochastic effects (though potentially crucial) are often neglected due to the high computational cost of stochastic simulations. With advances in these fields there is an increasing need for tools providing accurate approximations of the stochastic dynamics of gene regulatory networks (GRNs) with reduced computational effort. This work presents SELANSI (SEmi-LAgrangian SImulation of GRNs), a software toolbox for the simulation of stochastic multidimensional gene regulatory networks. SELANSI exploits intrinsic structural properties of gene regulatory networks to accurately approximate the corresponding Chemical Master Equation with a partial integro-differential equation that is solved by a semi-Lagrangian method with high efficiency. Networks under consideration may involve multiple genes with self- and cross-regulation, in which genes can be regulated by different transcription factors. Moreover, the validity of the method is not restricted to a particular type of kinetics. The tool offers total flexibility regarding network topology, kinetics and parameterization, as well as simulation options. SELANSI runs under the MATLAB environment and is available under the GPLv3 license at https://sites.google.com/view/selansi. Contact: antonio@iim.csic.es. © The Author(s) 2017. Published by Oxford University Press.

  5. A conceptual toolbox for designing CSCW applications

    DEFF Research Database (Denmark)

    Bødker, Susanne; Christiansen, Ellen

    1995-01-01

    This paper presents a conceptual toolbox, developed to support the design of CSCW applications in a large Esprit project, EuroCODE. Here, several groups of designers work to investigate computer support for cooperative work in large use organizations, at the same time as they work to develop an open development platform for CSCW applications. The conceptual toolbox has been developed to support communication in and among these design groups, between designers and users, and in future use of the open development platform. Rejecting the idea that one may design from a framework describing CSCW, the toolbox aims to support design by doing and to help bridge between work with users, technical design, and insights gained from theoretical and empirical CSCW research.

  6. Broadview Radar Altimetry Toolbox

    Science.gov (United States)

    Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox) which can read all previous and current altimetry missions' data, incorporates now the capability to read the upcoming Sentinel3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the frontend for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the dataformatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields that the user can save for posterior reuse or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, that contains a strong introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy, Hydrology among others. Included are also "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel3 SAR Altimetry Toolbox shall benefit from the current BRAT version. 
While developing the toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific

  7. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    Science.gov (United States)

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.
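
    The kind of network model DynaSim specifies can be illustrated with a minimal leaky integrate-and-fire population. The sketch below is plain Python, not DynaSim syntax (DynaSim models are written as MATLAB specifications), and every parameter value is illustrative.

```python
import random

# Minimal leaky integrate-and-fire (LIF) population, forward-Euler integration.
# This only illustrates the class of model a tool like DynaSim manages; the
# names, units, and parameters here are invented for the example.
def simulate_lif(n=10, t_end=100.0, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0,
                 drive=1.2, seed=0):
    rng = random.Random(seed)
    v = [rng.uniform(0.0, 0.5) for _ in range(n)]   # random initial voltages
    spikes = []                                     # (time, neuron) pairs
    for k in range(int(t_end / dt)):
        t = k * dt
        for i in range(n):
            # dv/dt = (-v + I_drive) / tau
            v[i] += dt * (-v[i] + drive) / tau
            if v[i] >= v_th:                        # threshold crossing
                spikes.append((t, i))
                v[i] = v_reset
    return spikes

spikes = simulate_lif()
```

    With a suprathreshold drive, every unit in the population fires repeatedly; batch exploration over `drive` or `tau` would mirror the parameter sweeps DynaSim automates.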

  8. ESA's Multi-mission Sentinel-1 Toolbox

    Science.gov (United States)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is new open-source software for scientific learning, research and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality, including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM) programme. The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to science and application users in support of ESA's operational SAR mission, and by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing and C-S. The SNAP architecture is well suited to Earth Observation processing and analysis due to the following technological innovations: extensibility, portability, a modular rich client platform, generic EO data abstraction, tiled memory management, and a graph processing framework. The project has developed new tools for working with Sentinel-1 data, in particular the new interferometric TOPSAR mode. TOPSAR complex coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the spectral diversity [4] method has been developed, as well as special azimuth handling in the coherence, interferogram and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembly, TOPSAR deburst and sub-swath merging, terrain-flattening radiometric normalization, and visualization for L2 OCN products. 
The Toolbox also provides several new tools for
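
    The interferometric chain described above rests on two textbook kernels worth stating concretely: the interferogram is the complex conjugate product of two coregistered SLC images, and coherence is its normalized magnitude over a window. The Python sketch below shows these standard formulas only; it is not the Toolbox's operator API.

```python
import cmath
import random

# Interferogram: pixel-wise s1 * conj(s2) of two coregistered complex images.
def interferogram(s1, s2):
    return [a * b.conjugate() for a, b in zip(s1, s2)]

# Coherence: |sum s1*conj(s2)| / sqrt(sum|s1|^2 * sum|s2|^2), in [0, 1].
def coherence(s1, s2):
    num = sum(a * b.conjugate() for a, b in zip(s1, s2))
    den = (sum(abs(a) ** 2 for a in s1) * sum(abs(b) ** 2 for b in s2)) ** 0.5
    return abs(num) / den if den else 0.0

# Synthetic scene: a second acquisition that differs only by a constant
# 0.3 rad phase offset should give coherence 1 and interferogram phase -0.3.
rng = random.Random(1)
scene = [cmath.rect(rng.random() + 0.1, rng.uniform(-3.14, 3.14)) for _ in range(256)]
shifted = [z * cmath.rect(1.0, 0.3) for z in scene]
ifg = interferogram(scene, shifted)
```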

  9. The Self-Powered Detector Simulation 'MATiSSe' Toolbox applied to SPNDs for severe accident monitoring in PWRs

    Science.gov (United States)

    Barbot, Loïc; Villard, Jean-François; Fourrez, Stéphane; Pichon, Laurent; Makil, Hamid

    2018-01-01

    In the framework of the French National Research Agency programme on nuclear safety and radioprotection, the 'DIstributed Sensing for COrium Monitoring and Safety' project aims at developing innovative instrumentation for corium monitoring in case of a severe accident in a Pressurized Water Reactor. Among other developments, a new under-vessel instrumentation based on Self-Powered Neutron Detectors is being developed using a numerical simulation toolbox named 'MATiSSe'. The CEA Instrumentation Sensors and Dosimetry Lab has been developing MATiSSe since 2010 for Self-Powered Neutron Detector material selection and geometry design, as well as for calculating their partial neutron and gamma sensitivities. MATiSSe is based on a comprehensive model of the neutron and gamma interactions that take place in self-powered neutron detector components, using the MCNP6 Monte Carlo code. As a member of the project consortium, the THERMOCOAX SAS company is currently manufacturing instrumented pole prototypes to be tested in 2017. The full severe accident monitoring equipment, including the standalone low-current acquisition system, will be tested during a joint CEA-THERMOCOAX experimental campaign under realistic irradiation conditions in the Slovenian TRIGA Mark II research reactor.

  10. A Module for Graphical Display of Model Results with the CBP Toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-21

    This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts which will provide a basis for further development of the CBP Toolbox.

  11. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB.

    Science.gov (United States)

    Klingbeil, Guido; Erban, Radek; Giles, Mike; Maini, Philip K

    2011-04-15

    The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. The software is open source under the GPL v3 and available at http://www.maths.ox.ac.uk/cmb/STOCHSIMGPU. The web site also contains supplementary information. klingbeil@maths.ox.ac.uk Supplementary data are available at Bioinformatics online.
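
    The sequential algorithm that STOCHSIMGPU parallelizes is Gillespie's direct method: draw an exponential waiting time from the total propensity, then pick a reaction in proportion to its propensity. A minimal Python sketch for a toy reversible isomerization A <-> B (species and rate constants invented for illustration):

```python
import math
import random

def gillespie(a0=100, b0=0, k1=1.0, k2=0.5, t_end=10.0, seed=42):
    """Direct-method SSA for A -> B (rate k1*A) and B -> A (rate k2*B)."""
    rng = random.Random(seed)
    a, b, t = a0, b0, 0.0
    trajectory = [(t, a, b)]
    while t < t_end:
        r1, r2 = k1 * a, k2 * b            # reaction propensities
        rtot = r1 + r2
        if rtot == 0:                      # no reaction can fire
            break
        t += -math.log(rng.random()) / rtot   # exponential waiting time
        if rng.random() * rtot < r1:       # choose reaction by propensity
            a, b = a - 1, b + 1
        else:
            a, b = a + 1, b - 1
        trajectory.append((t, a, b))
    return trajectory

traj = gillespie()
```

    GPU implementations such as STOCHSIMGPU run many independent realizations of exactly this loop in parallel to estimate distributions over trajectories.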

  12. ARC Code TI: X-Plane Communications Toolbox (XPC)

    Data.gov (United States)

    National Aeronautics and Space Administration — The X-Plane Communications Toolbox (XPC) is an open source research tool used to interact with the commercial flight simulator software X-Plane. XPC allows users to...

  13. Wyrm: A Brain-Computer Interface Toolbox in Python.

    Science.gov (United States)

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In recent years Python has gained more and more traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python, and machine learning packages like scikit-learn and data analysis packages like Pandas build on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open-source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, such as an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We explain the key aspects of Wyrm's software architecture and the design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.
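
    A core data-structure operation in a BCI toolbox such as Wyrm is epoching: cutting the continuous multichannel recording into fixed windows around event markers. The sketch below is a plain-Python illustration of that operation only; Wyrm's actual Data objects are NumPy-backed and its function names differ.

```python
# Epoch a continuous recording around event markers.
# data: one tuple of channel values per sample; markers: sample indices.
# Windows that would run past either end of the recording are dropped.
def epoch(data, markers, pre, post):
    epochs = []
    for m in markers:
        if m - pre >= 0 and m + post <= len(data):
            epochs.append(data[m - pre:m + post])
    return epochs

data = [(i, -i) for i in range(1000)]     # 2 channels, 1000 samples (synthetic)
markers = [50, 500, 990]                  # last marker is too close to the end
epochs = epoch(data, markers, pre=10, post=20)
```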

  14. Toolbox for Urban Mobility Simulation: High Resolution Population Dynamics for Global Cities

    Science.gov (United States)

    Bhaduri, B. L.; Lu, W.; Liu, C.; Thakur, G.; Karthik, R.

    2015-12-01

    In this rapidly urbanizing world, an unprecedented rate of population growth is not only mirrored by increasing demand for energy, food, water, and other natural resources, but also has detrimental impacts on environmental and human security. Transportation simulations are frequently used for mobility assessment in urban planning, traffic operation, and emergency management. Previous research, from purely analytical techniques to simulations capturing behavior, has investigated questions and scenarios regarding the relationships among energy, emissions, air quality, and transportation. The primary limitations of past attempts have been the availability of input data, useful "energy and behavior focused" models, validation data, and the computational capability needed to adequately understand the interdependencies of our transportation system. With the increasing availability and quality of traditional and crowdsourced data, we have utilized the OpenStreetMap road network and integrated high-resolution population data with traffic simulation to create a Toolbox for Urban Mobility Simulations (TUMS) at global scale. TUMS consists of three major components: data processing, traffic simulation models, and Internet-based visualizations. It integrates OpenStreetMap, LandScanTM population, and other open data (Census Transportation Planning Products, National Household Travel Survey, etc.) to generate both normal traffic operation and emergency evacuation scenarios. TUMS integrates TRANSIMS and MITSIM as traffic simulation engines, which are open-source and widely accepted for scalable traffic simulations. A consistent data and simulation platform allows quick adaptation to various geographic areas, which has been demonstrated for multiple cities across the world. We are combining the strengths of geospatial data sciences, high performance simulations, transportation planning, and emissions, vehicle and energy technology development to design and develop a simulation

  15. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    Science.gov (United States)

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  16. The ROC Toolbox: A toolbox for analyzing receiver-operating characteristics derived from confidence ratings.

    Science.gov (United States)

    Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P

    2017-08-01

    Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal detection models to ROC data obtained from confidence rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimal routines (e.g., Maximum Likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC toolbox allows for various different confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal variance signal detection (UVSD) model, (2) the dual process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the ROC toolbox plots summary information about the best-fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion of features that can be added to the toolbox.
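
    The toolbox's input, ROC data from confidence ratings, can be made concrete: cumulating response counts from the most confident "old" rating downward yields paired false-alarm/hit rates. A Python sketch with an invented 6-point rating scale (the counts are illustrative, not real data):

```python
# Build empirical ROC points from confidence-rating counts, ordered from the
# highest-confidence "old" response to the highest-confidence "new" response.
def roc_points(target_counts, lure_counts):
    n_t, n_l = sum(target_counts), sum(lure_counts)
    hits = fas = 0
    points = []
    for t, l in zip(target_counts, lure_counts):
        hits += t
        fas += l
        points.append((fas / n_l, hits / n_t))   # (false-alarm rate, hit rate)
    return points

# Hypothetical 6-point confidence data: targets skew toward "sure old".
targets = [60, 20, 10, 5, 3, 2]
lures = [5, 10, 15, 20, 25, 25]
roc = roc_points(targets, lures)
```

    Models such as UVSD or DPSD are then fit to points like these by maximum likelihood; the empirical curve above always ends at (1, 1) and is monotonically non-decreasing by construction.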

  17. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
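
    The interconnected processes the toolbox teaches can be sketched as one conceptual HBV-style time step: partition precipitation into rain or snow by temperature, melt the snowpack with a degree-day factor, and drain soil storage through a linear reservoir. The parameter values below are illustrative, not HBV-Ensemble defaults.

```python
# One conceptual HBV-style step. state = (snowpack, soil storage), all in mm.
# ddf: degree-day melt factor (mm/C/step); k: linear-reservoir outflow fraction.
def hbv_step(state, precip, temp, ddf=3.0, t_thresh=0.0, k=0.2):
    snow, soil = state
    rain = precip if temp > t_thresh else 0.0
    snow += precip - rain                                  # snowfall accumulates
    melt = min(snow, max(0.0, ddf * (temp - t_thresh)))    # degree-day melt
    snow -= melt
    soil += rain + melt
    runoff = k * soil                                      # linear reservoir
    soil -= runoff
    return (snow, soil), runoff

state = (0.0, 10.0)                                        # start: 10 mm in soil
forcing = [(5.0, -2.0), (0.0, -5.0), (3.0, 2.0), (0.0, 8.0)]  # (precip, temp)
flows = []
for p, t in forcing:
    state, q = hbv_step(state, p, t)
    flows.append(q)
```

    Running many such models with perturbed parameters is exactly the ensemble scheme the toolbox uses to visualize predictive uncertainty; note the step conserves mass (initial storage plus precipitation equals final storage plus total runoff).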

  18. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, III, F. G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a "Toolbox" that is made available to members of the partnership as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop a GoldSim interface to the STADIUM® code developed by SIMCO Technologies, Inc. and to LeachXS/ORCHESTRA, developed by the Energy Research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models that take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended, realistic simulations in a reasonable amount of time; 2) the codes produce reasonable results, with the code developers providing validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that

  19. The Linear Time Frequency Analysis Toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Torrésani, Bruno; Balazs, Peter

    2011-01-01

    The Linear Time Frequency Analysis Toolbox is a Matlab/Octave toolbox for computational time-frequency analysis. It is intended both as an educational and a computational tool. The toolbox provides the basic Gabor, Wilson and MDCT transforms along with routines for constructing windows (filter prototypes) and routines for manipulating coefficients. It also provides a collection of demo scripts devoted either to demonstrating the main functions of the toolbox, or to exemplifying their use in specific signal processing applications. In this paper we describe the algorithms used and their mathematical background
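
    The Gabor transform at the heart of the toolbox can be written directly (if slowly) as inner products of the signal with time-shifted, frequency-modulated copies of a window; LTFAT computes the same coefficients with fast factorizations. A brute-force Python sketch with an illustrative Gaussian window and lattice (time shift `a`, `M` frequency channels):

```python
import cmath
import math

# Direct (O(M*N*L)) discrete Gabor transform of a length-L signal f:
# c[m][n] = sum_l f(l) g(l - n*a mod L) exp(-2*pi*i*m*l/M).
def dgt(f, a, M):
    L = len(f)
    N = L // a
    # Gaussian window centered in the array; width chosen for illustration.
    g = [math.exp(-math.pi * ((l - L / 2) / (L / 8)) ** 2) for l in range(L)]
    return [[sum(f[l] * g[(l - n * a) % L] *
                 cmath.exp(-2j * math.pi * m * l / M)
                 for l in range(L))
             for n in range(N)] for m in range(M)]

L, a, M = 64, 8, 8
f = [math.cos(2 * math.pi * 8 * l / L) for l in range(L)]  # tone in DFT bin 8
c = dgt(f, a, M)
```

    For a real sinusoid in DFT bin 8 with M = 8 channels, the energy concentrates symmetrically in channels 1 and M-1, which is a quick sanity check on the lattice bookkeeping.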

  20. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues regarding the use of software in Department of Energy (DOE) facilities for analyzing hazards and for designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or "toolbox", of multiple-site-use, standard-solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled safety analysis codes recognized for DOE-broad safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue using their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed "toolbox-equivalent". The process is based on the model established to meet IP Commitment 4.2.1.2: establish SQA criteria for the safety analysis "toolbox" codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  1. DACE - A Matlab Kriging Toolbox

    DEFF Research Database (Denmark)

    2002-01-01

    DACE, Design and Analysis of Computer Experiments, is a Matlab toolbox for working with kriging approximations to computer models.
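
    The core of a kriging approximation is easy to sketch: a correlation model over the sample sites, weights from one linear solve, and a predictor that interpolates the data exactly. The Python sketch below does zero-mean (simple-kriging-style) interpolation in 1-D with a Gaussian correlation model; `theta` and the sample data are illustrative, and DACE itself additionally fits a regression trend and tunes the correlation parameters by maximum likelihood.

```python
import math

# Tiny Gauss-Jordan solver with partial pivoting, to stay dependency-free.
def solve(A, b):
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Gaussian correlation model, the family DACE's corrgauss uses.
def corr(x1, x2, theta):
    return math.exp(-theta * (x1 - x2) ** 2)

def krige(xs, ys, theta=2.0):
    R = [[corr(a, b, theta) for b in xs] for a in xs]   # correlation matrix
    w = solve(R, ys)                                    # weights: R w = y
    return lambda x: sum(wi * corr(x, xi, theta) for wi, xi in zip(w, xs))

xs = [0.0, 1.0, 2.5, 4.0]       # sample sites (illustrative)
ys = [0.0, 1.2, 0.3, -0.7]      # computer-model responses (illustrative)
predict = krige(xs, ys)
```

    Because the predictor at a sample site reproduces a row of R times the weights, kriging interpolates the data exactly, and far from the data the prediction decays toward the assumed zero mean.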

  2. Object-oriented Matlab adaptive optics toolbox

    Science.gov (United States)

    Conan, R.; Correia, C.

    2014-08-01

    Object-Oriented Matlab Adaptive Optics (OOMAO) is a Matlab toolbox dedicated to Adaptive Optics (AO) systems. OOMAO is based on a small set of classes representing the source, atmosphere, telescope, wavefront sensor, Deformable Mirror (DM) and imager of an AO system. This simple set of classes allows simulating Natural Guide Star (NGS) and Laser Guide Star (LGS) Single Conjugate AO (SCAO) and tomography AO systems on telescopes up to the size of the Extremely Large Telescopes (ELT). The discrete phase screens that make up the atmosphere model can be of infinite size, which is useful for modeling system performance on large time scales. OOMAO comes with its own parametric influence-function model to emulate different types of DMs. The cone effect, altitude thickness and intensity profile of LGSs are also reproduced. Both modal and zonal modeling approaches are implemented. OOMAO also has an extensive library of theoretical expressions to evaluate the statistical properties of turbulent wavefronts. The main design characteristics of the OOMAO toolbox are object-oriented modularity, vectorized code and transparent parallel computing. OOMAO has been used to simulate and to design the Multi-Object AO prototype Raven at the Subaru telescope and the Laser Tomography AO system of the Giant Magellan Telescope. In this paper, a Laser Tomography AO system on an ELT is simulated with OOMAO. In the first part, we set up the class parameters and link the instantiated objects to create the source optical path. Then we build the tomographic reconstructor and write the script for the pseudo-open-loop controller.
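
    OOMAO's main design idea, an optical path assembled from small objects that each transform a wavefront, can be echoed in a few lines of Python. The class names below mirror OOMAO's concepts, but the physics is a placeholder (one ideal closed-loop step with unit gain and a noiseless sensor), not the toolbox's API.

```python
# Toy optical path: source -> atmosphere -> deformable mirror -> sensor.
# The "wavefront" is just a list of phase samples.
class Source:
    def __init__(self):
        self.phase = [0.0] * 8              # flat incoming wavefront

class Atmosphere:
    def __init__(self, screen):
        self.screen = screen                # frozen phase screen (radians)
    def propagate(self, phase):
        return [p + s for p, s in zip(phase, self.screen)]

class DeformableMirror:
    def __init__(self, n):
        self.commands = [0.0] * n
    def propagate(self, phase):
        return [p - c for p, c in zip(phase, self.commands)]

class WavefrontSensor:
    def measure(self, phase):
        return list(phase)                  # ideal sensor reads phase directly

screen = [0.3, -0.1, 0.2, 0.0, -0.4, 0.1, 0.25, -0.05]
atm, dm, wfs = Atmosphere(screen), DeformableMirror(8), WavefrontSensor()

# One closed-loop step: sense the residual, apply it as DM commands (gain 1),
# then propagate again -- the residual should vanish for a frozen screen.
phase = dm.propagate(atm.propagate(Source().phase))
dm.commands = wfs.measure(phase)
residual = dm.propagate(atm.propagate(Source().phase))
```

    The value of the pattern is the same as in OOMAO: each stage is swappable (a realistic sensor, an influence-function DM, an evolving atmosphere) without touching the rest of the path.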

  3. The GMT/MATLAB Toolbox

    Science.gov (United States)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  4. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    Science.gov (United States)

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) standards. Interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (XSL) file. Using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically in different application areas: the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymization of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
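
    The three-stage architecture (receiver, correction pipeline, senders) is easy to sketch with plain dictionaries standing in for DICOM datasets. The tag names below follow standard DICOM keywords, but the plug-in API shown is illustrative, not the toolbox's Java interface.

```python
# Correction plug-ins: each takes a dataset (here a dict of DICOM keywords)
# and returns the corrected dataset.
def fix_missing_modality(ds):
    ds.setdefault("Modality", "OT")        # fall back to "other" if absent
    return ds

def pseudonymize(ds):
    ds["PatientName"] = "ANON"             # replace identifying attributes
    ds.pop("PatientBirthDate", None)
    return ds

class Router:
    """Receiver -> plug-in pipeline -> one or more senders."""
    def __init__(self, plugins, senders):
        self.plugins, self.senders = plugins, senders
    def receive(self, ds):
        for plugin in self.plugins:        # run the correction pipeline
            ds = plugin(ds)
        for sender in self.senders:        # forward to every addressee
            sender(ds)

outbox = []                                # stand-in for a DICOM storage SCP
router = Router([fix_missing_modality, pseudonymize], [outbox.append])
router.receive({"PatientName": "Doe^John", "PatientBirthDate": "19600101"})
```

    The design point is the same as the toolbox's: corrections are composable and order-dependent, and new destinations are added by registering another sender rather than changing the pipeline.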

  5. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in geodesy, oceanography and solid Earth physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it added the capability to compute the Simple Bouguer Anomaly (solid Earth). This fall a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in geodesy, oceanography and solid Earth studies. Accordingly, GUT version 3 has:
 - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies,
 - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
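
    The Simple Bouguer Anomaly computation mentioned above amounts to subtracting the attraction of an infinite slab, 2*pi*G*rho*h, from the free-air anomaly. A minimal Python sketch with the conventional reduction density of 2670 kg/m^3 (the sample anomaly and height are invented for illustration):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO = 2670.0         # conventional crustal reduction density, kg/m^3
MGAL = 1e-5          # 1 milligal in m/s^2

def simple_bouguer(free_air_mgal, height_m, rho=RHO):
    """Simple Bouguer anomaly: free-air anomaly minus the infinite-slab term."""
    slab_mgal = 2 * math.pi * G * rho * height_m / MGAL
    return free_air_mgal - slab_mgal

# Illustrative station: 25 mGal free-air anomaly at 500 m elevation.
sba = simple_bouguer(free_air_mgal=25.0, height_m=500.0)
```

    With the standard density, the slab correction works out to about 0.112 mGal per metre of elevation, which is a convenient sanity check on any implementation.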

  6. A DOE Computer Code Toolbox: Issues and Opportunities

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2001-01-01

    The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox shall be a DOE Complex repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address Software Quality Assurance issues. Toolbox candidate codes have been identified through review of a DOE Survey of Software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Agency/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues exist such as resource allocation and the interface among code developers, code users, and toolbox maintainers, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications

  7. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    Science.gov (United States)

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  8. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    Science.gov (United States)

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  9. Reprocessing Close Range Terrestrial and Uav Photogrammetric Projects with the Dbat Toolbox for Independent Verification and Quality Control

    Science.gov (United States)

    Murtiyoso, A.; Grussenmeyer, P.; Börlin, N.

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore, with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen increasing use, owing to the ease of acquiring aerial close range images. In terms of photogrammetric processing, many commercial software solutions on the market offer results through user-friendly environments. However, most commercial solutions take a black-box approach to photogrammetric calculations. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT), developed for Matlab, was used to reprocess photogrammetric projects that had been processed using the commercial software Agisoft Photoscan. Several scenarios were tested to assess the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects under several self-calibration configurations. Results show that DBAT managed to reprocess Photoscan (PS) projects and generate metrics which can be useful for project verification.
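
Independent verification of a bundle adjustment ultimately rests on recomputing reprojection residuals from the estimated parameters. A minimal sketch of that metric, assuming an ideal pinhole camera with hypothetical focal length and principal point (this illustrates the principle only, not DBAT's damped bundle adjustment):

```python
import math

def project(point, f, cx, cy):
    """Project a 3D point in camera coordinates with an ideal pinhole model."""
    X, Y, Z = point
    return (f * X / Z + cx, f * Y / Z + cy)

def rms_reprojection_error(points_3d, observations, f, cx, cy):
    """RMS distance (pixels) between projected points and measured image points."""
    sq = []
    for p, (u, v) in zip(points_3d, observations):
        up, vp = project(p, f, cx, cy)
        sq.append((up - u) ** 2 + (vp - v) ** 2)
    return math.sqrt(sum(sq) / len(sq))

# Toy check: observations generated by the same model give zero residual.
pts = [(0.1, 0.2, 2.0), (-0.3, 0.1, 4.0)]
obs = [project(p, 1000.0, 960.0, 540.0) for p in pts]
err = rms_reprojection_error(pts, obs, 1000.0, 960.0, 540.0)
```

In a real verification workflow the observations come from the image measurements and the parameters from the adjustment, so a small, stable RMS value is the quantity of interest.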

  10. REPROCESSING CLOSE RANGE TERRESTRIAL AND UAV PHOTOGRAMMETRIC PROJECTS WITH THE DBAT TOOLBOX FOR INDEPENDENT VERIFICATION AND QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    A. Murtiyoso

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore, with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen increasing use, owing to the ease of acquiring aerial close range images. In terms of photogrammetric processing, many commercial software solutions on the market offer results through user-friendly environments. However, most commercial solutions take a black-box approach to photogrammetric calculations. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT), developed for Matlab, was used to reprocess photogrammetric projects that had been processed using the commercial software Agisoft Photoscan. Several scenarios were tested to assess the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects under several self-calibration configurations. Results show that DBAT managed to reprocess Photoscan (PS) projects and generate metrics which can be useful for project verification.

  11. C++ Toolbox for Object-Oriented Modeling and Dynamic Simulation of Physical Systems

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1999-01-01

    This paper presents the efforts made in an ongoing project that exploits the advantages of using object-oriented methodologies for describing and simulating dynamical systems. The background for this work is a search for new and better ways to simulate physical systems.
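
The object-oriented idea behind the paper, building complete systems from components that each encapsulate their own state and dynamics, can be illustrated with a minimal sketch (hypothetical class names; explicit Euler is used only for brevity):

```python
class Component:
    """Base class: a component owns its state and knows its time derivative."""
    def state(self): raise NotImplementedError
    def derivative(self): raise NotImplementedError
    def set_state(self, x): raise NotImplementedError

class ExponentialDecay(Component):
    """A minimal dynamical subsystem: dx/dt = -k * x."""
    def __init__(self, k, x0):
        self.k, self.x = k, x0
    def state(self): return self.x
    def derivative(self): return -self.k * self.x
    def set_state(self, x): self.x = x

def euler_step(components, dt):
    """Advance every component one explicit Euler step."""
    for c in components:
        c.set_state(c.state() + dt * c.derivative())

# A 'system' is just a collection of components advanced together.
sys_parts = [ExponentialDecay(1.0, 1.0), ExponentialDecay(0.5, 2.0)]
for _ in range(1000):
    euler_step(sys_parts, 0.001)
# after t = 1 s: states approach exp(-1) ≈ 0.368 and 2*exp(-0.5) ≈ 1.213
```

A production toolbox would of course separate the solver from the model and use error-controlled integration; the point here is only the component interface.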

  12. SHynergie: Development of a virtual project laboratory for monitoring hydraulic stimulations

    Science.gov (United States)

    Renner, Jörg; Friederich, Wolfgang; Meschke, Günther; Müller, Thomas; Steeb, Holger

    2016-04-01

    Hydraulic stimulations are the primary means of developing subsurface reservoirs regarding the extent of fluid transport in them. The associated creation or conditioning of a system of hydraulic conduits involves a range of hydraulic and mechanical processes, but chemical reactions, such as dissolution and precipitation, may also affect the stimulation result on time scales as short as hours. In light of the extent and complexity of these processes, the steering potential for the operator of a stimulation critically depends on the ability to integrate the maximum amount of site-specific information with profound process understanding and a large spectrum of experience. We report on the development of a virtual project laboratory for monitoring hydraulic stimulations within the project SHynergie (http://www.ruhr-uni-bochum.de/shynergie/). The laboratory is envisioned as a preparatory and accompanying instrument rather than a post-processing one, ultimately accessible over a web repository to the persons responsible for a project. The virtual laboratory consists of a data base, a toolbox, and a model-building environment. Entries in the data base are of two categories. On the one hand, selected mineral and rock properties are provided from the literature. On the other hand, project-specific entries of any format can be made, which are assigned attributes regarding their use in the stimulation problem at hand. The toolbox is interactive and allows the user to perform calculations of effective properties and simulations of different types (e.g., wave propagation in a reservoir, hydraulic tests). The model component is also hybrid. The laboratory provides a library of models reflecting a range of scenarios but also allows the user to develop a site-specific model constituting the basis for simulations. The laboratory offers the option to use its components following the typical workflow of a stimulation project. The toolbox incorporates simulation
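
As an example of the kind of effective-property calculation such a toolbox performs, the classical result for a layered medium, a thickness-weighted arithmetic mean of permeabilities for flow parallel to layering and a harmonic mean for flow across it, is easy to sketch (an illustration only; the values below are hypothetical):

```python
def effective_permeability(thicknesses, permeabilities):
    """Effective permeability of a stack of layers (SI units, m^2).

    Flow parallel to layering: thickness-weighted arithmetic mean.
    Flow perpendicular to layering: thickness-weighted harmonic mean.
    """
    H = sum(thicknesses)
    k_parallel = sum(h * k for h, k in zip(thicknesses, permeabilities)) / H
    k_perpendicular = H / sum(h / k for h, k in zip(thicknesses, permeabilities))
    return k_parallel, k_perpendicular

# Two equally thick layers differing by two orders of magnitude:
k_par, k_perp = effective_permeability([1.0, 1.0], [1e-12, 1e-14])
# the perpendicular value is dominated by the tight layer
```

The strong anisotropy of the result (the tight layer controls cross-layer flow) is exactly the kind of insight an operator needs before a stimulation.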

  13. The Life Cycle Analysis Toolbox

    International Nuclear Information System (INIS)

    Bishop, L.; Tonn, B.E.; Williams, K.A.; Yerace, P.; Yuracko, K.L.

    1999-01-01

    The life cycle analysis toolbox is a valuable integration of decision-making tools and supporting materials developed by Oak Ridge National Laboratory (ORNL) to help Department of Energy managers improve environmental quality, reduce costs, and minimize risk. The toolbox provides decision-makers access to a wide variety of proven tools for pollution prevention (P2) and waste minimization (WMin), as well as ORNL expertise to select from this toolbox exactly the right tool to solve any given P2/WMin problem. The central element of the toolbox is a multiple criteria approach to life cycle analysis developed specifically to aid P2/WMin decision-making. ORNL has developed numerous tools that support this life cycle analysis approach. Tools are available to help model P2/WMin processes, estimate human health risks, estimate costs, and represent and manipulate uncertainties. Tools are available to help document P2/WMin decision-making and implement programs. Tools are also available to help track potential future environmental regulations that could impact P2/WMin programs and current regulations that must be followed. An Internet site will provide broad access to the tools.

  14. Online model evaluation of large-eddy simulations covering Germany with a horizontal resolution of 156 m

    Science.gov (United States)

    Hansen, Akio; Ament, Felix; Lammert, Andrea

    2017-04-01

    Large-eddy simulations have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial and boundary conditions. Within the High Definition Clouds and Precipitation for advancing Climate Prediction (HD(CP)2) project, realistic, weather-forecast-like LES simulations were performed with the newly developed ICON LES model for several days. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, so a single 3D dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously whether the model simulations are realistic. The toolbox automatically combines model results with observations and generates quicklooks for various variables. So far temperature and humidity profiles, cloud cover, integrated water vapour, precipitation and many more are included. All kinds of observations, such as aircraft observations, soundings or precipitation radar networks, are used. For each dataset a specific module is created, which allows for easy handling and extension of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool supports scientists in monitoring computationally costly model simulations and gives a first overview of the model's performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, for which example results are shown.
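
The storage figures quoted above can be checked with simple arithmetic (the per-field interpretation below assumes single-precision storage and is an inference, not a number from the abstract):

```python
cells = 3e9                          # more than 3 billion grid cells
dump_bytes = 500e9                   # one 3D dump, roughly 500 GB
bytes_per_cell = dump_bytes / cells  # ~167 bytes per grid cell
fields_float32 = bytes_per_cell / 4  # would correspond to ~40 float32 3D fields
```

Such back-of-envelope checks explain why an *online* evaluation toolbox is necessary: dumps of this size cannot realistically be archived and inspected after the fact.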

  15. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans, which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communications in Medicine) based toolbox, developed for the evaluation and verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore, the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.
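
A toy version of such an independent plan comparison is a point-wise dose-difference pass rate (a deliberately simplified stand-in for the gamma-index style analyses used in practice; the tolerance and dose values below are hypothetical):

```python
def dose_difference_pass_rate(ref_dose, eval_dose, tolerance):
    """Fraction of points where two dose grids agree within `tolerance` (Gy)."""
    passed = sum(1 for a, b in zip(ref_dose, eval_dose) if abs(a - b) <= tolerance)
    return passed / len(ref_dose)

# Treatment planning system dose vs. independently recalculated dose (Gy):
tps_dose   = [2.00, 1.95, 1.50, 0.80]
recal_dose = [2.02, 1.90, 1.60, 0.81]
rate = dose_difference_pass_rate(tps_dose, recal_dose, 0.06)  # 3% of a 2 Gy fraction
```

A clinically meaningful comparison would additionally account for distance-to-agreement and dose gradients, which is precisely what motivates exporting the set-up to an independent Monte Carlo recalculation.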

  16. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    Science.gov (United States)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged it is often not among the most popular tasks in ocean modelling. In order to ease the validation work a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process - ranging from read-in procedures of datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned for each validation task in user specific settings, which are externally stored in so-called namelists and gather all information of the used datasets as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. Hereby the performance of any new product
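
The statistical metrics such a toolbox computes for a model-observation pair typically start from bias, RMSE and correlation; a self-contained sketch with hypothetical sample values:

```python
import math

def validation_metrics(model, obs):
    """Basic point-to-point metrics: bias, RMSE and Pearson correlation."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm = sum(model) / n
    mo = sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    var_m = sum((m - mm) ** 2 for m in model)
    var_o = sum((o - mo) ** 2 for o in obs)
    corr = cov / math.sqrt(var_m * var_o)
    return bias, rmse, corr

# Model time series vs. station observations (hypothetical values):
bias, rmse, corr = validation_metrics([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
```

In the toolbox described above, a function of this kind would sit in the statistics module and be driven by the namelist settings rather than called directly.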

  17. Tolkku - a toolbox for decision support from condition monitoring data

    International Nuclear Information System (INIS)

    Saarela, Olli; Lehtonen, Mikko; Halme, Jari; Aikala, Antti; Raivio, Kimmo

    2012-01-01

    This paper describes a software toolbox (a software library) designed for condition monitoring and diagnosis of machines. This toolbox implements both new methods and prior art and is aimed at practical, down-to-earth data analysis work. The target is to improve knowledge of the operation and behaviour of machines and processes throughout their entire life-cycles. The toolbox supports different phases of condition based maintenance with tools that extract essential information and automate data processing. The paper discusses principles that have guided toolbox design and the implemented toolbox structure. Case examples are used to illustrate how condition monitoring applications can be built using the toolbox. In the first case study the toolbox is applied to fault detection of industrial centrifuges based on measured electrical current. The second case study outlines an application for centralized monitoring of a fleet of machines that supports organizational learning.
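
One of the simplest features such a toolbox can extract from a measured current signal is its RMS level, compared against a healthy baseline. A schematic sketch of that detection rule (the threshold factor and signals are hypothetical, not from the paper's centrifuge case):

```python
import math

def rms(signal):
    """Root-mean-square level, a standard condition-monitoring feature."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def exceeds_baseline(signal, baseline_rms, factor=1.5):
    """Flag a signal whose RMS rises well above the healthy baseline."""
    return rms(signal) > factor * baseline_rms

# 50 Hz current sampled at 1 kHz; the 'faulty' signal draws twice the current.
healthy = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(1000)]
faulty = [2.0 * x for x in healthy]
```

Real diagnosis pipelines add many more features (spectral peaks, envelope statistics) and trend them over the machine's life cycle; RMS-versus-baseline is only the first rung.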

  18. Air Sensor Toolbox

    Science.gov (United States)

    Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.

  19. MBEToolbox: a Matlab toolbox for sequence data analysis in molecular biology and evolution

    Directory of Open Access Journals (Sweden)

    Xia Xuhua

    2005-03-01

    Background: MATLAB is a high-performance language for technical computing, integrating computation, visualization, and programming in an easy-to-use environment. It has been widely used in many areas, such as mathematics and computation, algorithm development, data acquisition, modeling, simulation, and scientific and engineering graphics. However, few functions are freely available in MATLAB to perform the sequence data analyses specifically required for molecular biology and evolution. Results: We have developed a MATLAB toolbox, called MBEToolbox, aimed at filling this gap by offering efficient implementations of the most needed functions in molecular biology and evolution. It can be used to manipulate aligned sequences, calculate evolutionary distances, estimate synonymous and nonsynonymous substitution rates, and infer phylogenetic trees. Moreover, it provides an extensible, functional framework for users with more specialized requirements to explore and analyze aligned nucleotide or protein sequences from an evolutionary perspective. The full functions in the toolbox are accessible through the command line for seasoned MATLAB users. A graphical user interface, which may be especially useful for non-specialist end users, is also provided. Conclusion: MBEToolbox is a useful tool that can aid in the exploration, interpretation and visualization of data in molecular biology and evolution. The software is publicly available at http://web.hku.hk/~jamescai/mbetoolbox/ and http://bioinformatics.org/project/?group_id=454.
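
One of the evolutionary distances mentioned above, the Jukes-Cantor (JC69) distance, is compact enough to sketch directly (toy sequences; MBEToolbox itself implements many more substitution models):

```python
import math

def jukes_cantor_distance(seq1, seq2):
    """JC69 evolutionary distance from the proportion p of differing sites:
    d = -(3/4) * ln(1 - 4p/3)."""
    assert len(seq1) == len(seq2), "sequences must be aligned"
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    return -0.75 * math.log(1 - 4.0 * p / 3.0)

# One mismatch in ten aligned sites (p = 0.1):
d = jukes_cantor_distance("ACGTACGTAC", "ACGTACGAAC")
```

The correction term matters because d grows faster than p: multiple substitutions at the same site are invisible in a pairwise comparison.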

  20. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    Science.gov (United States)

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
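
The vector-rotation step mentioned above, resolving measured east/north velocities into along- and across-section components, is worth making explicit (the sign and angle conventions here are assumptions for illustration; VMT defines its own):

```python
import math

def rotate_to_section(v_east, v_north, section_azimuth_deg):
    """Decompose an (east, north) velocity into components along and across
    a transect whose azimuth is measured clockwise from north."""
    a = math.radians(section_azimuth_deg)
    along = v_east * math.sin(a) + v_north * math.cos(a)
    across = v_east * math.cos(a) - v_north * math.sin(a)
    return along, across

# A purely northward current on a north-oriented transect is all 'along':
along, across = rotate_to_section(0.0, 1.0, 0.0)
```

Applying this rotation to every ensemble and bin is what turns raw ADCP output into the planform and cross-section vector maps the toolbox visualizes.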

  1. ESA Atmospheric Toolbox

    Science.gov (United States)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and
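
As an example of the quantity conversions mentioned above (number density to volume mixing ratio), the ideal-gas relation n_air = p / (k_B * T) suffices; the ozone number below is a hypothetical round value, not a HARP result:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def number_density_to_vmr(n_species, pressure_pa, temperature_k):
    """Convert a species number density (molecules m^-3) to a volume mixing
    ratio using the ideal-gas air number density n_air = p / (k_B * T)."""
    n_air = pressure_pa / (K_B * temperature_k)
    return n_species / n_air

# Roughly 1e18 molecules m^-3 of ozone at 1013.25 hPa and 288 K:
vmr = number_density_to_vmr(1.0e18, 101325.0, 288.0)  # a few tens of ppb
```

Chaining many such conversions, plus regridding and averaging-kernel smoothing, is exactly what brings two heterogeneous datasets onto a common footing for inter-comparison.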

  2. Sentinel-3 SAR Altimetry Toolbox

    Science.gov (United States)

    Benveniste, Jerome; Lucas, Bruno; DInardo, Salvatore

    2015-04-01

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on a proven heritage of ERS-2 and Envisat, and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300m in SAR mode along track. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the two Sentinels is expected to be launched in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox) which can read all previous and current altimetry mission's data, but it does not have the capabilities to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats, the BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. 
BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth

  3. ICT: isotope correction toolbox.

    Science.gov (United States)

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique to analyze and study the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences, in particular for complex isotopic systems, is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present isotope correction toolbox, a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. Isotope correction toolbox is written in the multi-platform programming language Perl and, therefore, can be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from https://github.com/jungreuc/isotope_correction_toolbox/. Supplementary data are available at Bioinformatics online.
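
The natural-abundance correction can be illustrated for the simplest single-element case: the observed mass isotopomer distribution is the true one convolved with a binomial contribution from naturally occurring 13C, and the correction is a triangular solve. A minimal Python sketch of the principle (ignoring derivatization and the tandem-MS positional data that ICT actually handles; ICT itself is written in Perl):

```python
from math import comb

def correction_matrix(n_atoms, p13c=0.0107):
    """Lower-triangular matrix mapping true mass isotopomer fractions to the
    observed ones, from natural 13C abundance in the unlabeled carbons."""
    size = n_atoms + 1
    C = [[0.0] * size for _ in range(size)]
    for j in range(size):             # true state M+j has n_atoms - j 12C slots
        for i in range(j, size):      # observed at M+i with binomial probability
            k = i - j
            C[i][j] = comb(n_atoms - j, k) * p13c**k * (1 - p13c)**(n_atoms - j - k)
    return C

def correct(observed, n_atoms, p13c=0.0107):
    """Invert the triangular system by forward substitution."""
    C = correction_matrix(n_atoms, p13c)
    true = [0.0] * len(observed)
    for i in range(len(observed)):
        true[i] = (observed[i] - sum(C[i][j] * true[j] for j in range(i))) / C[i][i]
    return true

# Round trip for a 2-carbon fragment: convolve a known distribution, recover it.
truth = [0.7, 0.2, 0.1]
C = correction_matrix(2)
obs = [sum(C[i][j] * truth[j] for j in range(3)) for i in range(3)]
recovered = correct(obs, 2)
```

The tandem-MS case the paper addresses couples such systems across precursor and product ions, which is what makes the real correction substantially harder than this sketch.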

  4. ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes

    Science.gov (United States)

    Benveniste, J.; Ambrozio, A.; Restano, M.

    2016-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial is included providing plenty of use cases. BRAT's future release (4.0.0) is planned for September 2016. Based on the community feedback, the frontend has been further improved and simplified whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain desired data bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields, that can be saved for future uses, either by using embedded formulas including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and the analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as for solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, the anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE
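
One of the anomaly computations mentioned above, the Bouguer correction, reduces in its simplest slab form to the closed expression 2*pi*G*rho*h; a quick sketch (the standard reduction density of 2670 kg/m^3 is assumed, and this is the infinite-slab approximation, not GUT's full computation):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def bouguer_slab_correction(height_m, density=2670.0):
    """Gravity effect of an infinite horizontal slab, 2*pi*G*rho*h, in mGal."""
    return 2 * math.pi * G * density * height_m * 1e5  # 1 m/s^2 = 1e5 mGal

slab_mgal = bouguer_slab_correction(100.0)  # a 100 m slab gives ~11.2 mGal
```

The familiar rule of thumb of about 0.112 mGal per metre of topography falls straight out of this formula.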

  5. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox

    Directory of Open Access Journals (Sweden)

    Andre Santos Ribeiro

    2015-07-01

    Aim: In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox has been able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of reducing time wasted in data processing and allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods: The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity and graph theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software packages such as FreeSurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter using also functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19–73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting state fMRI data, and 10 subjects with 18F-altanserin PET data. Results: It was observed both a high inter
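
The graph-theory metrics such a toolbox derives from a connectivity matrix start from elementary ones like node degree and network density; a minimal sketch on a toy binary matrix (the MIBCA toolbox relies on the Brain Connectivity Toolbox for the full set):

```python
def degrees(adjacency):
    """Node degrees of an undirected, binary connectivity matrix."""
    return [sum(row) for row in adjacency]

def density(adjacency):
    """Fraction of possible undirected edges that are present."""
    n = len(adjacency)
    edges = sum(adjacency[i][j] for i in range(n) for j in range(i + 1, n))
    return edges / (n * (n - 1) / 2)

# A tiny 4-node network (symmetric, zero diagonal):
A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]
deg = degrees(A)   # node 2 is the hub
dens = density(A)  # 4 of 6 possible edges
```

In practice the matrices come from tractography or correlation of fMRI time series and are weighted, but the same summary statistics apply after thresholding.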

  6. Structural Time Domain Identification Toolbox User's Guide

    DEFF Research Database (Denmark)

    Andersen, P.; Kirkegaard, Poul Henning; Brincker, Rune

    This manual describes the Structural Time Domain Identification toolbox for use with MATLAB. This version of the toolbox has been developed using the PC-based MATLAB version 4.2c, but is compatible with prior versions of MATLAB and UNIX-based versions. The routines of the toolbox are the so

  7. iamxt: Max-tree toolbox for image processing and analysis

    Directory of Open Access Journals (Sweden)

    Roberto Souza

    2017-01-01

    The iamxt is an array-based max-tree toolbox implemented in Python using the NumPy library for array processing. It has state-of-the-art methods for building and processing the max-tree, and a large set of visualization tools that allow the user to view the tree and the contents of its nodes. The array-based programming style and max-tree representation used in the toolbox make it simple to use. The intended audience of this toolbox includes mathematical morphology students and researchers who want to develop research in the field, and image processing researchers who need a toolbox that is simple to use and easy to integrate into their applications.
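
The max-tree is built from the nested connected components of upper level sets; for a 1D signal those components are easy to enumerate, which makes the underlying representation concrete (an illustration of the idea only, not the toolbox's efficient array-based algorithm):

```python
def upper_level_components(signal, threshold):
    """Connected runs of samples with value >= threshold (1D connectivity).
    The max-tree organises these components, over all thresholds, into a
    tree ordered by inclusion."""
    components, run = [], []
    for idx, v in enumerate(signal):
        if v >= threshold:
            run.append(idx)
        elif run:
            components.append(run)
            run = []
    if run:
        components.append(run)
    return components

f = [0, 2, 2, 0, 3, 3, 3, 0, 1]
c1 = upper_level_components(f, 1)  # three separate peaks
c3 = upper_level_components(f, 3)  # only the tallest peak survives
```

Filtering the tree (e.g. removing nodes below an area threshold) and reconstructing the signal is what gives connected operators such as area openings.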

  8. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, which is very common in industrial environments. The PFA Toolbox can be used to address those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox that is able to perform Interval and Possibilistic MFA estimations.
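
The flavour of interval MFA can be conveyed with a toy branch point: at steady state v1 = v2 + v3, so interval measurements of v1 and v2 bound the unmeasured v3 (a deliberately minimal illustration with hypothetical flux values; the PFA Toolbox handles full stoichiometric models and possibilistic uncertainty):

```python
def interval_sub(a, b):
    """Interval subtraction [a] - [b] = [a_lo - b_hi, a_hi - b_lo]."""
    return (a[0] - b[1], a[1] - b[0])

# Branch point at steady state: v1 = v2 + v3, hence v3 = v1 - v2.
v1 = (10.0, 12.0)  # measured uptake flux, imprecise
v2 = (3.0, 4.0)    # measured branch flux, imprecise
v3 = interval_sub(v1, v2)
```

Even when no flux can be pinned down exactly, the stoichiometry still confines the unmeasured flux to a usable interval, which is the core idea the toolbox generalises.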

  9. The simulation of a non-linear unstable system of motion, utilising the Solid-Works and Matlab/Simulink extended libraries/toolboxes

    Directory of Open Access Journals (Sweden)

    Zátopek Jiří

    2016-01-01

    Full Text Available This text discusses the use and integration of various support software tools for the purpose of designing the motion control law governing mechanical structures with strongly non-linear behaviour. The detailed mathematical model is derived using Lagrange Equations of the Second Type. The physical model was designed using SolidWorks 3D CAD software and the SimMechanics library, which extends Simulink with modelling tools for the simulation of mechanical “multi-domain” physical systems. The visualization of Simulink outputs is performed using the 3D Animation toolbox. The control law, designed on the basis of the mathematical model, is tested on both models (i.e., mathematical and physical) and the results of the regulatory processes are compared.
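
    As a minimal stand-in for the workflow described above (the paper's actual system is a more complex multi-body structure), the sketch below integrates the equation of motion of a single pendulum, as obtained from the Lagrange Equation of the Second Type, and checks energy conservation; all parameters are illustrative:

```python
import numpy as np

# For L = 0.5*m*l^2*theta_dot^2 + m*g*l*cos(theta), the Lagrange Equation
# of the Second Type yields theta'' = -(g/l)*sin(theta) (unit mass here).
g, l = 9.81, 1.0

def deriv(state):
    theta, omega = state
    return np.array([omega, -(g / l) * np.sin(theta)])

def rk4_step(state, dt):
    """One classical Runge-Kutta 4 step for the state (theta, omega)."""
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state, dt = np.array([0.5, 0.0]), 1e-3
energy0 = 0.5 * l**2 * state[1]**2 - g * l * np.cos(state[0])
for _ in range(5000):                      # simulate 5 s
    state = rk4_step(state, dt)
energy = 0.5 * l**2 * state[1]**2 - g * l * np.cos(state[0])
print(f"energy drift after 5 s: {abs(energy - energy0):.2e}")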

  10. DeltaProt: a software toolbox for comparative genomics

    Directory of Open Access Journals (Sweden)

    Willassen Nils P

    2010-11-01

    Full Text Available Abstract Background Statistical bioinformatics is the study of biological data sets obtained by new micro-technologies by means of proper statistical methods. For a better understanding of environmental adaptations of proteins, orthologous sequences from different habitats may be explored and compared. The main goal of the DeltaProt Toolbox is to provide users with important functionality that is needed for comparative screening and studies of extremophile proteins and protein classes. Visualization of the data sets is also the focus of this article, since visualizations can play a key role in making the various relationships transparent. This application paper is intended to inform the reader of the existence, functionality, and applicability of the toolbox. Results We present the DeltaProt Toolbox, a software toolbox that may be useful in importing, analyzing and visualizing data from multiple alignments of proteins. The toolbox has been written in MATLAB™ to provide an easy and user-friendly platform, including a graphical user interface, while ensuring good numerical performance. Problems in genome biology may be easily stated thanks to a compact input format. The toolbox also offers the possibility of utilizing structural information from the SABLE or other structure predictors. Different sequence plots can then be viewed and compared in order to find their similarities and differences. Detailed statistics are also calculated during the procedure. Conclusions The DeltaProt package is open source and freely available for academic, non-commercial use. The latest version of DeltaProt can be obtained from http://services.cbu.uib.no/software/deltaprot/. The website also contains documentation, and the toolbox comes with real data sets that are intended for training in applying the models to carry out bioinformatical and statistical analyses of protein sequences. Equipped with the new algorithms proposed here, DeltaProt serves as an auxiliary

  11. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extensibility makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.
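
    A minimal example of the kind of standard connectionist model OXlearn manages (OXlearn itself is MATLAB; this is an independent NumPy sketch with hypothetical settings): a two-layer backpropagation network trained on XOR:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer backpropagation network (tanh hidden, sigmoid output)
# trained on XOR -- the classic demo of a standard connectionist model.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.uniform(-1, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.uniform(-1, 1, (8, 1)); b2 = np.zeros(1)
lr = 1.0

for _ in range(30000):
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # output layer
    d_out = (out - Y) * out * (1 - out)           # MSE + sigmoid gradient
    d_h = (d_out @ W2.T) * (1 - h ** 2)           # backprop through tanh
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

h = np.tanh(X @ W1 + b1)
out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
loss = float(((out - Y) ** 2).mean())
print(f"final MSE: {loss:.4f}")
```
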

  12. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide. A set of a priori data and models is made available as well. GUT has been developed in a collaboration within the GUT Core Group. The GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L.Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), T. Gruber (TUM),

  13. PETPVC: a toolbox for performing partial volume correction techniques in positron emission tomography

    Science.gov (United States)

    Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell

    2016-11-01

    Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of the toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques together. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
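
    The resolution-compensation idea behind deconvolution-based PVC, including the incomplete recovery noted above when iterations stop early, can be sketched in one dimension with a simple Van Cittert iteration (an illustrative stand-in, not PETPVC's C++ implementation; signal and PSF are synthetic):

```python
import numpy as np

def gauss_psf(sigma, radius):
    """Normalized Gaussian point-spread function on [-radius, radius]."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

# Ground-truth 1-D "activity" profile: a 20-sample hot region.
truth = np.zeros(200)
truth[90:110] = 1.0

psf = gauss_psf(sigma=4.0, radius=16)
observed = np.convolve(truth, psf, mode="same")   # PVE-degraded signal

# Van Cittert iteration: f <- f + alpha * (g - psf * f).  Stopping too
# early leaves high-frequency detail, and hence recovery, incomplete.
f = observed.copy()
for _ in range(100):
    f = f + 1.0 * (observed - np.convolve(f, psf, mode="same"))

roi_blurred = observed[90:110].mean()
roi_deconv = f[90:110].mean()
print(f"mean ROI value: blurred {roi_blurred:.2f}, deconvolved {roi_deconv:.2f}")
```
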

  14. A fractured rock geophysical toolbox method selection tool

    Science.gov (United States)

    Day-Lewis, F. D.; Johnson, C.D.; Slater, L.D.; Robinson, J.L.; Williams, J.H.; Boyden, C.L.; Werkema, D.D.; Lane, J.W.

    2016-01-01

    Geophysical technologies have the potential to improve site characterization and monitoring in fractured rock, but the appropriate and effective application of geophysics at a particular site strongly depends on project goals (e.g., identifying discrete fractures) and site characteristics (e.g., lithology). No method works at every site or for every goal. New approaches are needed to identify a set of geophysical methods appropriate to specific project goals and site conditions while considering budget constraints. To this end, we present the Excel-based Fractured-Rock Geophysical Toolbox Method Selection Tool (FRGT-MST). We envision the FRGT-MST (1) equipping remediation professionals with a tool to understand what is likely to be realistic and cost-effective when contracting geophysical services, and (2) reducing applications of geophysics with unrealistic objectives or where methods are likely to fail.

  15. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations.

    Science.gov (United States)

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  16. D and D Toolbox Project - Technology Demonstration of Fixatives Applied to Hot Cell Facilities via Remote Sprayer Platforms

    International Nuclear Information System (INIS)

    Lagos, L.; Shoffner, P.; Espinosa, E.; Pena, G.; Kirk, P.; Conley, T.

    2009-01-01

    The objective of the US Department of Energy Office of Environmental Management's (DOE-EM's) D and D Toolbox Project is to use an integrated systems approach to develop a suite of decontamination and decommissioning (D and D) technologies, a D and D toolbox, that can be readily used across the DOE complex to improve safety, reduce technical risks, and limit uncertainty within D and D operations. Florida International University's Applied Research Center (FIU-ARC) is supporting this initiative by identifying technologies suitable to meet specific facility D and D requirements, assessing the readiness of those technologies for field deployment, and conducting technology demonstrations of selected technologies at FIU-ARC facilities in Miami, Florida. To meet the technology-gap challenge of remotely applying strippable/fixative coatings, FIU-ARC identified and demonstrated a remote fixative sprayer platform. During this process, FIU-ARC worked closely with the Oak Ridge National Laboratory in the selection of typical fixatives and in the design of a hot cell mockup facility for demonstrations at FIU-ARC. For this demonstration and for future demonstrations, FIU-ARC built a hot cell mockup facility at the FIU-ARC Technology Demonstration/Evaluation site in Miami, Florida. FIU-ARC selected the International Climbing Machines' (ICM's) Robotic Climber to perform this technology demonstration. The selected technology was demonstrated at the hot cell mockup facility at FIU-ARC during the week of November 10, 2008. Fixative products typically used inside hot cells were investigated and selected for this remote application. The fixatives tested included Sherwin Williams' Promar 200 and DTM paints and Bartlett's Polymeric Barrier System (PBS). The technology evaluation documented the ability of the remote system to spray fixative products on horizontal and vertical concrete surfaces. The technology performance, cost, and health and safety issues were also evaluated.

  17. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    Full Text Available This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK) for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and the Toolbox-processing capabilities of NLTK are introduced, showing ways in which they can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
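
    A script in the spirit of the paper's examples, printing a lexicon as an HTML table: the sketch below parses backslash-coded SFM records with the standard library alone (the records and markers are invented for illustration; NLTK's own toolbox module offers richer parsing):

```python
import html

# A few Toolbox (SFM) lexicon records: backslash-coded field markers,
# one record per \lx entry.  The markers used here (\lx, \ps, \ge) are
# the conventional ones for lexeme, part of speech, and gloss; the
# entries themselves are invented.
SFM = """\
\\lx kamosi
\\ps n
\\ge chicken

\\lx walaki
\\ps v
\\ge to carry
"""

def parse_records(text):
    """Split SFM text into a list of {marker: value} dicts, one per \\lx."""
    records = []
    for line in text.splitlines():
        if not line.startswith("\\"):
            continue
        marker, _, value = line[1:].partition(" ")
        if marker == "lx":
            records.append({})
        records[-1][marker] = value
    return records

def to_html(records):
    """Render the lexicon as a small HTML table."""
    rows = [
        f"<tr><td>{html.escape(r.get('lx', ''))}</td>"
        f"<td>{html.escape(r.get('ps', ''))}</td>"
        f"<td>{html.escape(r.get('ge', ''))}</td></tr>"
        for r in records
    ]
    return "<table>\n" + "\n".join(rows) + "\n</table>"

print(to_html(parse_records(SFM)))
```
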

  18. Wave data processing toolbox manual

    Science.gov (United States)

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox congregates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with freely available public-domain software. Another important advantage is that a metadata
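
    The per-burst statistics such a toolbox congregates can be illustrated with a small NumPy sketch (synthetic bursts and a simple Hs = 4·std estimate; the actual toolbox writes results into NetCDF together with deployment metadata):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic deployment: 5 bursts of surface elevation (m), 1024 samples
# each at 2 Hz -- a 0.5 m amplitude swell at 0.1 Hz plus sensor noise.
fs, n = 2.0, 1024
t = np.arange(n) / fs
bursts = np.array([
    0.5 * np.sin(2 * np.pi * 0.1 * t + rng.uniform(0, 2 * np.pi))
    + 0.05 * rng.standard_normal(n)
    for _ in range(5)
])

# Per-burst statistic of the kind gathered into the "statistics" file:
# significant wave height estimated as Hs ~= 4 * std(eta).
eta = bursts - bursts.mean(axis=1, keepdims=True)
hs = 4.0 * eta.std(axis=1)
for i, h in enumerate(hs):
    print(f"burst {i}: Hs = {h:.2f} m")
```
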

  19. ECO INVESTMENT PROJECT MANAGEMENT THROUGH TIME APPLYING ARTIFICIAL NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    Tamara Gvozdenović

    2007-06-01

    Full Text Available The concept of project management expresses an indispensable approach to investment projects. Time is often the most important factor in these projects. The artificial neural network is a data-processing paradigm inspired by the biological brain, and it is used in numerous different fields, among which is project management. This research is oriented towards the application of artificial neural networks in managing the time of an investment project. The artificial neural networks are used to define the optimistic, the most probable and the pessimistic time in the PERT method. The program package MATLAB: Neural Network Toolbox is used in the data simulation. The feed-forward back-propagation network is chosen.
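
    The PERT quantities mentioned above combine the optimistic (a), most probable (m) and pessimistic (b) times as te = (a + 4m + b)/6, with variance ((b - a)/6)^2. A short sketch with hypothetical activity estimates (in the approach the paper describes, a, m and b would be supplied by the trained network):

```python
# PERT three-point estimates (optimistic a, most-likely m, pessimistic b),
# in weeks, for the activities on a hypothetical critical path.
activities = {
    "site survey":  (2.0, 4.0, 7.0),
    "foundations":  (5.0, 8.0, 12.0),
    "installation": (3.0, 5.0, 10.0),
}

def pert(a, m, b):
    """Expected duration and variance under the classic PERT/beta model."""
    te = (a + 4 * m + b) / 6.0
    var = ((b - a) / 6.0) ** 2
    return te, var

total_te = sum(pert(*x)[0] for x in activities.values())
total_var = sum(pert(*x)[1] for x in activities.values())
print(f"expected path duration: {total_te:.2f} weeks "
      f"(std {total_var ** 0.5:.2f})")
```
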

  20. BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.

    Science.gov (United States)

    Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K

    2014-02-15

    Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user friendly platform for design and presentation of visual, audio, as well as multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate the functionalities of BOLDSync through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI study. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Cementitious Barriers Partnership (CBP): Using the CBP Software Toolbox to Simulate Sulfate Attack and Carbonation of Concrete Structures - 13481

    International Nuclear Information System (INIS)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S.; Flach, G.; Langton, C.; Smith, F.G.III; Burns, H.; Van der Sloot, H.; Meeussen, J.C.L.; Seignette, P.F.A.B.; Samson, E.; Mallick, P.; Suttora, L.; Esh, D.; Fuhrmann, M.; Philip, J.

    2013-01-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy Office of Tank Waste Management. The CBP project has developed a set of integrated modeling tools and leaching test methods to help improve understanding and prediction of the long-term hydraulic and chemical performance of cementitious materials used in nuclear applications. State-of-the-art modeling tools, including LeachXS™/ORCHESTRA and STADIUM®, were selected for their demonstrated abilities to simulate reactive transport and degradation in cementitious materials. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF), now adopted as part of the SW-846 RCRA methods, have been used to help make the link between modeling and experiment. Although each of the CBP tools has demonstrated utility as a standalone product, coupling the models over relevant spatial and temporal solution domains can provide more accurate predictions of cementitious materials behavior over relevant periods of performance. The LeachXS™/ORCHESTRA and STADIUM® models were first linked to the GoldSim Monte Carlo simulator to better and more easily characterize model uncertainties, and as a means of coupling the models so that they can be linked to broader performance assessment evaluations that use CBP results for a source term. Two important degradation scenarios were selected for initial demonstration: sulfate ingress/attack and carbonation of cementitious materials. When sufficient sulfate is present in the pore solution external to a concrete barrier, sulfate can diffuse into the concrete, react with the concrete solid phases, and cause cracking that significantly changes the transport and structural properties of the concrete. The penetration of gaseous carbon dioxide within partially saturated concrete usually initiates a series of carbonation reactions with
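
    The sulfate-ingress scenario can be caricatured with a one-dimensional Fickian diffusion sketch (illustrative only: a single diffusing species with a hypothetical diffusivity, and none of the reaction chemistry or cracking that the CBP tools model):

```python
import numpy as np

# 1-D sulfate ingress into a concrete barrier: fixed external sulfate
# concentration at x = 0, zero initial concentration inside, solved by
# explicit finite differences.  D is a hypothetical round number.
D = 1e-12               # m^2/s, effective diffusivity
dx = 1e-3               # 1 mm grid over a 10 cm barrier
dt = 0.25 * dx**2 / D   # stable explicit step (<= dx^2 / (2 D))
c = np.zeros(101)
c[0] = 1.0              # normalized external sulfate concentration

steps = int(10 * 365.25 * 86400 / dt)    # ~10 years of exposure
for _ in range(steps):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0], c[-1] = 1.0, 0.0               # Dirichlet boundaries

depth = np.argmax(c < 0.05) * dx         # first node below 5% of surface
print(f"10-year penetration depth (c > 0.05): {depth * 100:.1f} cm")
```

    The result can be checked against the analytical profile c = erfc(x / (2 sqrt(D t))), which puts the 5% front near 5 cm after 10 years.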

  2. Cementitious Barriers Partnership (CBP): Using the CBP Software Toolbox to Simulate Sulfate Attack and Carbonation of Concrete Structures - 13481

    Energy Technology Data Exchange (ETDEWEB)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S. [Vanderbilt University, School of Engineering, CRESP, Nashville, TN 37235 (United States); Flach, G.; Langton, C.; Smith, F.G.III; Burns, H. [Savannah River National Laboratory, Aiken, SC 29808 (United States); Van der Sloot, H. [Hans Van der Sloot Consultancy, Dorpsstraat 216, 1721BV Langedijk (Netherlands); Meeussen, J.C.L. [Nuclear Research and Consultancy Group, Westerduinweg 3, Petten (Netherlands); Seignette, P.F.A.B. [Energy Research Center of The Netherlands, Petten (Netherlands); Samson, E. [SIMCO Technologies, Inc., Quebec (Canada); Mallick, P.; Suttora, L. [U.S. Department of Energy, Washington, DC (United States); Esh, D.; Fuhrmann, M.; Philip, J. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)

    2013-07-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy Office of Tank Waste Management. The CBP project has developed a set of integrated modeling tools and leaching test methods to help improve understanding and prediction of the long-term hydraulic and chemical performance of cementitious materials used in nuclear applications. State-of-the-art modeling tools, including LeachXS™/ORCHESTRA and STADIUM®, were selected for their demonstrated abilities to simulate reactive transport and degradation in cementitious materials. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF), now adopted as part of the SW-846 RCRA methods, have been used to help make the link between modeling and experiment. Although each of the CBP tools has demonstrated utility as a standalone product, coupling the models over relevant spatial and temporal solution domains can provide more accurate predictions of cementitious materials behavior over relevant periods of performance. The LeachXS™/ORCHESTRA and STADIUM® models were first linked to the GoldSim Monte Carlo simulator to better and more easily characterize model uncertainties, and as a means of coupling the models so that they can be linked to broader performance assessment evaluations that use CBP results for a source term. Two important degradation scenarios were selected for initial demonstration: sulfate ingress/attack and carbonation of cementitious materials. When sufficient sulfate is present in the pore solution external to a concrete barrier, sulfate can diffuse into the concrete, react with the concrete solid phases, and cause cracking that significantly changes the transport and structural properties of the concrete.
The penetration of gaseous carbon dioxide within partially saturated concrete usually initiates a series of carbonation

  3. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
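
    The core G-theory computation, estimating variance components and a generalizability coefficient for a persons-by-trials design, can be sketched as follows (an independent NumPy illustration with simulated scores, not the ERA Toolbox's MATLAB code):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated single-trial ERP scores: 40 subjects x 30 trials, with true
# between-subject variance 4.0 and trial-level error variance 25.0.
n_p, n_t = 40, 30
person = rng.normal(0, 2.0, size=(n_p, 1))
x = person + rng.normal(0, 5.0, size=(n_p, n_t))

# One-facet (persons x trials) variance components via expected mean squares.
grand = x.mean()
ms_p = n_t * ((x.mean(axis=1) - grand) ** 2).sum() / (n_p - 1)
resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
ms_e = (resid ** 2).sum() / ((n_p - 1) * (n_t - 1))

var_p = (ms_p - ms_e) / n_t     # person (universe-score) variance
var_e = ms_e                    # trial-by-person error variance

# Generalizability coefficient for a k-trial average (relative decisions):
# Erho^2 = var_p / (var_p + var_e / k).
k = n_t
g_coef = var_p / (var_p + var_e / k)
print(f"var_p ~ {var_p:.1f}, var_e ~ {var_e:.1f}, G({k} trials) = {g_coef:.2f}")
```
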

  4. Orfeo Toolbox: A Free And Open Source Solution For Research And Operational Remote Sensing Projects

    Science.gov (United States)

    Savinaud, Mickael; OTB-CS Team

    2013-12-01

    The free and open source solution Orfeo ToolBox (OTB) offers the possibility to process large volumes of data. This library, designed by CNES in the frame of the ORFEO accompaniment program to promote the use of Pleiades and other VHR data, now offers a large number of applications designed for end users. Due to its modular design, OTB is now used in different contexts, from R&D studies to operational chains.

  5. MagPy: A Python toolbox for controlling Magstim transcranial magnetic stimulators.

    Science.gov (United States)

    McNair, Nicolas A

    2017-01-30

    To date, transcranial magnetic stimulation (TMS) studies manipulating stimulation parameters have largely used blocked paradigms. However, altering these parameters on a trial-by-trial basis in Magstim stimulators is complicated by the need to send regular (1 Hz) commands to the stimulator. Additionally, effecting such control interferes with the ability to send TMS pulses or simultaneously present stimuli with high temporal precision. This manuscript presents the MagPy toolbox, a Python software package that provides full control over Magstim stimulators via the serial port. It is able to maintain this control with no impact on concurrent processing, such as stimulus delivery. In addition, a specially designed "QuickFire" serial cable is specified that allows MagPy to trigger TMS pulses with very low latency. In a series of experimental simulations, MagPy was able to maintain uninterrupted remote control over the connected Magstim stimulator across all testing sessions. In addition, having MagPy enabled had no effect on stimulus timing: all stimuli were presented for precisely the duration specified. Finally, using the QuickFire cable, MagPy was able to elicit TMS pulses with sub-millisecond latencies. The MagPy toolbox allows for experiments that require manipulating stimulation parameters from trial to trial. Furthermore, it can achieve this in contexts that require tight control over timing, such as those seeking to combine TMS with fMRI or EEG. Together, the MagPy toolbox and QuickFire serial cable provide an effective means for controlling Magstim stimulators during experiments while ensuring high-precision timing. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox

    Science.gov (United States)

    Karimpour, Arash; Chen, Qin

    2017-09-01

    There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths and weaknesses. A proper implementation of these methods requires a series of procedures, including a pretreatment of the raw measurements and adjustment and refinement of the processed data, to provide quality assurance of the outcomes; otherwise the analysis can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes and suggests practical solutions to avoid and minimize errors in the wave results. The procedures of converting water pressure data into water surface elevation data, treating high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from the weighted integral of the wave power spectrum are described. Conversion and recovery of the data acquired by a pressure transducer, particularly in depth-limited water like estuaries and lakes, are explained in detail. To provide researchers with tools for a reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimation of the wave properties in time and frequency domains. The toolbox has been developed and examined during a number of field study projects in Louisiana's estuaries.
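
    Two of the estimates discussed above, the spectral significant wave height Hm0 = 4·sqrt(m0) and a peak frequency taken from a weighted integral of the spectrum, can be sketched in NumPy (OCEANLYZ itself is MATLAB; the synthetic sea state and the weighting exponent here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
fs, n = 4.0, 4096
t = np.arange(n) / fs

# Synthetic sea surface: a 0.12 Hz swell plus a weaker 0.35 Hz wind sea.
eta = (0.6 * np.sin(2 * np.pi * 0.12 * t)
       + 0.2 * np.sin(2 * np.pi * 0.35 * t)
       + 0.02 * rng.standard_normal(n))

# One-sided power spectral density via the periodogram.
freq = np.fft.rfftfreq(n, d=1.0 / fs)
spec = 2.0 * np.abs(np.fft.rfft(eta)) ** 2 / (fs * n)
df = freq[1] - freq[0]

m0 = spec.sum() * df        # zeroth spectral moment
hm0 = 4.0 * np.sqrt(m0)     # spectral significant wave height

# Peak frequency from a weighted integral of the spectrum, which is less
# noisy than a raw argmax of the periodogram.
w = spec ** 5
fp = (freq * w).sum() / w.sum()
print(f"Hm0 = {hm0:.2f} m, fp = {fp:.3f} Hz")
```
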

  7. Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.

    Science.gov (United States)

    Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2017-11-01

    Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation, where treatment currents can function as imaging currents. To avoid adverse effects of nerve and muscle stimulations due to injected currents, conductivity tensor imaging (CTI) utilizes B1 mapping and multi-b diffusion weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstructions in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 toolbox functions which can be used in the MATLAB environment. The MRCI toolbox is available at http://iirc.khu.ac.kr/software.html. Its functions were tested by using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.

  8. Smoke Ready Toolbox for Wildfires

    Science.gov (United States)

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  9. Projective simulation for artificial intelligence

    Science.gov (United States)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other computational notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
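
    The clip-network random walk can be sketched with a minimal two-layer agent (a toy illustration of the scheme; the task, damping constant and reward values are hypothetical):

```python
import random

random.seed(0)

# Minimal projective-simulation agent: percept clips connect to action
# clips with hopping values h.  A percept triggers a one-step random walk
# to an action; rewarded edges are strengthened, and a small damping term
# gamma slowly forgets old values.
percepts, actions = ["left", "right"], ["left", "right"]
h = {(s, a): 1.0 for s in percepts for a in actions}
gamma = 0.001

def act(s):
    """One-step random walk from percept clip s to an action clip."""
    weights = [h[(s, a)] for a in actions]
    return random.choices(actions, weights=weights)[0]

def learn(s, a, reward):
    for edge in h:                       # damping toward the initial value
        h[edge] -= gamma * (h[edge] - 1.0)
    h[(s, a)] += reward

# Toy task: the rewarded action is simply to echo the percept.
hits = 0
for trial in range(2000):
    s = random.choice(percepts)
    a = act(s)
    r = 1.0 if a == s else 0.0
    learn(s, a, r)
    if trial >= 1500:                    # score the last 500 trials
        hits += r
print(f"late success rate: {hits / 500:.2f}")
```
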

  10. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    Science.gov (United States)

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  11. Rapid generation of control parameters of Multi-Infeed system through online simulation

    Directory of Open Access Journals (Sweden)

    Rasool Aazim

    2017-05-01

    Full Text Available A Simulated Self-Generated Particle Swarm Optimization (SSG-PSO) toolbox that automatically generates PI control parameters in PSCAD is designed. The toolbox operates by utilizing transient simulation to evaluate the objective function, and converges the fitness values of the objective function through the PSO algorithm during run-time simulation of multi-infeed HVDC systems. An Integral Square Error objective function (ISE-OF) is used to accomplish the task. To make the toolbox faster, ranges are set for the PSO-generated values that limit the time of data acquisition for the objective function by considering only the transition time of the system. The toolbox is capable of optimizing multiple controllers at the same time. The PI values are generated quickly, and the results show markedly improved performance of the system during startup and under fault conditions. The experimental results are presented in this paper.
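
    The loop the abstract describes, PSO proposing PI gains and a transient simulation scoring them with an ISE objective, can be sketched generically. Everything below (the first-order plant, the gain bounds, the swarm settings) is a hypothetical stand-in, not the PSCAD toolbox itself.

```python
import random

def ise(kp, ki, dt=0.01, steps=500):
    # Integral Square Error of a PI loop around a toy first-order plant
    # dy/dt = -y + u, tracking a unit step setpoint.
    y, integ, err_sq = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ
        y += (-y + u) * dt
        err_sq += e * e * dt
    return err_sq

def pso(obj, bounds, n=15, iters=40, w=0.7, c1=1.5, c2=1.5):
    # minimal particle swarm: velocities pulled toward personal/global bests
    random.seed(1)
    dim = len(bounds)
    pos = [[random.uniform(*bounds[d]) for d in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [obj(*p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            f = obj(*pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

(kp, ki), best = pso(ise, bounds=[(0.0, 20.0), (0.0, 20.0)])
```

The optimized gains should score a substantially lower ISE than, say, a naive proportional-only controller.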

  12. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    Science.gov (United States)

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of 440 references in total, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215

  13. Integrated system dynamics toolbox for water resources planning.

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.; Peplinski, William J.; Tidwell, Vincent Carroll; Coursey, Don (University of Chicago, Chicago, IL); Hanson, Jason (University of New Mexico, Albuquerque, NM); Grimsrud, Kristine (University of New Mexico, Albuquerque, NM); Thacher, Jennifer (University of New Mexico, Albuquerque, NM); Broadbent, Craig (University of New Mexico, Albuquerque, NM); Brookshire, David (University of New Mexico, Albuquerque, NM); Chemak, Janie (University of New Mexico, Albuquerque, NM); Cockerill, Kristan (Cockerill Consulting, Boone, NC); Aragon, Carlos (New Mexico University of Technology and Mining (NM-TECH), Socorro, NM); Hallett, Heather (New Mexico University of Technology and Mining (NM-TECH), Socorro, NM); Vivoni, Enrique (New Mexico University of Technology and Mining (NM-TECH), Socorro, NM); Roach, Jesse

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward

  14. 6 DOF Nonlinear AUV Simulation Toolbox

    Science.gov (United States)

    1997-01-01

    advantage of the monitor is that you can change the shared memory variables at any time, so it can be used for “hardware in the loop” simulation. An editable monitor...for tuning the controller. “Hardware in the loop” simulation will be tested soon. In the coming year, we will improve this simulation software. We

  15. Extend your toolbox with R

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    R seems to be an ideal tool for visualising your data, as well as for practically all other data-related tasks. Learn how to get started with R, as it deserves a place not only in the statistician's or data scientist's toolbox.

  16. SPATIAL DATA MINING TOOLBOX FOR MAPPING SUITABILITY OF LANDFILL SITES USING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    S. K. M. Abujayyab

    2016-09-01

    Full Text Available Mapping the suitability of landfill sites is a complex, multidisciplinary task. The purpose of this research is to create an ArcGIS spatial data mining toolbox for mapping the suitability of landfill sites at a regional scale using neural networks. The toolbox is constructed from six sub-tools to prepare, train, and process data. Using the toolbox is straightforward. A multilayer perceptron (MLP) neural network structure with a backpropagation learning algorithm is used. The dataset is mined from the northern states of Malaysia. A total of 14 criteria are utilized to build the training dataset. The toolbox provides a platform for decision makers to implement neural networks for mapping the suitability of landfill sites in the ArcGIS environment. The result shows the ability of the toolbox to produce suitability maps for landfill sites.
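
    The core machinery of the abstract, an MLP trained with backpropagation, can be illustrated in pure Python. The XOR data below is a stand-in for the 14 site criteria, and the network size, learning rate, and epoch count are arbitrary choices, not the toolbox's settings.

```python
import math
import random

random.seed(42)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    """One hidden layer, sigmoid activations, scalar output."""
    def __init__(self, n_in, n_hid):
        self.w1 = [[random.uniform(-1, 1) for _ in range(n_in)]
                   for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.w2 = [random.uniform(-1, 1) for _ in range(n_hid)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def backward(self, x, target, lr=0.5):
        # backpropagation: output delta, then hidden deltas, then updates
        dy = (self.y - target) * self.y * (1 - self.y)
        for j, h in enumerate(self.h):
            dh = dy * self.w2[j] * h * (1 - h)   # use pre-update weight
            self.w2[j] -= lr * dy * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * dh * xi
            self.b1[j] -= lr * dh
        self.b2 -= lr * dy

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR stand-in
net = MLP(n_in=2, n_hid=8)
losses = []
for _ in range(3000):
    err = 0.0
    for x, t in data:
        err += (net.forward(x) - t) ** 2
        net.backward(x, t)
    losses.append(err / len(data))
```

The mean squared error should fall well below the 0.25 a constant 0.5 predictor would score.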

  17. GPELab, a Matlab toolbox to solve Gross-Pitaevskii equations II: Dynamics and stochastic simulations

    Science.gov (United States)

    Antoine, Xavier; Duboscq, Romain

    2015-08-01

    GPELab is a free Matlab toolbox for modeling and numerically solving large classes of systems of Gross-Pitaevskii equations that arise in the physics of Bose-Einstein condensates. The aim of this second paper, which follows (Antoine and Duboscq, 2014), is to first present the various pseudospectral schemes available in GPELab for computing the deterministic and stochastic nonlinear dynamics of Gross-Pitaevskii equations (Antoine, et al., 2013). Next, the corresponding GPELab functions are explained in detail. Finally, some numerical examples are provided to show how the code works for the complex dynamics of BEC problems.

  18. Overcoming challenges: The Koeberg simulator project

    International Nuclear Information System (INIS)

    Marin, J.; Bruchental, S.; Lagerwall, J.R.

    2006-01-01

    As the nuclear power simulation training industry matures, the needs of the individual training facilities are becoming increasingly diverse. To respond to this trend, simulation vendors must offer a greater level of customization, and as such, are facing increasingly challenging projects. In 2002, CAE embarked on such a project for South Africa's Koeberg nuclear power station. This simulator upgrade project, carried out in two phases, served as a unique exercise in product customization. In the first phase, CAE replaced the simulation servers, the software environment, and re-hosted the existing plant models. In the second phase, CAE replaced several of the existing thermal-hydraulics models to improve simulator fidelity. Throughout this project, CAE overcame a series of challenges stemming from project-specific requirements. In fact, the retention of the legacy software maintenance tools, the preservation of the instructor station package, and the interfacing with the existing hardware panel elements presented a number of opportunities for innovation. In the end, CAE overcame the challenges and acquired invaluable experience in doing so. (author)

  19. Wheat Rust Toolbox Related to New Initiatives on Yellow Rust

    DEFF Research Database (Denmark)

    Hansen, Jens Grønbech; Lassen, Poul

    ://www.fao.org/agriculture/crops/rust/stem/rust-report/en/). The Wheat rust toolbox is one of several International research platforms hosted by Aarhus University, and it uses the same ICT framework and databases as EuroWheat (www.eurowheat.org) and EuroBlight (www.EuroBlight.net). The Wheat Rust Toolbox will also serve the Global Rust Reference Centre (GRRC) as well...... – 2009), and as soon as possible this will be expanded to cover all global yellow rust data available via the GRRC. The presentation will focus on experiences from the previous work on global databases and web based information systems, as well as propose ideas how the toolbox can be helpful regarding...

  20. SCoT: a Python toolbox for EEG source connectivity.

    Science.gov (United States)

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
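
    SCoT's spectral connectivity measures are built on fitted vector autoregressive models. The snippet below shows a generic least-squares VAR(1) fit on synthetic two-channel data; it uses plain NumPy rather than SCoT's API, and the coupling matrix is invented for illustration.

```python
import numpy as np

# Fit y_t = A y_{t-1} + e_t by ordinary least squares on synthetic data.
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.3],
                   [0.0, 0.6]])   # channel 2 drives channel 1, not vice versa
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(2)

# Stack lagged regressors and solve the least-squares problem Y ~ X B,
# where B is the transposed coefficient matrix.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
```

With 2000 samples the estimated coefficients land close to the generating matrix, including the near-zero entry that encodes the absent causal direction.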

  1. Development of a Twin-spool Turbofan Engine Simulation Using the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    Science.gov (United States)

    Zinnecker, Alicia M.; Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2014-01-01

    The Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS) is a tool that has been developed to allow a user to build custom models of systems governed by thermodynamic principles using a template to model each basic process. Validation of this tool in an engine model application was performed through reconstruction of the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) (v2) using the building blocks from the T-MATS (v1) library. In order to match the two engine models, it was necessary to address differences in several assumptions made in the two modeling approaches. After these modifications were made, validation of the engine model continued by integrating both a steady-state and dynamic iterative solver with the engine plant and comparing results from steady-state and transient simulation of the T-MATS and C-MAPSS models. The results show that the T-MATS engine model was accurate within 3% of the C-MAPSS model, with inaccuracy attributed to the increased dimension of the iterative solver solution space required by the engine model constructed using the T-MATS library. This demonstrates that, given an understanding of the modeling assumptions made in T-MATS and a baseline model, the T-MATS tool provides a viable option for constructing a computational model of a twin-spool turbofan engine that may be used in simulation studies.
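
    The iterative solver the abstract refers to balances the engine model by driving a set of residuals to zero. A one-dimensional Newton iteration with a finite-difference Jacobian conveys the idea; the residual function here is a toy stand-in, not a T-MATS component.

```python
def newton(f, x0, tol=1e-10, max_iter=50, h=1e-6):
    # Newton-type iteration: step x by -f(x)/f'(x), with the derivative
    # approximated by a forward finite difference.
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfdx = (f(x + h) - fx) / h
        x -= fx / dfdx
    return x

# Stand-in for a power-balance residual as a function of, say, shaft speed.
residual = lambda x: x**3 - 2 * x - 5
root = newton(residual, x0=2.0)
```

In T-MATS the same idea is applied to a vector of residuals (flow, power, and pressure balances) with a matrix Jacobian.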

  2. Insights into the European Years’ Communication Toolboxes

    Directory of Open Access Journals (Sweden)

    Camelia-Mihaela Cmeciu

    2012-08-01

    Full Text Available Since 1983 the European syntagm “unity in diversity” has been implemented in the European Years’ communication campaigns. Dependent on subsidiarity and decentralization, European Years focus on a specific issue which constitutes the subject of a year-long awareness campaign. Beyond the involvement of Europe’s citizens through their local, regional and national authorities in the implementation of the European Years’ policies, there is a unity at the level of the visual communication of the EU by two important image-building elements: EY logos and communication toolboxes. The European Years’ communication toolboxes can be considered signs of inclusion since every organization is expected to customize the templates in the official campaign design of the European Year. The analysis will focus on the image-building elements of three European Years (2010, 2011, 2012). Having social semiotics as the qualitative research method and the analytical framework based on the distinction between design resources and representational resources, I will analyze the double layers of the high intensity point of inclusion: (1) the European Years’ branding process; (2) the visual deontic modality within the visual guidelines of the EY communication toolbox.

  3. HYDRORECESSION: A toolbox for streamflow recession analysis

    Science.gov (United States)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow studying the relationship between groundwater storage and baseflow and/or low flows at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface developed to analyse streamflow recession time series with the support of different tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications provided by HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimates, catchment-scale recharge and low-flows modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
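
    Of the four recession models listed, the Maillet model is the simplest: a linear reservoir with Q(t) = Q0·exp(−kt), which the linear-regression option fits as a straight line in log space. The sketch below uses synthetic, noise-free data and illustrates the idea rather than the toolbox's implementation.

```python
import math

def fit_maillet(t, q):
    # Least-squares line through (t, ln q): ln q = ln Q0 - k t.
    n = len(t)
    mt = sum(t) / n
    ml = sum(math.log(v) for v in q) / n
    cov = sum((ti - mt) * (math.log(qi) - ml) for ti, qi in zip(t, q))
    var = sum((ti - mt) ** 2 for ti in t)
    slope = cov / var
    return math.exp(ml - slope * mt), -slope   # (Q0, k)

t = list(range(30))                                  # days since recession start
q = [12.0 * math.exp(-0.08 * ti) for ti in t]        # synthetic recession segment
q0, k = fit_maillet(t, q)
```

On noise-free data the regression recovers Q0 = 12 and k = 0.08 exactly (up to floating-point error); on real hydrographs the extraction method and fitting technique drive the spread the abstract warns about.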

  4. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    Reference electrode standardization technique (REST) has been increasingly acknowledged and applied in recent years as a re-referencing technique to transform actual multi-channel recordings into approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERP) community around the world. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version that is more convenient and efficient for experienced users. Both are designed to be easy to use for novice researchers while remaining flexible for experienced ones. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information including publications, comments and documents on REST can also be found. An example of usage is given with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.
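
    For context, the average reference that the toolboxes' example compares against is a one-liner: subtract the across-channel mean from every sample. REST itself additionally requires a leadfield/head model and is not reproduced here; the data below are random stand-ins with hypothetical dimensions.

```python
import numpy as np

# channels x samples matrix of fake EEG (32 channels, 1000 samples)
rng = np.random.default_rng(7)
eeg = rng.standard_normal((32, 1000))

# Average re-referencing: remove the instantaneous mean across channels,
# so every sample sums to zero over the montage.
eeg_ar = eeg - eeg.mean(axis=0, keepdims=True)
```

After re-referencing, the channel mean at every time point is zero by construction, which is the property the zero-reference comparison in the abstract is measured against.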

  5. Cementitious Barriers Partnership (CBP): Training and Release of CBP Toolbox Software, Version 1.0 - 13480

    International Nuclear Information System (INIS)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S.; Flach, G.; Langton, C.; Smith, F.G. III; Burns, H.; Van der Sloot, H.; Meeussen, J.C.L.; Samson, E.; Mallick, P.; Suttora, L.; Esh, D.; Fuhrmann, M.; Philip, J.

    2013-01-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the Office of Tank Waste Management within the Office of Environmental Management of the U.S. Department of Energy (US DOE). The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that improve understanding and predictions of the long-term hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program are intended to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1,000 years for waste management purposes. CBP software tools were made available to selected DOE Office of Environmental Management and field site users for training and evaluation based on a set of important degradation scenarios, including sulfate ingress/attack and carbonation of cementitious materials. The tools presented at two-day training workshops held at the U.S. National Institute of Standards and Technology (NIST), Savannah River, and Hanford included LeachXS™/ORCHESTRA, STADIUM®, and a CBP-developed GoldSim Dashboard interface. Collectively, these components form the CBP Software Toolbox. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF) were also presented. The CBP Dashboard uses a custom Dynamic-link library developed by CBP to couple to the LeachXS™/ORCHESTRA and STADIUM® codes to simulate reactive transport and degradation in cementitious materials for selected performance assessment scenarios.
The first day of the workshop introduced participants to the software components via presentation materials, and the second day included hands-on tutorial exercises followed by discussions

  6. Automated simulation and study of spatial-structural design processes

    NARCIS (Netherlands)

    Davila Delgado, J.M.; Hofmeyer, H.; Stouffs, R.; Sariyildiz, S.

    2013-01-01

    A so-called "Design Process Investigation toolbox" (DPI toolbox), has been developed. It is a set of computational tools that simulate spatial-structural design processes. Its objectives are to study spatial-structural design processes and to support the involved actors. Two case-studies are

  7. Segmentation Toolbox for Tomographic Image Data

    DEFF Research Database (Denmark)

    Einarsdottir, Hildur

    Motivation: Image acquisition has vastly improved over the past years, introducing techniques such as X-ray computed tomography (CT). CT images provide the means to probe a sample non-invasively to investigate its inner structure. Given the wide usage of this technique and massive data amounts, techniques to automatically analyze such data become ever more important. Most segmentation methods for large datasets, such as CT images, deal with simple thresholding techniques, where intensity value cut-offs are predetermined and hard-coded. For data where the intensity difference is not sufficient, and partial volume voxels occur frequently, thresholding methods do not suffice and more advanced methods are required. Contribution: To meet these requirements a toolbox has been developed, combining well-known methods within the image analysis field. The toolbox includes cluster-based methods...

  8. Project managing your simulator DCS upgrade

    International Nuclear Information System (INIS)

    Carr, S.

    2006-01-01

    The intention of this paper is to provide helpful information and tips for the purchaser with regard to the project management of a DCS upgrade for an existing nuclear power station operator-training simulator. This paper was written shortly after STS Powergen completed two nuclear power station simulator DCS projects in the USA. Areas covered by this paper are: - Contractual documents and arrangements; - Licence and Escrow agreements; - Liquidated damages; - Project management; - Project schedules and resources; - Monitoring progress; - Defect reporting; - DCS automation code; - Access to proprietary information; - Tips for project meetings; - Testing; - Cultural issues; - Training

  9. Drinking Water Cyanotoxin Risk Communication Toolbox

    Science.gov (United States)

    The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.

  10. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    Science.gov (United States)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow, sub-vertical tabular array, which usually traverse the entire reservoir vertically and extend for several hundred meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export the results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual porosity simulation or used for field development and well planning.

  11. III. NIH TOOLBOX COGNITION BATTERY (CB): MEASURING EPISODIC MEMORY

    OpenAIRE

    Bauer, Patricia J.; Dikmen, Sureyya S.; Heaton, Robert K.; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L.

    2013-01-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are adminis...

  12. Semi-automated Robust Quantification of Lesions (SRQL) Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori Ito

    2017-02-01

    Full Text Available Quantifying lesions in a robust manner is fundamental for studying the effects of neuroanatomical changes in the post-stroke brain on recovery. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. We developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.267213) Toolbox, which performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space, or comparisons between spaces. Here, we describe the methods implemented in the toolbox and demonstrate the outputs of the SRQL toolbox.
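
    The white matter intensity correction idea, dropping lesion-mask voxels whose intensity looks like healthy white matter, can be sketched on synthetic data. The threshold rule used here (within one standard deviation of the healthy-tissue mean) is a hypothetical stand-in for SRQL's actual criterion, and the image, mask, and intensity values are all invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Fake 2D slice: healthy tissue ~N(100, 20), a darker lesion patch ~N(60, 10).
img = rng.normal(100, 20, size=(64, 64))
lesion = np.zeros((64, 64), dtype=bool)
lesion[20:40, 20:40] = True
img[lesion] = rng.normal(60, 10, size=lesion.sum())
img[25, 25] = 100.0   # one mislabeled, healthy-looking voxel inside the mask

# Hypothetical correction: estimate healthy-tissue statistics outside the
# mask, then drop mask voxels whose intensity falls within 1 SD of that mean.
healthy_mean = img[~lesion].mean()
healthy_sd = img[~lesion].std()
corrected = lesion & ~(np.abs(img - healthy_mean) < healthy_sd)
```

The mislabeled voxel is removed from the corrected mask while the genuinely dark lesion voxels are retained, which is the robustness gain the abstract describes.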

  13. A toolbox for European judges

    NARCIS (Netherlands)

    Hesselink, M.W.

    2011-01-01

    The forthcoming instrument on European contract law, be it in the shape of an optional code for cross-border contracts or as an official toolbox for the European legislator, is likely to have a spill-over effect on private law adjudication in Europe. Judges will have no great difficulty in finding

  14. Project management in the library workplace

    CERN Document Server

    Daugherty, Alice

    2018-01-01

    This volume of Advances in Library Administration and Organization attempts to put project management into the toolboxes of library administrators through overviews of concepts, analyses of experiences, and forecasts for the use of project management within the profession.

  15. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) Users' Workshop Presentations

    Science.gov (United States)

    Litt, Jonathan S. (Compiler)

    2018-01-01

    NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.

  16. GOCE user toolbox and tutorial

    DEFF Research Database (Denmark)

    Knudsen, Per; Benveniste, Jerome

    2011-01-01

    consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made...

  17. Development of a Twin-Spool Turbofan Engine Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS)

    Science.gov (United States)

    Zinnecker, Alicia M.; Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a tool that has been developed to allow a user to build custom models of systems governed by thermodynamic principles using a template to model each basic process. Validation of this tool in an engine model application was performed through reconstruction of the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) (v2) using the building blocks from the T-MATS (v1) library. In order to match the two engine models, it was necessary to address differences in several assumptions made in the two modeling approaches. After these modifications were made, validation of the engine model continued by integrating both a steady-state and dynamic iterative solver with the engine plant and comparing results from steady-state and transient simulation of the T-MATS and C-MAPSS models. The results show that the T-MATS engine model was accurate within 3% of the C-MAPSS model, with inaccuracy attributed to the increased dimension of the iterative solver solution space required by the engine model constructed using the T-MATS library. This demonstrates that, given an understanding of the modeling assumptions made in T-MATS and a baseline model, the T-MATS tool provides a viable option for constructing a computational model of a twin-spool turbofan engine that may be used in simulation studies.

  18. Sustainable knowledge development across cultural boundaries: Experiences from the EU-project SILMAS (Toolbox for conflict solving instruments in Alpine Lake Management)

    Science.gov (United States)

    Fegerl, Michael; Wieden, Wilfried

    2013-04-01

Increasingly, people have to communicate knowledge across cultural and language boundaries. Even though recent technologies offer powerful communication facilities, people often feel confronted with barriers which clearly reduce their chances of making their interaction a success. Concrete evidence concerning such problems derives from a number of projects, where generated knowledge often results in dead-end products. In the Alpine Space project SILMAS (Sustainable Instruments for Lake Management in Alpine Space), in which both authors were involved, a special approach (syneris®) was taken to avoid this problem and to manage project knowledge in sustainable form. Under this approach, knowledge input and output are handled interactively: relevant knowledge can be developed continuously and users can always access the latest state of expertise. Use of the respective tools and procedures can also assist in closing knowledge gaps and in developing innovative responses to familiar or novel problems. This contribution describes ways and means that have been found to increase the chances of success of knowledge communication across cultural boundaries. The process by which experts from different cultures negotiate a standardized solution is highlighted, as is the problem of disseminating expert knowledge to diverse stakeholders. Finally, lessons learned are made accessible; a main task lay in the creation of a toolbox of conflict-solving instruments as a demonstrable result of the project and for the time thereafter. The interactive web-based toolbox enables lake managers to access best practice instruments in standardized, explicit and cross-linguistic form.

  19. SCoT: A Python Toolbox for EEG Source Connectivity

    Directory of Open Access Journals (Sweden)

    Martin eBillinger

    2014-03-01

Full Text Available Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
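The VAR modeling at the core of SCoT's spectral connectivity measures can be illustrated with a minimal scalar example (a sketch of the general idea, not SCoT's API; the data and names are invented): simulate an AR(1) process and recover its coefficient by least squares.

```python
import random

random.seed(0)

# simulate x[t] = a * x[t-1] + noise, the scalar special case of a VAR model
a_true = 0.7
x = [0.0]
for _ in range(5000):
    x.append(a_true * x[-1] + random.gauss(0.0, 1.0))

# least-squares estimate: a_hat = sum(x[t-1] * x[t]) / sum(x[t-1]^2)
num = sum(x[t - 1] * x[t] for t in range(1, len(x)))
den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
a_hat = num / den
```

Multichannel VAR fits generalize this to coefficient matrices, whose off-diagonal entries carry the directed-connectivity information that spectral measures are derived from.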

  20. Wind Turbine Blockset in Matlab/Simulink. General Overview and Description of the Model

    DEFF Research Database (Denmark)

    Iov, Florin; Hansen, A. D.; Soerensen, P.

This report presents a newly developed Matlab/Simulink Toolbox for wind turbine applications. This toolbox has been developed during the research project 'Simulation Platform to model, optimize and design wind turbines' and it has been used as a general developer tool for three other simulation tools: Saber, DIgSILENT, HAWC. The report provides first a quick overview of Matlab issues and then explains the structure of the developed toolbox. The attention in the report is mainly drawn to the description of the most important mathematical models, which have been developed in the Toolbox. Then, some...

  1. Wind Turbine Blockset in Matlab/Simulink - General overview and description of the models

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

This report presents a newly developed Matlab/Simulink Toolbox for wind turbine applications. This toolbox has been developed during the research project 'Simulation Platform to model, optimize and design wind turbines' and it has been used as a general developer tool for three other simulation tools: Saber, DIgSILENT, HAWC. The report provides first a quick overview of Matlab issues and then explains the structure of the developed toolbox. The attention in the report is mainly drawn to the description of the most important mathematical models, which have been developed in the Toolbox. Then, some simulation results using the developed models are shown. Finally, some general conclusions regarding this newly developed Toolbox as well as some directions for future work are made. (au)

  2. Wind turbine blockset in Matlab/Simulink. General overview and description of the models

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Timbus, A.V.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

This report presents a newly developed Matlab/Simulink Toolbox for wind turbine applications. This toolbox has been developed during the research project 'Simulation Platform to model, optimize and design wind turbines' and it has been used as a general developer tool for three other simulation tools: Saber, DIgSILENT, HAWC. The report provides first a quick overview of Matlab issues and then explains the structure of the developed toolbox. The attention in the report is mainly drawn to the description of the most important mathematical models, which have been developed in the Toolbox. Then, some simulation results using the developed models are shown. Finally, some general conclusions regarding this newly developed Toolbox as well as some directions for future work are made. (au)

  3. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    Science.gov (United States)

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focus on a few rudimentary algorithms, are not well-optimized and often do not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
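The predicate-then-compact pattern behind the parallel peak detection can be sketched serially (an illustration of the idea, not NPE's GPU implementation; the function, variable names, and threshold are invented): an elementwise predicate flags local maxima above a threshold, and a compaction step gathers their indices. On a GPU the two steps map to one kernel plus a stream compaction.

```python
def detect_peaks(signal, threshold):
    """Flag samples that are local maxima above threshold, then compact
    the flags into a list of peak indices."""
    flags = [
        1 if (0 < i < len(signal) - 1
              and signal[i] > threshold
              and signal[i] > signal[i - 1]
              and signal[i] >= signal[i + 1])  # >= breaks ties to the left
        else 0
        for i in range(len(signal))
    ]
    # compaction: keep only the indices whose flag is set
    return [i for i, f in enumerate(flags) if f]

peaks = detect_peaks([0, 1, 5, 1, 0, 2, 7, 2, 0], threshold=3)
```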

  4. The ASTRA Toolbox: A platform for advanced algorithm development in electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Aarle, Wim van, E-mail: wim.vanaarle@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Palenstijn, Willem Jan, E-mail: willemjan.palenstijn@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde & Informatica, Science Park 123, NL-1098 XG Amsterdam (Netherlands); De Beenhouwer, Jan, E-mail: jan.debeenhouwer@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Altantzis, Thomas, E-mail: thomas.altantzis@uantwerpen.be [Electron Microscopy for Materials Science, University of Antwerp, Groenenborgerlaan 171, B-2020 Wilrijk (Belgium); Bals, Sara, E-mail: sara.bals@uantwerpen.be [Electron Microscopy for Materials Science, University of Antwerp, Groenenborgerlaan 171, B-2020 Wilrijk (Belgium); Batenburg, K. Joost, E-mail: joost.batenburg@cwi.nl [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde & Informatica, Science Park 123, NL-1098 XG Amsterdam (Netherlands); Mathematical Institute, Leiden University, P.O. Box 9512, NL-2300 RA Leiden (Netherlands); Sijbers, Jan, E-mail: jan.sijbers@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-10-15

    We present the ASTRA Toolbox as an open platform for 3D image reconstruction in tomography. Most of the software tools that are currently used in electron tomography offer limited flexibility with respect to the geometrical parameters of the acquisition model and the algorithms used for reconstruction. The ASTRA Toolbox provides an extensive set of fast and flexible building blocks that can be used to develop advanced reconstruction algorithms, effectively removing these limitations. We demonstrate this flexibility, the resulting reconstruction quality, and the computational efficiency of this toolbox by a series of experiments, based on experimental dual-axis tilt series. - Highlights: • The ASTRA Toolbox is an open platform for 3D image reconstruction in tomography. • Advanced reconstruction algorithms can be prototyped using the fast and flexible building blocks. • This flexibility is demonstrated on a common use case: dual-axis tilt series reconstruction with prior knowledge. • The computational efficiency is validated on an experimentally measured tilt series.
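The kind of iterative building block such a platform exposes can be illustrated with a tiny algebraic reconstruction example (a generic Kaczmarz/ART sketch, not ASTRA's API; the toy system and all names are invented): recover a 2x2 "image" from its row and column sums.

```python
def kaczmarz(A, b, sweeps=200):
    """Classic Kaczmarz/ART iteration: repeatedly project the estimate
    onto the hyperplane defined by each ray equation."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            norm = sum(r * r for r in row)
            lam = (bi - dot) / norm
            x = [xi + lam * r for xi, r in zip(x, row)]
    return x

# toy 2x2 image [1, 2, 3, 4] measured by four "rays" (two rows, two columns)
A = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0], [0, 1, 0, 1]]
b = [3, 7, 4, 6]
x = kaczmarz(A, b)
```

Starting from zero, Kaczmarz converges to the minimum-norm solution of this consistent system, which here coincides with the true image.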

  5. A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus

    Science.gov (United States)

    AghaKouchak, A.; Sadegh, M.; Mallakpour, I.

    2017-12-01

Water, food, and energy systems are highly interconnected. More than seventy percent of the global water resource is used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Therefore, considering the interactions in the nexus of water, food, and energy is crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy, and water, and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluating dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food, and energy from different perspectives, including efficiency and diversity of resource use.

  6. The ASTRA Toolbox: A platform for advanced algorithm development in electron tomography

    International Nuclear Information System (INIS)

    Aarle, Wim van; Palenstijn, Willem Jan; De Beenhouwer, Jan; Altantzis, Thomas; Bals, Sara; Batenburg, K. Joost; Sijbers, Jan

    2015-01-01

    We present the ASTRA Toolbox as an open platform for 3D image reconstruction in tomography. Most of the software tools that are currently used in electron tomography offer limited flexibility with respect to the geometrical parameters of the acquisition model and the algorithms used for reconstruction. The ASTRA Toolbox provides an extensive set of fast and flexible building blocks that can be used to develop advanced reconstruction algorithms, effectively removing these limitations. We demonstrate this flexibility, the resulting reconstruction quality, and the computational efficiency of this toolbox by a series of experiments, based on experimental dual-axis tilt series. - Highlights: • The ASTRA Toolbox is an open platform for 3D image reconstruction in tomography. • Advanced reconstruction algorithms can be prototyped using the fast and flexible building blocks. • This flexibility is demonstrated on a common use case: dual-axis tilt series reconstruction with prior knowledge. • The computational efficiency is validated on an experimentally measured tilt series

  7. The laboratory test utilization management toolbox.

    Science.gov (United States)

    Baird, Geoffrey

    2014-01-01

Efficiently managing laboratory test utilization requires both ensuring adequate utilization of needed tests in some patients and discouraging superfluous tests in other patients. After the difficult clinical decision is made to define the patients that do and do not need a test, a wealth of interventions is available to the clinician and laboratorian to help guide appropriate utilization. These interventions are collectively referred to here as the utilization management toolbox. Experience has shown that some tools in the toolbox are weak and others are strong, and that tools are most effective when many are used simultaneously. While the outcomes of utilization management studies are not always as concrete as may be desired, the data available in the literature indicate that strong utilization management interventions are safe and effective measures to improve patient health and reduce waste in an era of increasing financial pressure.

  8. Key performance indicators for successful simulation projects

    OpenAIRE

    Jahangirian, M; Taylor, SJE; Young, T; Robinson, S

    2016-01-01

    There are many factors that may contribute to the successful delivery of a simulation project. To provide a structured approach to assessing the impact various factors have on project success, we propose a top-down framework whereby 15 Key Performance Indicators (KPI) are developed that represent the level of successfulness of simulation projects from various perspectives. They are linked to a set of Critical Success Factors (CSF) as reported in the simulation literature. A single measure cal...

  9. fMRI Artefact Rejection and Sleep Scoring Toolbox

    Directory of Open Access Journals (Sweden)

    Yves Leclercq

    2011-01-01

Full Text Available We started writing the “fMRI artefact rejection and sleep scoring toolbox”, or “FAST”, to process our sleep EEG-fMRI data, that is, the simultaneous recording of electroencephalographic and functional magnetic resonance imaging data acquired while a subject is asleep. FAST tackles three crucial issues typical of this kind of data: (1) data manipulation (viewing, comparing, chunking, etc.) of long continuous M/EEG recordings, (2) rejection of the fMRI-induced artefact in the EEG signal, and (3) manual sleep-scoring of the M/EEG recording. Currently, the toolbox can efficiently deal with these issues via a GUI, the SPM8 batching system, or a hand-written script. The tools developed are, of course, also useful for other EEG applications, for example, involving simultaneous EEG-fMRI acquisition, continuous EEG eye-balling, and manipulation. Even though the toolbox was originally devised for EEG data, it will also gracefully handle MEG data without any problem. “FAST” is developed in Matlab as an add-on toolbox for SPM8 and, therefore, internally uses its SPM8-meeg data format. “FAST” is available for free, under the GNU-GPL.

  10. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator that automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors and analyzed its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
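The map-to-model conversion can be illustrated with the smallest possible case (a generic mass-action sketch, not CADLIVE output; the species, rate constant, and function names are invented): the reaction A -> B becomes the ODE pair dA/dt = -kA, dB/dt = +kA, integrated here with forward Euler.

```python
def simulate(a0, b0, k, dt, steps):
    """Integrate dA/dt = -k*A, dB/dt = +k*A with forward Euler.
    Transferring the same flux between species conserves A + B exactly."""
    a, b = a0, b0
    for _ in range(steps):
        flux = k * a * dt
        a -= flux
        b += flux
    return a, b

# run to t = 10 with k = 0.5; the exact answer is A(10) = exp(-5)
a, b = simulate(a0=1.0, b0=0.0, k=0.5, dt=0.001, steps=10000)
```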

  11. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    Science.gov (United States)

    Silva, Ikaro; Moody, George B

The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  12. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    Science.gov (United States)

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
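One of the centralities such a toolbox computes can be sketched in a few lines (a generic illustration, not SBEToolbox code; the function and variable names are invented): normalized degree centrality from an undirected edge list.

```python
def degree_centrality(edges, nodes):
    """Normalized degree centrality for an undirected graph:
    degree of each node divided by (n - 1)."""
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}

# star graph on 4 nodes: the hub "a" touches every other node
centrality = degree_centrality([("a", "b"), ("a", "c"), ("a", "d")],
                               nodes=["a", "b", "c", "d"])
```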

  13. Evaluation Toolbox: Ex-Ante Impact Assessment and Value Network Analysis for SI

    NARCIS (Netherlands)

    Dhondt, S.; Ven, H. van de; Cressey, P.; Kaderabkova, A.; Luna, Á.; Moghadam Saman, S.; Castro Spila, J.; Ziauberyte, R.; Torre, W. van der; Terstriep, J.

    2016-01-01

    This report contains a toolbox for use with the Ex-Ante Impact Assessment for social innovations as was developed in the report D7.1. This toolbox proposes a series of convenient and useful tools to apply in an ex-ante assessment of social innovation within SIMPACT's policy areas unemployment,

  14. Fusion Simulation Project Workshop Report

    Science.gov (United States)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  15. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    Science.gov (United States)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

Safety instrumented system (SIS) is designed to restore a plant into a safe condition when a pre-hazardous event occurs. It has a vital role, especially in process industries. A SIS shall meet its safety requirement specifications. To confirm this, the SIS shall be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation. It is developed based on an improved continuous-time Markov chain. The toolbox supports a detailed approach to evaluation. This paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan the maintenance strategy.
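The continuous-time Markov chain idea behind such an evaluation can be illustrated with the simplest possible availability model (a two-state sketch under invented failure and repair rates, not the paper's improved method): integrate the Kolmogorov forward equations and compare against the analytic steady-state unavailability lam / (lam + mu).

```python
def unavailability(lam, mu, t, dt=1e-3):
    """Two-state CTMC (working <-> failed) with failure rate lam and
    repair rate mu: integrate the Kolmogorov forward equations with
    forward Euler and return P(failed) at time t."""
    p_fail = 0.0  # start in the working state
    for _ in range(int(t / dt)):
        p_work = 1.0 - p_fail
        p_fail += (lam * p_work - mu * p_fail) * dt
    return p_fail

# run well past the time constant 1/(lam + mu) so the chain reaches steady state
q = unavailability(lam=1e-3, mu=1e-1, t=200.0)
```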

  16. moviEEG: An animation toolbox for visualization of intracranial electroencephalography synchronization dynamics.

    Science.gov (United States)

    Wong, Simeon M; Ibrahim, George M; Ochi, Ayako; Otsubo, Hiroshi; Rutka, James T; Snead, O Carter; Doesburg, Sam M

    2016-06-01

We introduce and describe the functions of moviEEG (Multiple Overlay Visualizations for Intracranial ElectroEncephaloGraphy), a novel MATLAB-based toolbox for spatiotemporal mapping of network synchronization dynamics in intracranial electroencephalography (iEEG) data. The toolbox integrates visualizations of inter-electrode phase-locking relationships in peri-ictal epileptogenic networks with signal spectral properties and graph-theoretical network measures overlaid upon operating room images of the electrode grid. Functional connectivity between every electrode pair is evaluated over a sliding window indexed by phase synchrony. Two case studies are presented to provide preliminary evidence for the application of the toolbox to guide network-based mapping of epileptogenic cortex and to distinguish these regions from eloquent brain networks. In both cases, epileptogenic cortex was visually distinct. We introduce moviEEG, a novel toolbox for animation of oscillatory network dynamics in iEEG data, and provide two case studies showing preliminary evidence for utility of the toolbox in delineating the epileptogenic zone. Despite evidence that network synchronization is altered in epileptogenic brain regions, network-based techniques have yet to be incorporated into clinical pre-surgical mapping. moviEEG provides a set of functions to enable easy visualization with network-based techniques. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
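The phase-synchrony index underlying such inter-electrode maps can be sketched directly (a standard phase-locking value computation, not moviEEG's API; the signal parameters and names are invented): two channels with a constant phase lag give a PLV of 1, while unrelated phases give a value near 0.

```python
import cmath
import math

def plv(phases_a, phases_b):
    """Phase-locking value: magnitude of the mean unit phasor of the
    phase difference. 1 = perfect locking, ~0 = no consistent relation."""
    total = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(total) / len(phases_a)

# two 10 Hz channels with a constant 0.5 rad lag are perfectly locked
t = [0.01 * k for k in range(1000)]
locked = plv([2 * math.pi * 10 * x for x in t],
             [2 * math.pi * 10 * x + 0.5 for x in t])
```

Evaluating this over a sliding window, for every electrode pair, yields the time-resolved connectivity matrices that the toolbox animates.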

  17. CONTINEX: A Toolbox for Continuation in Experiments

    DEFF Research Database (Denmark)

    Schilder, Frank; Bureau, Emil; Santos, Ilmar

    2014-01-01

CONTINEX is a MATLAB toolbox for bifurcation analysis based on the development platform COCO (computational continuation core). CONTINEX is specifically designed for coupling to experimental test specimens via DSPACE, but also provides interfaces to SIMULINK-, ODE-, and so-called equation-free models. The current version of the interface for experimental set-ups implements an algorithm for tuning control parameters, a robust noise-tolerant covering algorithm, and functions for monitoring (in)stability. In this talk we will report on experiments with an impact oscillator with magnetic actuators...
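The continuation idea can be sketched with the simplest scheme (natural-parameter continuation on a toy equation; not CONTINEX code, and platforms such as COCO use pseudo-arclength stepping to pass folds, which this naive version cannot): step the parameter and re-solve by Newton, warm-started from the previous branch point.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Scalar Newton iteration."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / df(x)
    return x

def continue_branch(f, df, x_start, p_values):
    """Natural-parameter continuation: for each parameter value, re-solve
    f(x, p) = 0, seeding Newton with the previous branch point."""
    branch, x = [], x_start
    for p in p_values:
        x = newton(lambda y: f(y, p), lambda y: df(y, p), x)
        branch.append((p, x))
    return branch

# follow the upper branch of x**2 = p from p = 1 down toward the fold at p = 0
f = lambda x, p: x * x - p
df = lambda x, p: 2.0 * x
pts = continue_branch(f, df, x_start=1.0,
                      p_values=[1.0 - 0.01 * k for k in range(90)])
```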

  18. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave

    Directory of Open Access Journals (Sweden)

    Ikaro Silva

    2014-09-01

Full Text Available The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox allows direct loading into MATLAB/Octave's workspace of over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  19. ObsPy - A Python Toolbox for Seismology - and Applications

    Science.gov (United States)

    Krischer, L.; Megies, T.; Barsch, R.; MacCarthy, J.; Lecocq, T.; Koymans, M. R.; Carothers, L.; Eulenfeld, T.; Reyes, C. G.; Falco, N.; Sales de Andrade, E.

    2017-12-01

Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology with a unified interface and automatic format detection, including waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, SC3ML, …) and event meta information (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality like travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. Newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, ...); and support for reading ArcLink Inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases and show-case some projects that are based on ObsPy, e.g.: seismo

  20. Ubuntu Linux toolbox

    CERN Document Server

    Negus, Christopher

    2012-01-01

This bestseller from Linux guru Chris Negus is packed with an array of new and revised material. As a longstanding bestseller, Ubuntu Linux Toolbox has taught you how to get the most out of Ubuntu, the world's most popular Linux distribution. With this eagerly anticipated new edition, Christopher Negus returns with a host of new and expanded coverage on tools for managing file systems, ways to connect to networks, techniques for securing Ubuntu systems, and a look at the latest Long Term Support (LTS) release of Ubuntu, all aimed at getting you up and running with Ubuntu Linux quickly.

  1. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    Science.gov (United States)

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible, as it comes with several signal generators and can be easily extended for any experiment.
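The transformed up-down procedure can be sketched with a deterministic toy listener (an illustration of the 2-down/1-up rule, not the toolbox's implementation; the listener model and all names are invented): the track descends quickly and then oscillates around the listener's threshold.

```python
def staircase_2down1up(respond, start, step, trials):
    """Transformed up-down (2-down/1-up) rule: lower the stimulus level
    after two consecutive correct responses, raise it after any error.
    This rule converges to the 70.7%-correct point of the psychometric
    function."""
    level, correct_run, track = start, 0, []
    for _ in range(trials):
        track.append(level)
        if respond(level):
            correct_run += 1
            if correct_run == 2:
                level -= step
                correct_run = 0
        else:
            level += step
            correct_run = 0
    return track

# hypothetical deterministic listener: correct at level 5 and above
listener = lambda level: level >= 5
track = staircase_2down1up(listener, start=10, step=1, trials=60)
```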

  2. Cementitious Barriers Partnership (CBP): Training and Release of CBP Toolbox Software, Version 1.0 - 13480

    Energy Technology Data Exchange (ETDEWEB)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S. [Vanderbilt University, School of Engineering, CRESP, Nashville, TN 37235 (United States); Flach, G.; Langton, C.; Smith, F.G. III; Burns, H. [Savannah River National Laboratory, Aiken, SC 29808 (United States); Van der Sloot, H. [Hans Van der Sloot Consultancy, Dorpsstraat 216, 1721BV Langedijk (Netherlands); Meeussen, J.C.L. [Nuclear Research and Consultancy Group, Westerduinweg 3, Petten (Netherlands); Samson, E. [SIMCO Technologies, Inc., Quebec (Canada); Mallick, P.; Suttora, L. [U.S. Department of Energy, Washington, DC (United States); Esh, D.; Fuhrmann, M.; Philip, J. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)

    2013-07-01

The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the Office of Tank Waste Management within the Office of Environmental Management of the U.S. Department of Energy (US DOE). The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that improve understanding and predictions of the long-term hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program are intended to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1,000 years for waste management purposes. CBP software tools were made available to selected DOE Office of Environmental Management and field site users for training and evaluation based on a set of important degradation scenarios, including sulfate ingress/attack and carbonation of cementitious materials. The tools were presented at two-day training workshops held at the U.S. National Institute of Standards and Technology (NIST), Savannah River, and Hanford and included LeachXS™/ORCHESTRA, STADIUM®, and a CBP-developed GoldSim Dashboard interface. Collectively, these components form the CBP Software Toolbox. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF) were also presented. The CBP Dashboard uses a custom dynamic-link library developed by CBP to couple to the LeachXS™/ORCHESTRA and STADIUM® codes to simulate reactive transport and degradation in cementitious materials for selected performance assessment scenarios. The first day of the workshop introduced participants to the software components via presentation materials, and the second day included hands-on tutorial exercises followed

  3. The RTMM Toolbox for DMM Applications

    DEFF Research Database (Denmark)

    Sharp, Robin; Todirica, Edward Alexandru

    2002-01-01

    This paper describes an approach to implementing distributed multimedia applications based on the use of a software toolbox. The tools in the box allow the designer to specify which components are to be used, how they are logically connected and what properties the streams of data to be passed...

  4. A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering

    Science.gov (United States)

    Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng

    2013-01-01

    Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791

  5. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    Science.gov (United States)

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab(R) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907
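The outlier sensitivity described above is easy to demonstrate, and the skipped-correlation idea (detect outliers, remove them, then correlate) can be sketched in a few lines. The Python sketch below uses a crude marginal median/MAD outlier rule; the toolbox itself is MATLAB-based and implements more careful estimators (percentage-bend and multivariate skipped correlations), so `skipped_pearson` and its threshold are illustrative only.

```python
import numpy as np

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

def skipped_pearson(x, y, k=3.0):
    """Crude 'skipped' correlation: drop points that are marginal outliers
    (beyond k robust z-scores, using the median and MAD), then correlate."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    keep = np.ones(len(x), dtype=bool)
    for v in (x, y):
        mad = 1.4826 * np.median(np.abs(v - np.median(v)))  # robust SD estimate
        keep &= np.abs(v - np.median(v)) <= k * mad
    return pearson(x[keep], y[keep])

rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = x + 0.1 * rng.normal(size=30)      # strong linear association
x_out = np.append(x, 10.0)             # one bivariate outlier...
y_out = np.append(y, -10.0)            # ...pointing the wrong way

print(pearson(x, y))                   # close to 1
print(pearson(x_out, y_out))           # dragged negative by a single point
print(skipped_pearson(x_out, y_out))   # close to 1 again
```

A single point is enough to reverse the sign of the plain estimate, while the robust variant recovers the underlying association.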

  7. Fessenheim simulator for OECD Halden Reactor Project

    International Nuclear Information System (INIS)

    Oudot, G.; Bonnissent, B.

    1998-01-01

A full-scope NPP simulator is presently under manufacture by Thomson Training & Simulation (TT&S) in Cergy (France) for the OECD Halden Reactor Project. The reference plant of this simulator is the Fessenheim CP0 PWR power plant operated by the French utility EDF, for which TT&S delivered a full-scope training simulator in mid-1997. The simulator for the Halden Reactor Project is based on a software duplication of the Fessenheim simulator delivered to EDF, ported to the most recent computers and operating systems available. This paper outlines the main features of this new simulator generation, which reaps the benefits of the advanced technologies of the SIPA design simulator introduced inside a full-scope simulator. This kind of simulator is in fact a synthesis between training and design simulators and therefore offers added technical capabilities well suited to Halden's needs. (author)

  8. A Total Factor Productivity Toolbox for MATLAB

    NARCIS (Netherlands)

    B.M. Balk (Bert); J. Barbero (Javier); J.L. Zofío (José)

    2018-01-01

Total Factor Productivity Toolbox is a new package for MATLAB that includes functions to calculate the main Total Factor Productivity (TFP) indices and their decompositions, based on Shephard's distance functions and using Data Envelopment Analysis (DEA) programming techniques. The

  9. Virtual toolbox

    Science.gov (United States)

    Jacobus, Charles J.; Jacobus, Heidi N.; Mitchell, Brian T.; Riggs, A. J.; Taylor, Mark J.

    1993-04-01

At least three of the five senses must be fully addressed in a successful virtual reality (VR) system. Sight, sound, and touch are the most critical elements for the creation of the illusion of presence. Since humans depend so much on sight to collect information about their environment, this area has been the focus of much of the prior art in virtual reality. However, it is also crucial that we provide facilities for force, torque, and touch reflection, as well as sound replay and 3-D localization. In this paper we present a sampling of hardware and software in the virtual environment maker's 'toolbox' which can support the rapid construction of customized VR systems. We provide demonstrative examples of how some of the tools work, and we speculate about VR applications and future technology needs.

  10. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

This paper introduces the Accelerator Toolbox (AT), a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB, a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks.
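The matrix style of element-by-element modeling that codes like AT build on can be shown with the simplest element, a field-free drift. This Python sketch uses a 2x2 transverse (x, x') matrix purely for illustration; AT itself is a MATLAB toolbox tracking 6-D phase-space coordinates through compiled "pass methods".

```python
import numpy as np

def drift(L):
    """2x2 transverse transfer matrix of a field-free drift of length L:
    the position picks up L times the slope, the slope is unchanged."""
    return np.array([[1.0, L],
                     [0.0, 1.0]])

# track a particle with offset x = 1 mm and slope x' = 0.5 mrad through 2 m
state = np.array([1e-3, 0.5e-3])
print(drift(2.0) @ state)   # x grows by L * x' -> [0.002, 0.0005]
```

A beamline model is then just the ordered product of such element matrices, which is what makes the MATLAB matrix environment a natural fit.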

  11. Solar cooling. Dynamic computer simulations and parameter variations; Solare Kuehlung. Dynamische Rechnersimulationen und Parametervariationen

    Energy Technology Data Exchange (ETDEWEB)

    Adam, Mario; Lohmann, Sandra [Fachhochschule Duesseldorf (Germany). E2 - Erneuerbare Energien und Energieeffizienz

    2011-05-15

The research project 'Solar cooling in the Hardware-in-the-Loop Test' is funded by the BMBF and deals with the modeling of a pilot plant for solar cooling with the 17.5 kW absorption chiller of Yazaki in the simulation environment of MATLAB/Simulink with the toolboxes Stateflow and CARNOT. Dynamic simulations and parameter variations following the design-of-experiments methodology are used to select meaningful system configurations, control strategies, and dimensioning of the components. The results of these simulations are presented, together with an outlook on applying the acquired knowledge to the planned laboratory tests on a hardware-in-the-loop test stand. (orig.)

  12. Accelerator Modeling with MATLAB Accelerator Toolbox

    International Nuclear Information System (INIS)

    2002-01-01

This paper introduces the Accelerator Toolbox (AT), a collection of tools to model storage rings and beam transport lines in the MATLAB environment. The objective is to illustrate the flexibility and efficiency of the AT-MATLAB framework. The paper discusses three examples of problems that are analyzed frequently in connection with ring-based synchrotron light sources.

  13. Distributed Aerodynamic Sensing and Processing Toolbox

    Science.gov (United States)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics together with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. The DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines the correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority to be increased through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as in the aerodynamic performance of wings for increased range/endurance of manned/unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and the development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  15. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    Science.gov (United States)

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
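The per-subject parallelism described above follows a generic pattern: each subject's pipeline is independent, so subjects can be mapped onto a pool of worker processes. A minimal Python sketch of that pattern follows; PANDA itself is a MATLAB toolbox built on PSOM, and `process_subject` is a hypothetical stand-in for a subject's DICOM/NIfTI-to-FA/MD pipeline.

```python
from multiprocessing import Pool

def process_subject(subject_dir):
    # Hypothetical stand-in for one subject's full pipeline:
    # convert DICOM/NIfTI, fit the diffusion tensor, write FA/MD maps.
    return f"{subject_dir}: FA/MD written"

if __name__ == "__main__":
    subjects = [f"sub-{i:02d}" for i in range(1, 5)]
    with Pool(processes=2) as pool:      # in practice, one worker per core
        results = pool.map(process_subject, subjects)
    print(results)
```

Because the subjects never share state, throughput scales with the number of workers, which is why batch studies with many datasets benefit so much from this design.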

  16. FIT3D toolbox: multiple view geometry and 3D reconstruction for Matlab

    NARCIS (Netherlands)

    Esteban, I.; Dijk, J.; Groen, F.

    2010-01-01

    FIT3D is a Toolbox built for Matlab that aims at unifying and distributing a set of tools that will allow the researcher to obtain a complete 3D model from a set of calibrated images. In this paper we motivate and present the structure of the toolbox in a tutorial and example based approach. Given

  17. Rad Toolbox User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Eckerman, Keith F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sjoreen, Andrea L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2013-05-01

The Radiological Toolbox software developed by Oak Ridge National Laboratory (ORNL) for the U.S. Nuclear Regulatory Commission (NRC) is designed to provide electronic access to the vast and varied data that underlie the field of radiation protection. These data represent physical, chemical, anatomical, physiological, and mathematical parameters detailed in the various handbooks a health physicist might consult while in the office. The initial motivation for the software was to serve the needs of health physicists away from their offices and without access to their handbooks, e.g., NRC inspectors. The earlier releases of the software were widely used and accepted around the world, not only by practicing health physicists but also by those in educational programs. This release updates the software to accommodate changes in Windows operating systems and, in some aspects, radiation protection. This release has been tested on Windows 7 and 8 and on 32- and 64-bit machines. The nuclear decay data have been updated, and thermal neutron capture cross sections and cancer risk coefficients have been included. This document and the software's user's guide provide further details and documentation of the information captured within the Radiological Toolbox.

  18. Neuro-QOL and the NIH Toolbox: implications for epilepsy

    Science.gov (United States)

    Nowinski, Cindy J; Victorson, David; Cavazos, Jose E; Gershon, Richard; Cella, David

    2011-01-01

    The impact of neurological disorders on the lives of patients is often far more complex than what is measured in routine examination. Measurement of this impact can be challenging owing to a lack of brief, psychometrically sound and generally accepted instruments. Two NIH-funded initiatives are developing assessment tools, in English and Spanish, which address these issues, and should prove useful to the study and treatment of epilepsy and other neurological conditions. The first, Neuro-QOL, has created a set of health-related quality of life measures that are applicable for people with common neurological disorders. The second, the NIH Toolbox for the Assessment of Neurological and Behavioral Function, is assembling measures of cognitive, emotional, motor and sensory health and function that can be used across all ages, from 3 to 85 years. This article describes both the projects and their potential value to epilepsy treatment and research. PMID:21552344

  19. The panacea toolbox of a PhD biomedical student.

    Science.gov (United States)

    Skaik, Younis

    2014-01-01

Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; luck and genius might also play a small role, but one can pass all PhD phases without either. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.

  20. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

To reduce the patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, true signal-to-noise properties are mainly determined by image quality in the projections. We are developing an analytical simulation platform describing projections, to investigate how the quantum-interaction physics in each component of a CT system affects image quality in the projections. This simulator will be very useful for the economical design and optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform, with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The remaining work before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented, with discussions of its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method is simple and can be used conveniently in a lab environment
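The core of such a platform, generating projection data from an object model, can be illustrated with a bare-bones parallel-beam forward projector: rotate the object, then integrate along detector columns. The numpy-only sketch below deliberately omits everything the cascaded model above is about (spectra, scatter and crosstalk kernels, photon-counting statistics); it shows only the geometric skeleton.

```python
import numpy as np

def rotate_nn(image, theta_deg):
    """Nearest-neighbour rotation of a square image about its centre."""
    n = image.shape[0]
    c = (n - 1) / 2.0
    t = np.deg2rad(theta_deg)
    yy, xx = np.mgrid[0:n, 0:n]
    # inverse-rotate each output pixel back into the input image
    xs = np.cos(t) * (xx - c) + np.sin(t) * (yy - c) + c
    ys = -np.sin(t) * (xx - c) + np.cos(t) * (yy - c) + c
    xi, yi = np.rint(xs).astype(int), np.rint(ys).astype(int)
    inside = (xi >= 0) & (xi < n) & (yi >= 0) & (yi < n)
    out = np.zeros_like(image)
    out[inside] = image[yi[inside], xi[inside]]
    return out

def forward_project(image, angles_deg):
    """Parallel-beam sinogram: one row of line integrals per view angle."""
    sino = np.empty((len(angles_deg), image.shape[1]))
    for i, theta in enumerate(angles_deg):
        sino[i] = rotate_nn(image, theta).sum(axis=0)
    return sino

phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0                  # a simple square object
sino = forward_project(phantom, np.arange(0.0, 180.0, 10.0))
print(sino.shape)                            # (18, 64)
```

Each physics effect in a cascaded model can then be inserted as a stage acting on these ideal line integrals (e.g., blurring by a scatter kernel, then adding counting noise).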

  1. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, and behavioral response (e.g., saccade) detection and extraction, with several options available for data plotting and statistics. In particular, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level.
With the multitude of general-purpose functions provided
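One concrete step in such a pipeline, detecting spike times before sorting, can be sketched with a classic negative-threshold rule on a robust noise estimate. All names and parameters below are illustrative stand-ins, not the NeoAnalysis API; real offline sorting also extracts and clusters waveform shapes.

```python
import numpy as np

def detect_spikes(trace, fs, thresh_sd=5.0, refractory_ms=1.0):
    """Indices where the trace first crosses below -thresh_sd robust SDs.
    The noise SD is estimated as median(|v|)/0.6745, which is insensitive
    to the spikes themselves; a refractory window suppresses double counts."""
    thresh = thresh_sd * np.median(np.abs(trace)) / 0.6745
    crossings = np.flatnonzero((trace[1:] < -thresh) & (trace[:-1] >= -thresh))
    refractory = int(refractory_ms * fs / 1000)
    spikes = []
    for idx in crossings:
        if not spikes or idx - spikes[-1] > refractory:
            spikes.append(idx)
    return np.array(spikes)

fs = 30_000                                  # 30 kHz sampling rate
rng = np.random.default_rng(2)
trace = rng.normal(0.0, 1.0, fs)             # 1 s of unit-variance noise
trace[[5_000, 20_000]] -= 12.0               # inject two large "spikes"
spikes = detect_spikes(trace, fs)
print(spikes)                                # indices just before 5000 and 20000
```

The robust (median-based) noise estimate is the key detail: a plain standard deviation would be inflated by the spikes, raising the threshold and missing small events.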

  2. GISMO: A MATLAB toolbox for seismic research, monitoring, & education

    Science.gov (United States)

    Thompson, G.; Reyes, C. G.; Kempler, L. A.

    2017-12-01

GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g., FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.). It can handle waveform data that cross file boundaries. All this alleviates one of the most time-consuming tasks for scientists developing their own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built into GISMO, such as record section plots, spectrograms, depth-time sections, event counts per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub, with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and the University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere. As an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g., web-based spectrograms), which has been used by the Alaska Volcano Observatory since 1998 and became the prototype for the USGS

  3. TopoToolbox: using sensor topography to calculate psychologically meaningful measures from event-related EEG/MEG.

    Science.gov (United States)

    Tian, Xing; Poeppel, David; Huber, David E

    2011-01-01

    The open-source toolbox "TopoToolbox" is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004).
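The central quantity, topographic similarity, boils down to comparing two sensor-space vectors independently of their overall magnitude, so that a scaled-up version of the same source configuration still counts as "the same topography". A minimal Python sketch using the cosine of the angle between topographies follows; TopoToolbox is a MATLAB suite, and its actual statistic and significance testing differ.

```python
import numpy as np

def topo_similarity(a, b):
    """Cosine of the angle between two topographies (one value per sensor):
    1 means an identical spatial pattern, 0 orthogonal, -1 inverted polarity."""
    a = np.asarray(a, float).ravel()
    b = np.asarray(b, float).ravel()
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

pattern = np.random.default_rng(1).normal(size=64)  # a 64-sensor topography
scaled = 2.5 * pattern                              # stronger response, same sources
print(topo_similarity(pattern, scaled))             # ~1.0: same topography
print(topo_similarity(pattern, -pattern))           # ~-1.0: polarity reversed
```

Because the measure ignores magnitude, magnitude itself can then be estimated separately per individual by projecting each condition onto that individual's own pattern, which is the logic behind the toolbox's cross-subject statistics.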

  4. Agent-Based Simulations for Project Management

    Science.gov (United States)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique in use at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
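The resource-based idea, duration as an output rather than an input, can be reduced to a one-line model: a task's duration emerges from its work content, staffing, and productivity, so management actions move the estimate. The linear formula and names below are an illustrative sketch, not the cited tool's actual model.

```python
def task_duration(work_hours, n_resources, productivity=1.0):
    """Duration emerges from work content, staffing and productivity,
    instead of being typed in as a fixed input as in CPM.  Illustrative
    linear model only; a full simulation would add feedback effects."""
    return work_hours / (n_resources * productivity)

# 400 h of work: changing staffing or productivity moves the finish date
print(task_duration(400, 2))        # 200.0 hours
print(task_duration(400, 4))        # 100.0 hours
print(task_duration(400, 4, 0.8))   # ~125 hours after a productivity loss
```

In a CPM network the 200-hour figure would be typed in by the planner; here it reacts automatically when the simulation adds resources to a late task or degrades productivity.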

  5. Non-stationary Condition Monitoring of large diesel engines with the AEWATT toolbox

    DEFF Research Database (Denmark)

    Pontoppidan, Niels Henrik; Larsen, Jan; Sigurdsson, Sigurdur

    2005-01-01

We are developing a specialized toolbox for non-stationary condition monitoring of large 2-stroke diesel engines based on acoustic emission measurements. The main contribution of this toolbox has so far been the utilization of adaptive linear models such as Principal and Independent Component Analysis... The inversion of those angular timing changes, called "event alignment", has allowed for condition monitoring across operation load settings, successfully enabling a single model to be used with realistic data under varying operational conditions...

  6. EasyCloneYALI: CRISPR/Cas9-based synthetic toolbox for engineering of the yeast Yarrowia lipolytica

    DEFF Research Database (Denmark)

    Holkenbrink, Carina; Dam, Marie Inger; Kildegaard, Kanchana Rueksomtawin

    2018-01-01

Here, we present the EasyCloneYALI genetic toolbox, which allows streamlined strain construction with high genome editing efficiencies in Y. lipolytica via the CRISPR/Cas9 technology. The toolbox allows marker-free integration of gene expression vectors into characterized genome sites as well as marker-free deletion of genes with the help of CRISPR/Cas9. Genome editing efficiencies above 80% were achieved with transformation protocols using non-replicating DNA repair fragments (such as DNA oligos). Furthermore, the toolbox includes a set of integrative gene expression vectors with prototrophic markers...

  7. Screening and assessment of chronic pain among children with cerebral palsy: a process evaluation of a pain toolbox.

    Science.gov (United States)

    Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna

    2018-06-08

Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included a client electronic medical record audit, document review, and healthcare provider surveys and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind that in Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among the contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has the potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation: It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product. The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment

  8. Testing adaptive toolbox models: a Bayesian hierarchical approach

    NARCIS (Netherlands)

    Scheibehenne, B.; Rieskamp, J.; Wagenmakers, E.-J.

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often

  9. Development of an Ontology-Directed Signal Processing Toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Stephen W. Lang

    2011-05-27

    This project was focused on the development of tools for the automatic configuration of signal processing systems. The goal is to develop tools that will be useful in a variety of Government and commercial areas and usable by people who are not signal processing experts. In order to get the most benefit from signal processing techniques, deep technical expertise is often required to select appropriate algorithms, combine them into a processing chain, and tune algorithm parameters for best performance on a specific problem. A significant benefit would therefore result from the assembly of a toolbox of processing algorithms selected for their effectiveness in a group of related problem areas, along with the means to allow people who are not signal processing experts to reliably select, combine, and tune these algorithms to solve specific problems. Defining a vocabulary for problem domain experts that is sufficiently expressive to drive the configuration of signal processing functions will allow the expertise of signal processing experts to be captured in rules for automated configuration. In order to test the feasibility of this approach, we addressed a lightning classification problem, which was proposed by DOE as a surrogate for problems encountered in nuclear nonproliferation data processing. We coded a toolbox of low-level signal processing algorithms for extracting features of RF waveforms, and demonstrated a prototype tool for screening data. We showed examples of using the tool for expediting the generation of ground-truth metadata, for training a signal recognizer, and for searching for signals with particular characteristics. The public benefits of this approach, if successful, will accrue to Government and commercial activities that face the same general problem - the development of sensor systems for complex environments. It will enable problem domain experts (e.g. analysts) to construct signal and image processing chains without
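
    The abstract mentions a toolbox of low-level algorithms for extracting features of RF waveforms. As a rough illustration of what such low-level extractors can look like, here is a minimal NumPy sketch; the function name and the particular feature set (RMS energy, dominant frequency, zero-crossing rate) are invented for illustration, not taken from the project:

```python
import numpy as np

def extract_features(waveform, sample_rate):
    """A few generic low-level waveform features (hypothetical examples;
    the project's actual feature set is not described in the abstract)."""
    x = np.asarray(waveform, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))                       # signal energy
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate)
    dominant_hz = freqs[np.argmax(spectrum)]             # strongest tone
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2.0     # zero-crossing rate
    return {"rms": rms, "dominant_hz": dominant_hz, "zcr": zcr}

# A pure 50 Hz tone sampled at 1 kHz for one second: the dominant
# frequency lands at 50 Hz and the RMS of a unit sine is 1/sqrt(2).
t = np.arange(0, 1, 1e-3)
feats = extract_features(np.sin(2 * np.pi * 50 * t), 1000)
```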

  10. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.
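
    One of the simplest quantitative tools in the cumulative-risk tradition described above is the hazard index, which sums per-chemical hazard quotients (estimated intake divided by reference dose). A minimal sketch with invented illustrative numbers, not real exposure or toxicity data:

```python
def hazard_index(intakes, reference_doses):
    """Hazard index: the sum of per-chemical hazard quotients
    (chronic daily intake / reference dose, in the same units,
    e.g. mg/kg-day). HI > 1 flags potential concern at screening level."""
    return sum(intakes[c] / reference_doses[c] for c in intakes)

# Illustrative, invented values:
intakes = {"chem_a": 0.002, "chem_b": 0.001}   # mg/kg-day
rfds = {"chem_a": 0.010, "chem_b": 0.005}      # mg/kg-day
hi = hazard_index(intakes, rfds)               # 0.2 + 0.2 = 0.4, below 1
```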

  11. Parallel Sequential Monte Carlo for Efficient Density Combination: The Deco Matlab Toolbox

    DEFF Research Database (Denmark)

    Casarin, Roberto; Grassi, Stefano; Ravazzolo, Francesco

    This paper presents the Matlab package DeCo (Density Combination) which is based on the paper by Billio et al. (2013) where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights...... for standard CPU computing and for Graphics Processing Unit (GPU) parallel computing. For the GPU implementation we use the Matlab parallel computing toolbox and show how to use general-purpose GPU computing almost effortlessly. This GPU implementation comes with a speed-up of the execution time of up to seventy...... times compared to a standard CPU Matlab implementation on a multicore CPU. We show the use of the package and the computational gain of the GPU version through some simulation experiments and empirical applications....

  12. Bohunice Simulator Data Collection Project

    International Nuclear Information System (INIS)

    Cillik, Ivan; Prochaska, Jan

    2002-01-01

    The paper describes the way and results of human reliability data analysis collected as a part of the Bohunice Simulator Data Collection Project (BSDCP), which was performed by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for Jaslovske Bohunice nuclear power plant consisting of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project training of V-2 control room crews was performed at VUJE-Trnava simulator. The simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect suitable data of human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden errors of operators, to be able to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part gives also an overview of collected data scope, human error taxonomy, and state classifications for SBEOP instructions coding. The second part describes analytical work undertaken to describe time distribution necessary for execution of various kinds of instructions performed by operators according to the classification for coding of SBEOP instructions. It also presents the methods used for determination of probability distribution for different operator errors. Results from the data evaluation are presented in the last part of the paper. 
An overview of

  14. Semi-automated Robust Quantification of Lesions (SRQL) Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori L Ito

    2017-05-01

    Quantifying lesions in a reliable manner is fundamental for studying the effects of neuroanatomical changes related to recovery in the post-stroke brain. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This often makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. Thus, we developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.557114) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space. Here, we describe the methods implemented in the toolbox.
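
    The first step, a white matter intensity correction, can be sketched schematically: voxels inside the lesion mask whose intensities look like healthy white matter are dropped. The NumPy sketch below illustrates the idea only; the threshold rule, function name, and toy data are assumptions, not the SRQL implementation:

```python
import numpy as np

def wm_intensity_correction(image, lesion_mask, wm_mask, n_sd=2.0):
    """Drop likely healthy white-matter voxels from a lesion mask.

    Schematic illustration: voxels in the lesion mask whose intensity
    lies within n_sd standard deviations of the healthy white-matter
    mean are removed from the mask."""
    wm_vals = image[wm_mask]
    mu, sd = wm_vals.mean(), wm_vals.std()
    healthy_like = np.abs(image - mu) <= n_sd * sd
    return lesion_mask & ~healthy_like

# Toy 1-D "image": four healthy WM voxels around 100, one dark lesion
# voxel (30), and one WM-like voxel (98) mistakenly drawn into the mask.
image = np.array([100.0, 102.0, 98.0, 100.0, 30.0, 98.0])
wm_mask = np.array([True, True, True, True, False, False])
lesion_mask = np.array([False, False, False, False, True, True])
corrected = wm_intensity_correction(image, lesion_mask, wm_mask)
# The dark voxel stays in the mask; the WM-like voxel is removed.
```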

  15. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Erickson, S.A.

    1984-01-01

    This presentation first discusses the motivation for the AI Simulation Fusion project. After discussing very briefly what expert systems are in general, what object oriented languages are in general, and some observed features of typical combat simulations, it discusses why putting together artificial intelligence and combat simulation makes sense. We then talk about the first demonstration goal for this fusion project.

  17. Process evaluation of a Toolbox-training program for construction foremen in Denmark

    DEFF Research Database (Denmark)

    Jeschke, Katharina Christiane; Kines, Pete; Rasmussen, Liselotte

    2017-01-01

    foremen’s knowledge and communication skills in daily planning of work tasks and their related OSH risks on construction sites. The program builds on the popular 'toolbox meeting' concept; however, there is very little research evaluating these types of meetings. This article describes the development...... and revised until the final version after the fifth group. The evaluation utilized an action research strategy with a mixed-methods approach of triangulating questionnaire, interview, and observation data. Process evaluation results showed that the eight Toolbox-training topics were relevant and useful...

  18. III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.

    Science.gov (United States)

    Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L

    2013-08-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.
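
    One common way to score sequence-reproduction tasks of this kind is to count adjacent pairs of the target sequence that the participant also placed adjacently and in the correct order. The sketch below illustrates that scheme; it is an assumption for illustration, not necessarily the exact NIH Toolbox scoring formula:

```python
def adjacent_pairs_score(target, response):
    """Count adjacent pairs of the target sequence that also appear
    adjacent, in the same order, in the response (an illustrative
    scoring scheme, not necessarily the NIH Toolbox formula)."""
    pos = {item: i for i, item in enumerate(response)}
    return sum(1 for a, b in zip(target, target[1:]) if pos[b] - pos[a] == 1)

# A 6-picture sequence: a perfect reproduction yields 5 correct pairs;
# swapping the first two pictures breaks two of them.
target = ["cat", "dog", "sun", "car", "tree", "hat"]
perfect = adjacent_pairs_score(target, target)
swapped = adjacent_pairs_score(target, ["dog", "cat", "sun", "car", "tree", "hat"])
```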

  19. Project Schedule Simulation

    DEFF Research Database (Denmark)

    Mizouni, Rabeb; Lazarova-Molnar, Sanja

    2015-01-01

    overrun both their budget and time. To improve the quality of initial project plans, we show in this paper the importance of (1) reflecting features’ priorities/risk in task schedules and (2) considering uncertainties related to human factors in plan schedules. To make simulation tasks reflect features......’ priority as well as multimodal team allocation, enhanced project schedules (EPS), where remedial actions scenarios (RAS) are added, were introduced. They reflect potential schedule modifications in case of uncertainties and promote a dynamic sequencing of involved tasks rather than the static conventional......

  20. ARCTIS — A MATLAB® Toolbox for Archaeological Imaging Spectroscopy

    Directory of Open Access Journals (Sweden)

    Clement Atzberger

    2014-09-01

    Imaging spectroscopy acquires imagery in hundreds or more narrow contiguous spectral bands. This offers unprecedented information for archaeological research. To extract the maximum of useful archaeological information from it, however, a number of problems have to be solved. Major problems relate to data redundancy and the visualization of the large amount of data. This makes data mining approaches necessary, as well as efficient data visualization tools. Additional problems relate to data quality. Indeed, the upwelling electromagnetic radiation is recorded in small spectral bands that are only about ten nanometers wide. The signal received by the sensor is thus quite low compared to sensor noise and possible atmospheric perturbations. The often small instantaneous field of view (IFOV), essential for archaeologically relevant imaging spectrometer datasets, further limits the useful signal stemming from the ground. The combination of both effects makes radiometric smoothing techniques mandatory. The present study details the functionality of a MATLAB®-based toolbox, called ARCTIS (ARChaeological Toolbox for Imaging Spectroscopy), for filtering, enhancing, analyzing, and visualizing imaging spectrometer datasets. The toolbox addresses the above-mentioned problems. Its Graphical User Interface (GUI) is designed to allow non-experts in remote sensing to extract a wealth of information from imaging spectroscopy for archaeological research. ARCTIS will be released under a creative commons license, free of charge, via website (http://luftbildarchiv.univie.ac.at).

  1. An open-source toolbox for automated phenotyping of mice in behavioral tasks

    Directory of Open Access Journals (Sweden)

    Tapan P Patel

    2014-10-01

    Classifying behavior patterns in mouse models of neurological, psychiatric and neurodevelopmental disorders is critical for understanding disease causality and treatment. However, complete characterization of behavior is time-intensive, prone to subjective scoring, and often requires specialized equipment. Although several reports describe automated home-cage monitoring and individual task scoring methods, we report the first open source, comprehensive toolbox for automating the scoring of several common behavior tasks used by the neuroscience community. We show this new toolbox is robust and achieves equal or better consistency when compared to manual scoring methods. We use this toolbox to study the alterations in behavior that occur following blast-induced traumatic brain injury (bTBI), and study if these behavior patterns are altered following genetic deletion of the transcription factor Ets-like kinase 1 (Elk-1). Due to the role of Elk-1 in neuronal survival and proposed role in synaptic plasticity, we hypothesized that Elk-1 deletion would improve some neurobehavioral deficits, while impairing others, following blast exposure. In Elk-1 knockout animals, deficits in open field, spatial object recognition and elevated zero maze performance after blast exposure disappeared, while new significant deficits appeared in spatial and associative memory. These are the first data suggesting a molecular mediator of anxiety deficits following blast-induced traumatic brain injury, and represent the utility of the broad screening tool we developed. More broadly, we envision this open-source toolbox will provide a more consistent and rapid analysis of behavior across many neurological diseases, promoting the rapid discovery of novel pathways mediating disease progression and treatment.

  2. Geoplotlib: a Python Toolbox for Visualizing Geographical Data

    OpenAIRE

    Cuttone, Andrea; Lehmann, Sune; Larsen, Jakob Eg

    2016-01-01

    We introduce geoplotlib, an open-source python toolbox for visualizing geographical data. geoplotlib supports the development of hardware-accelerated interactive visualizations in pure python, and provides implementations of dot maps, kernel density estimation, spatial graphs, Voronoi tessellation, shapefiles and many more common spatial visualizations. We describe geoplotlib's design, functionalities and use cases.

  3. 40 CFR 141.719 - Additional filtration toolbox components.

    Science.gov (United States)

    2010-07-01

    ... taken from a surface water or GWUDI source. A cap, such as GAC, on a single stage of filtration is not... separate stage of filtration if both filtration stages treat entire plant flow taken from a surface water... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Additional filtration toolbox...

  4. The SCAR project - accidental thermal-hydraulics: from the simulation to the simulators

    International Nuclear Information System (INIS)

    Farvacque, M.; Faydide, B.; Parent, M.; Iffenecker, F.; Pentori, B.; Dumas, J.M.

    2000-01-01

    The integration of the CATHARE code in reactor simulators was completed in the early 1990s with the design of the SIPA1 and SIPA2 simulators. The SCAR project (Simulator CAthare Release), presented in this paper, is the continuation of this work. The objective is the adaptation of a reference CATHARE code version to the simulator environment, in order to achieve convergence between the safety analysis tool and the simulator. (A.L.B.)

  5. Wind turbine blockset in Saber. General overview and description of the models

    DEFF Research Database (Denmark)

    Iov, Florin; Timbus, Adrian Vasile; Hansen, Anca Daniela

    This report presents a newly developed Saber toolbox for wind turbine applications. This toolbox has been developed during the research project “Simulation Platform to model, optimize and design wind turbines”. The report provides a quick overview of Saber and then explains the structure...... of this simulation package, which is different from other tools, e.g. Matlab/Simulink. Then the structure of the toolbox is shown as well as the description of the developed models. The main focus here is to underline the special structure of the models, which are a mixture of Saber built-in blocks and newly developed...... blocks. Since the developed models are based on Saber built-in blocks, a description of the libraries from Saber is given. Then some simulation results using the developed models are shown. Finally some general conclusions regarding this newly developed toolbox as well as some directions for future work...

  7. Intelligent power plant simulator for educational purposes

    International Nuclear Information System (INIS)

    Seifi, A.; Seifi, H.; Ansari, M. R.; Parsa Moghaddam, M.

    2001-01-01

    An Intelligent Tutoring System can be effectively employed in a power plant simulator so that the need for an instructor is minimized. In this paper, using the above concept together with object-oriented programming and the SIMULINK toolbox of MATLAB, an intelligent tutoring power plant simulator is proposed. Its successful application to a typical 11 MW power plant is demonstrated.

  8. Full scope upgrade project for the Fermi 2 simulator

    International Nuclear Information System (INIS)

    Bollacasa, D.; Gonsalves, J.B.; Newcomb, P.C.

    1994-01-01

    The Detroit Edison Company (DECO) contracted the Simulation Division of Asea Brown Boveri (ABB) to perform a full scope upgrade of the Fermi 2 simulator. The Fermi 2 plant is a BWR 6 generation Nuclear Steam Supply System (NSSS). The project included the complete replacement of the existing simulation model software with ABB's high fidelity BWR models, the addition of an advanced instructor station facility and new simulation computers. Also provided on the project were ABB's advanced simulation environment (CETRAN), a comprehensive configuration management system based on a modern relational database system and a new computer interface to the input/output system. (8 refs., 2 figs.)

  9. COEL: A Cloud-based Reaction Network Simulator

    Directory of Open Access Journals (Sweden)

    Peter eBanda

    2016-04-01

    Chemical Reaction Networks (CRNs) are a formalism to describe the macroscopic behavior of chemical systems. We introduce COEL, a web- and cloud-based CRN simulation framework that does not require a local installation, runs simulations on a large computational grid, provides reliable database storage, and offers a visually pleasing and intuitive user interface. We present an overview of the underlying software, the technologies, and the main architectural approaches employed. Some of COEL's key features include ODE-based simulations of CRNs and multicompartment reaction networks with rich interaction options, a built-in plotting engine, automatic DNA-strand displacement transformation and visualization, SBML/Octave/Matlab export, and a built-in genetic-algorithm-based optimization toolbox for rate constants. COEL is an open-source project hosted on GitHub (http://dx.doi.org/10.5281/zenodo.46544), which allows interested research groups to deploy it on their own server. Regular users can simply use the web instance at no cost at http://coel-sim.org. The framework is ideally suited for collaborative use in both research and education.
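
    ODE-based simulation of a CRN, as mentioned above, amounts to integrating mass-action rate equations. A self-contained sketch for the toy network A + B → C with rate constant k = 1, written independently of COEL (which is a web application, not this Python code); a simple fixed-step Euler integrator suffices for illustration:

```python
# Mass-action ODEs for A + B -> C: d[A]/dt = d[B]/dt = -k[A][B],
# d[C]/dt = +k[A][B]. Integrated with fixed-step Euler.
k = 1.0
a, b, c = 1.0, 0.8, 0.0        # initial concentrations
dt, t_end = 1e-3, 10.0
for _ in range(int(t_end / dt)):
    flux = k * a * b           # instantaneous mass-action rate
    a -= flux * dt
    b -= flux * dt
    c += flux * dt
# Invariants of this network: a - b and a + c stay constant, and B
# (the limiting species) is nearly exhausted by t = 10.
```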

  10. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction

    Science.gov (United States)

    Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We show a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.
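
    The pipeline described above (flattened landmark shapes → PCA dimension reduction → a discriminant direction between groups) can be sketched in NumPy on synthetic data. The data and parameters below are invented stand-ins, not cerebellar shapes, and the discriminant step uses a simplified class-mean direction rather than full LDA:

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_group, n_landmarks = 40, 30
base = rng.normal(size=(n_landmarks, 2))
# Group 1: noisy copies of a base shape; group 2: the base shape shrunk
# by 20% (a crude stand-in for regional atrophy).
g1 = base[None] + 0.05 * rng.normal(size=(n_per_group, n_landmarks, 2))
g2 = 0.8 * base[None] + 0.05 * rng.normal(size=(n_per_group, n_landmarks, 2))
X = np.vstack([g1, g2]).reshape(2 * n_per_group, -1)

# PCA via SVD of the centered data matrix; keep 10 components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:10].T

# Discriminant direction: difference of the group means in PCA space
# (a simplified substitute for LDA). Projections should separate cleanly.
w = scores[:n_per_group].mean(axis=0) - scores[n_per_group:].mean(axis=0)
proj = scores @ w
threshold = proj.mean()
accuracy = (np.sum(proj[:n_per_group] > threshold)
            + np.sum(proj[n_per_group:] <= threshold)) / (2 * n_per_group)
```

Sampling points along `w` and mapping back through `Vt` is the analogue of the toolbox's synthetic-shape reconstruction step.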

  12. User Guide for Compressible Flow Toolbox Version 2.1 for Use With MATLAB® Version 7

    Science.gov (United States)

    Melcher, Kevin J.

    2006-01-01

    This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas dissociative effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: isentropic-flow equations, Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), normal-shock equations, oblique-shock equations, and Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
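
    As an example of the classical relations such a toolbox solves, the isentropic-flow temperature and pressure ratios follow directly from the Mach number. A small Python sketch of the standard textbook formulas, written independently of the MATLAB toolbox:

```python
def isentropic_ratios(mach, gamma=1.4):
    """Static-to-stagnation ratios for isentropic flow of a perfect gas:
    T/T0 = (1 + (gamma-1)/2 * M^2)^-1 and p/p0 = (T/T0)^(gamma/(gamma-1))."""
    t_ratio = 1.0 / (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
    p_ratio = t_ratio ** (gamma / (gamma - 1.0))
    return t_ratio, p_ratio

# At Mach 1 in air (gamma = 1.4): T/T0 = 1/1.2 and p/p0 ≈ 0.5283.
t_ratio, p_ratio = isentropic_ratios(1.0)
```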

  13. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    Science.gov (United States)

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. The new toolbox presented here implements fast and data-robust computations of the most relevant
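
    The core computation such a toolbox performs is an information estimate plus a bias correction. The sketch below shows a plug-in mutual information estimate with the simple Miller-Madow correction, one of the milder corrections of the kind the abstract refers to; it is illustrative code, not the toolbox's own routines:

```python
import numpy as np

def entropy_plugin(counts):
    """Plug-in (maximum-likelihood) entropy in bits from an array of counts."""
    p = np.asarray(counts, dtype=float).ravel()
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information_mm(joint_counts):
    """Plug-in mutual information with the Miller-Madow bias correction.

    Each plug-in entropy underestimates the truth by roughly
    (occupied bins - 1) / (2 N ln 2) bits; the three per-entropy
    corrections combine into a single term for the MI."""
    joint = np.asarray(joint_counts, dtype=float)
    n = joint.sum()
    mi = (entropy_plugin(joint.sum(axis=1))     # H(X)
          + entropy_plugin(joint.sum(axis=0))   # H(Y)
          - entropy_plugin(joint))              # H(X, Y)
    kx = np.count_nonzero(joint.sum(axis=1))
    ky = np.count_nonzero(joint.sum(axis=0))
    kxy = np.count_nonzero(joint)
    correction = ((kx - 1) + (ky - 1) - (kxy - 1)) / (2.0 * n * np.log(2))
    return mi + correction

# Independent table: MI near 0 bits. Perfectly coupled table: near 1 bit.
mi_indep = mutual_information_mm([[25, 25], [25, 25]])
mi_coupled = mutual_information_mm([[50, 0], [0, 50]])
```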

  14. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms had so far been tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion The new toolbox presented here implements fast
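
    The limited-sampling bias and its correction, central to the abstract above, can be illustrated with a minimal sketch: a plug-in mutual information estimate plus the classic Miller-Madow correction. This is one simple member of the family of bias-correction techniques such toolboxes implement, not the toolbox's own algorithm; all function names below are illustrative.

```python
import numpy as np

def plugin_entropy(counts):
    """Naive (plug-in) entropy estimate in bits from a histogram."""
    counts = np.asarray(counts, dtype=float).ravel()
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mm_entropy(counts):
    """Plug-in entropy plus the Miller-Madow limited-sampling correction:
    add (occupied bins - 1) / (2 N ln 2) bits."""
    counts = np.asarray(counts, dtype=float).ravel()
    n = counts.sum()
    r = np.count_nonzero(counts)          # number of occupied response bins
    return plugin_entropy(counts) + (r - 1) / (2.0 * n * np.log(2))

def mutual_information(x, y, estimator=plugin_entropy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete sequences coded as
    small non-negative integers."""
    x, y = np.asarray(x), np.asarray(y)
    joint = np.zeros((x.max() + 1, y.max() + 1))
    np.add.at(joint, (x, y), 1)           # joint histogram of (x, y) pairs
    return (estimator(joint.sum(axis=1)) + estimator(joint.sum(axis=0))
            - estimator(joint))
```

    With few trials the plug-in estimate typically overstates information; swapping in `mm_entropy` as the estimator counteracts part of that bias.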

  15. 40 CFR 141.717 - Pre-filtration treatment toolbox components.

    Science.gov (United States)

    2010-07-01

    ... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for... paragraph. Systems using bank filtration when they begin source water monitoring under § 141.701(a) must... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Pre-filtration treatment toolbox...

  16. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    Science.gov (United States)

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse, and advanced genetic engineering tools are in urgent demand. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system, which utilizes NGG as its protospacer adjacent motif (PAM) and has good targeting specificity, can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.
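
    To illustrate the NGG PAM constraint mentioned above, here is a minimal sketch of forward-strand target scanning for SpCas9-style editing. This is a hypothetical helper, not part of the published toolbox; a real design pipeline would also scan the reverse strand and screen for off-targets.

```python
import re

def find_sgrna_targets(seq, spacer_len=20):
    """Scan the forward strand for 5'-NGG PAM sites and return
    (spacer, pam, approximate_cut_index) tuples."""
    seq = seq.upper()
    targets = []
    # zero-width lookahead so overlapping PAM sites are all reported
    for m in re.finditer(r"(?=([ACGT])GG)", seq):
        pam_start = m.start()
        if pam_start >= spacer_len:        # need a full spacer upstream
            spacer = seq[pam_start - spacer_len:pam_start]
            pam = seq[pam_start:pam_start + 3]
            # SpCas9 makes a blunt cut ~3 bp upstream (5') of the PAM
            targets.append((spacer, pam, pam_start - 3))
    return targets
```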

  17. Tadarida: A Toolbox for Animal Detection on Acoustic Recordings

    Directory of Open Access Journals (Sweden)

    Yves Bas

    2017-02-01

    Full Text Available Passive Acoustic Monitoring (PAM) recently extended to a very wide range of animals, but no available open software has been sufficiently generic to automatically treat several taxonomic groups. Here we present Tadarida, a software toolbox allowing for the detection and labelling of recorded sound events, and for the classification of any new acoustic data into known classes. It is made up of three modules handling Detection, Labelling and Classification and running on either Linux or Windows. This development resulted in the first open software (1) allowing generic sound event detection (multi-taxa), (2) providing graphical sound labelling at a single-instance level and (3) covering the whole process from sound detection to classification. This generic and modular design opens numerous reuse opportunities among (bio)acoustics researchers, especially for those managing and/or developing PAM schemes. The whole toolbox is openly developed in C++ (Detection and Labelling) and R (Classification) and stored at https://github.com/YvesBas.

  18. SRV: an open-source toolbox to accelerate the recovery of metabolic biomarkers and correlations from metabolic phenotyping datasets.

    Science.gov (United States)

    Navratil, Vincent; Pontoizeau, Clément; Billoir, Elise; Blaise, Benjamin J

    2013-05-15

    Supervised multivariate statistical analyses are often required to analyze the high-density spectral information in metabolic datasets acquired from complex mixtures in metabolic phenotyping studies. Here we present an implementation of the SRV (Statistical Recoupling of Variables) algorithm as an open-source MATLAB and GNU Octave toolbox. SRV allows the identification of similarity between consecutive variables resulting from the high-resolution bucketing. Similar variables are gathered to restore the spectral dependency within the datasets and identify metabolic NMR signals. The correlation and significance of these new NMR variables for a given effect under study can then be measured and represented on a loading plot to allow a visual and efficient identification of candidate biomarkers. Furthermore, correlations between these candidate biomarkers can be visualized on a two-dimensional pseudospectrum, representing a correlation map, helping to understand the modifications of the underlying metabolic network. The SRV toolbox is encoded in MATLAB R2008A (MathWorks, Natick, MA) and in GNU Octave. It is available free of charge at http://www.prabi.fr/redmine/projects/srv/repository with a tutorial. benjamin.blaise@chu-lyon.fr or vincent.navratil@univ-lyon1.fr.

  19. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    Science.gov (United States)

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
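
    The core quantity in such synchronisation paradigms, the asynchrony between each response and its nearest timing cue, reduces to a few lines of array arithmetic. The sketch below is illustrative only (not MatTAP's API) and assumes roughly one response per cue.

```python
import numpy as np

def asynchronies(cue_times, response_times):
    """Signed offset (seconds) of each response to its nearest cue;
    negative values mean the response anticipated the cue."""
    cues = np.asarray(cue_times, dtype=float)
    resp = np.asarray(response_times, dtype=float)
    # index of the nearest cue for every response (broadcasted distance matrix)
    nearest = np.abs(resp[:, None] - cues[None, :]).argmin(axis=1)
    return resp - cues[nearest]
```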

  20. The Matlab Radial Basis Function Toolbox

    Directory of Open Access Journals (Sweden)

    Scott A. Sarra

    2017-03-01

    Full Text Available Radial Basis Function (RBF) methods are important tools for scattered data interpolation and for the solution of partial differential equations in complexly shaped domains. The most straightforward approach used to evaluate the methods involves solving a linear system which is typically poorly conditioned. The Matlab Radial Basis Function Toolbox features a regularization method for the ill-conditioned system, extended-precision floating point arithmetic, and symmetry exploitation for the purpose of reducing flop counts of the associated numerical linear algebra algorithms.
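
    As a hedged illustration of why regularization matters here: the RBF system matrix is symmetric but often nearly singular, and a small Tikhonov (ridge) term stabilizes the solve. The sketch below uses a Gaussian kernel in 1-D; the toolbox's actual regularization method and API differ.

```python
import numpy as np

def rbf_interpolate(x_centers, f_values, x_eval, shape=2.0, mu=1e-12):
    """1-D Gaussian RBF interpolation with Tikhonov regularization:
    solve (B + mu*I) a = f, then evaluate s(x) = sum_j a_j * phi(|x - x_j|)."""
    r = np.abs(x_centers[:, None] - x_centers[None, :])
    B = np.exp(-(shape * r) ** 2)                      # system matrix
    a = np.linalg.solve(B + mu * np.eye(len(x_centers)), f_values)
    r_eval = np.abs(x_eval[:, None] - x_centers[None, :])
    return np.exp(-(shape * r_eval) ** 2) @ a
```

    With mu = 0 this reduces to plain RBF interpolation; increasing mu trades interpolation accuracy at the nodes for numerical stability.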

  1. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    Science.gov (United States)

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  2. Sentinel-3 SAR Altimetry Toolbox - Scientific Exploitation of Operational Missions (SEOM) Program Element

    Science.gov (United States)

    Benveniste, Jérôme; Lucas, Bruno; Dinardo, Salvatore

    2014-05-01

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on a proven heritage pioneered by ERS-1, ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m in SAR mode along track. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the Sentinel-3 series is planned for launch in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, but it does not have the capabilities to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales, the French Space Agency), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as net

  3. Modelling Project Feasibility Robustness by Use of Scenarios

    DEFF Research Database (Denmark)

    Moshøj, Claus Rehfeld; Leleur, Steen

    1998-01-01

    , SEAM secures a consistent inclusion of actual scenario elements in the quantitative impact modelling and facilitates a transparent project feasibility robustness analysis. SEAM is implemented as part of a decision support system with a toolbox structure applicable to different types of transport...

  4. Projective Simulation compared to reinforcement learning

    OpenAIRE

    Bjerland, Øystein Førsund

    2015-01-01

    This thesis explores the model of projective simulation (PS), a novel approach for an artificial intelligence (AI) agent. The model of PS learns by interacting with the environment it is situated in, and allows for simulating actions before real action is taken. The action selection is based on a random walk through the episodic & compositional memory (ECM), which is a network of clips that represent previously experienced percepts. The network takes percepts as inpu...

  5. MTpy: A Python toolbox for magnetotellurics

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared R.

    2014-11-01

    We present the software package MTpy that allows handling, processing, and imaging of magnetotelluric (MT) data sets. Written in Python, the code is open source, containing sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides the independent definition of classes and functions, MTpy provides wrappers and convenience scripts to call standard external data processing and modelling software. In its current state, modules and functions of MTpy work on raw and pre-processed MT data. However, rather than providing a static compilation of software, we introduce MTpy as a flexible software toolbox, whose contents can be combined and utilised according to the respective needs of the user. Just as the overall functionality of a mechanical toolbox can be extended by adding new tools, MTpy is a flexible framework, which will be dynamically extended in the future. Furthermore, it can help to unify and extend existing codes and algorithms within the (academic) MT community. In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday work-flow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a Phase Tensor pseudosection.
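
    A small example of the kind of processing step such a toolbox automates: converting an impedance element to apparent resistivity and phase via the standard magnetotelluric relations. SI units are assumed throughout; this is a generic sketch, not the MTpy API.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum magnetic permeability, H/m

def apparent_resistivity(z, freq):
    """Standard MT relations (SI units): for an impedance element Z = E/H
    at a given frequency, return rho_a = |Z|^2 / (omega * mu0) in ohm*m
    and the impedance phase in degrees."""
    omega = 2.0 * np.pi * freq
    rho_a = np.abs(z) ** 2 / (omega * MU0)
    phase = np.degrees(np.angle(z))
    return rho_a, phase
```

    Over a homogeneous half-space of resistivity rho, the relations recover rho_a = rho with a 45-degree phase, which makes a convenient sanity check.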

  6. Imbalanced-learn: A Python Toolbox to Tackle the Curse of Imbalanced Datasets in Machine Learning

    OpenAIRE

    Lemaitre , Guillaume; Nogueira , Fernando; Aridas , Christos ,

    2017-01-01

    imbalanced-learn is an open-source Python toolbox aiming at providing a wide range of methods to cope with the problem of imbalanced datasets frequently encountered in machine learning and pattern recognition. The implemented state-of-the-art methods can be categorized into 4 groups: (i) under-sampling, (ii) over-sampling, (iii) combination of over- and under-sampling, and (iv) ensemble learning methods. The proposed toolbox depends only on numpy, scipy, and scikit-learn...
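
    Category (ii) above, over-sampling, can be sketched from scratch in a few lines: naive duplication of minority-class rows until class counts balance. imbalanced-learn's own implementations are considerably richer (e.g. synthetic sample generation); names below are illustrative.

```python
import numpy as np

def random_oversample(X, y, rng=None):
    """Naive random over-sampling: duplicate minority-class rows (with
    replacement) until every class matches the majority-class count."""
    rng = np.random.default_rng(rng)
    X, y = np.asarray(X), np.asarray(y)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    Xs, ys = [X], [y]
    for cls, cnt in zip(classes, counts):
        if cnt < n_max:
            idx = rng.choice(np.flatnonzero(y == cls), n_max - cnt, replace=True)
            Xs.append(X[idx])
            ys.append(y[idx])
    return np.concatenate(Xs), np.concatenate(ys)
```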

  7. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier eLopez-Calderon

    2014-04-01

    Full Text Available ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.
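
    The averaging, algebraic-recombination, and amplitude-quantification steps described above reduce to simple array operations. A minimal sketch follows; function names are illustrative, not ERPLAB's.

```python
import numpy as np

def average_erp(epochs):
    """Average a stack of EEG segments (n_trials x n_samples) into an ERP."""
    return np.asarray(epochs, dtype=float).mean(axis=0)

def difference_wave(erp_a, erp_b):
    """Algebraic recombination of two averaged ERPs (e.g. rare minus frequent)."""
    return np.asarray(erp_a) - np.asarray(erp_b)

def mean_amplitude(erp, times, t_start, t_stop):
    """Quantify a component as the mean voltage within a latency window."""
    mask = (times >= t_start) & (times <= t_stop)
    return float(np.asarray(erp)[mask].mean())
```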

  8. Designed Green Toolbox as built environment educating method-analytical comparison between two groups of students with different cultural background

    NARCIS (Netherlands)

    El Fiky, U.; Hamdy, I.; Fikry, M.

    2006-01-01

    This paper is concerned with evaluating and testing the application process of green architecture design strategies using a tool-box as a built environment educating method and a pre-design reminder. Understanding the suggested green design strategies, testing the tool-box effectiveness,

  9. Kinematic simulation and analysis of robot based on MATLAB

    Science.gov (United States)

    Liao, Shuhua; Li, Jiong

    2018-03-01

    The history of industrial automation is characterized by rapidly changing technology; the industrial robot is, without a doubt, a special kind of equipment. With the help of MATLAB's matrix and plotting capabilities, the coordinate system of each link is established in the MATLAB environment using the Denavit-Hartenberg (D-H) parameter method, and the equations of motion of the structure are formulated. The Robotics Toolbox and GUIDE are applied jointly to inverse kinematics analysis, path planning, and simulation, providing a preliminary solution to the positioning problem of the students' vehicle-mounted mechanical arm and thereby achieving the intended aim.
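
    The D-H procedure mentioned above can be sketched without the Robotics Toolbox: one homogeneous transform per link, chained into the base-to-tool transform. This assumes the standard D-H convention; names are illustrative.

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(dh_rows):
    """Chain per-link (theta, d, a, alpha) transforms into the
    base-to-end-effector transform."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_matrix(*row)
    return T
```

    For a planar two-link arm with unit link lengths, setting both joint angles to zero places the end effector at (2, 0, 0), a quick correctness check.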

  10. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    1997-01-01

    The Structural Time Domain Identification (STDI) toolbox for use with MATLAB™ is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps

  12. Robotic inspection technology - process and toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Hermes, Markus [ROSEN Group (United States), R and D Dept.]

    2005-07-01

    Pipeline deterioration grows progressively with the ultimate aging of pipeline systems (on-plot and cross-country). This includes both very localized corrosion and increasing failure probability due to fatigue cracking. Limiting regular inspection activities to the 'scrapable' part of the pipelines only will ultimately result in a pipeline system with questionable integrity. The confidence level in the integrity of these systems will drop below acceptance levels. Inspection of presently un-inspectable sections of the pipeline system becomes a must. This paper provides information on ROSEN's progress on the 'robotic inspection technology' project. The robotic inspection concept developed by ROSEN is based on a modular toolbox principle. This is mandatory: a universal 'all purpose' robot would not be reliable and efficient in resolving the postulated inspection task. A preparatory Quality Function Deployment (QFD) analysis is performed prior to the decision about the adequate robotic solution. This enhances the serviceability and efficiency of the provided technology. The word 'robotic' can be understood in its full meaning of Recognition - Strategy - Motion - Control. Cooperation of different individual systems with an established communication, e.g. utilizing Bluetooth technology, supports the robustness of the ROSEN robotic inspection approach. Besides the navigation strategy, the inspection strategy is also part of the QFD process. Multiple inspection technologies combined on a single carrier or distributed across interacting containers must be selected with a clear vision of the particular goal. (author)

  13. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
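
    The idea of conveying estimate uncertainty through simulation can be sketched as a tiny Monte Carlo: sample productivity from an assumed distribution and report effort percentiles instead of a single point estimate. The lognormal model and all parameters below are illustrative assumptions, not the paper's actual simulation model.

```python
import numpy as np

def simulate_effort(n_kloc, productivity_median, productivity_sigma,
                    n_runs=10000, seed=0):
    """Monte Carlo sketch: sample productivity (KLOC per person-month)
    from a lognormal (median = productivity_median, log-space sigma =
    productivity_sigma) and return the 10th/50th/90th effort percentiles."""
    rng = np.random.default_rng(seed)
    prod = rng.lognormal(mean=np.log(productivity_median),
                         sigma=productivity_sigma, size=n_runs)
    effort = n_kloc / prod                 # person-months per run
    return np.percentile(effort, [10, 50, 90])
```

    Reporting a percentile spread rather than one number is precisely the "level of uncertainty" framing the abstract argues for.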

  14. Expanding the UniFrac Toolbox.

    Directory of Open Access Journals (Sweden)

    Ruth G Wong

    Full Text Available The UniFrac distance metric is often used to separate groups in microbiome analysis, but requires a constant sequencing depth to work properly. Here we demonstrate that unweighted UniFrac is highly sensitive to rarefaction instance and to sequencing depth in uniform data sets with no clear structure or separation between groups. We show that this arises because of subcompositional effects. We introduce information UniFrac and ratio UniFrac, two new weightings that are not as sensitive to rarefaction and allow greater separation of outliers than classic unweighted and weighted UniFrac. With this expansion of the UniFrac toolbox, we hope to empower researchers to extract more varied information from their data.
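
    The rarefaction step the abstract refers to, subsampling each sample's counts to a common sequencing depth without replacement, can be sketched as follows (illustrative, not the authors' code):

```python
import numpy as np

def rarefy(counts, depth, seed=None):
    """Subsample a per-taxon count vector to a fixed total depth without
    replacement, as required before computing classic UniFrac."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts)
    # expand to one entry per read, labelled by taxon index
    reads = np.repeat(np.arange(counts.size), counts)
    keep = rng.choice(reads, size=depth, replace=False)
    return np.bincount(keep, minlength=counts.size)
```

    Because the draw is random, repeated rarefactions of the same sample differ, which is exactly the rarefaction-instance sensitivity the paper examines.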

  15. Participative simulation in the PSIM project

    NARCIS (Netherlands)

    Eijnatten, F.M. van; Vink, P.

    2002-01-01

    This chapter describes the background and objectives of the IST project Participative Simulation environment for Integral Manufacturing enterprise renewal (PSIM). In the short run, PSIM aims to address the key issues that have to be resolved before a manufacturing renewal can be implemented. PSIM is

  16. NIH Toolbox Cognitive Function Battery (CFB): Composite Scores of Crystallized, Fluid, and Overall Cognition

    Science.gov (United States)

    Akshoomoff, Natacha; Beaumont, Jennifer L.; Bauer, Patricia J.; Dikmen, Sureyya; Gershon, Richard; Mungas, Dan; Slotkin, Jerry; Tulsky, David; Weintraub, Sandra; Zelazo, Philip; Heaton, Robert K.

    2014-01-01

    The NIH Toolbox Cognitive Function Battery (CFB) includes 7 tests covering 8 cognitive abilities considered to be important in adaptive functioning across the lifespan (from early childhood to late adulthood). Here we present data on psychometric characteristics in children (N = 208; ages 3–15 years) of a total summary score and composite scores reflecting two major types of cognitive abilities: “crystallized” (more dependent upon past learning experiences) and “fluid” (capacity for new learning and information processing in novel situations). Both types of cognition are considered important in everyday functioning, but are thought to be differently affected by brain health status throughout life, from early childhood through older adulthood. All three Toolbox composite scores showed excellent test-retest reliability, robust developmental effects across the childhood age range considered here, and strong correlations with established, “gold standard” measures of similar abilities. Additional preliminary evidence of validity includes significant associations between all three Toolbox composite scores and maternal reports of children’s health status and school performance. PMID:23952206
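
    The generic idea of a composite score, standardizing individual test scores and combining them onto a common metric, can be sketched as below. The NIH Toolbox's actual norming procedure is more sophisticated; the mean-100/SD-15 scaling and function name here are illustrative only.

```python
import numpy as np

def composite_score(test_scores):
    """Sketch: z-score each test across participants, average the z-scores
    per participant, and rescale to a standard-score metric (mean 100, SD 15)."""
    X = np.asarray(test_scores, dtype=float)     # participants x tests
    z = (X - X.mean(axis=0)) / X.std(axis=0)     # per-test standardization
    comp = z.mean(axis=1)                        # one composite per participant
    return 100.0 + 15.0 * (comp - comp.mean()) / comp.std()
```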

  17. Versatile Cas9-driven subpopulation selection toolbox for Lactococcus lactis

    NARCIS (Netherlands)

    Els, van der Simon; James, Jennelle K.; Kleerebezem, Michiel; Bron, Peter A.

    2018-01-01

    CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector

  18. A web-based repository of surgical simulator projects.

    Science.gov (United States)

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  19. An open-source toolbox for multiphase flow in porous media

    Science.gov (United States)

    Horgue, P.; Soulaine, C.; Franc, J.; Guibert, R.; Debenest, G.

    2015-02-01

    Multiphase flow in porous media has a wide range of applications: from environmental understanding (aquifers, site pollution) to industrial process improvement (oil production, waste management). Modeling of such flows involves specific volume-averaged equations and therefore specific computational fluid dynamics (CFD) tools. In this work, we develop a toolbox for modeling multiphase flow in porous media with OpenFOAM®, an open-source platform for CFD. The underlying idea of this approach is to provide an easily adaptable tool that can be used in further studies to test new mathematical models or numerical methods. The package provides the most common effective-property models of the literature (relative permeability, capillary pressure) and specific boundary conditions related to porous media flows. To validate this package, solvers based on the IMplicit Pressure Explicit Saturation (IMPES) method are developed in the toolbox. The numerical validation is performed by comparison with analytical solutions on academic cases. Then, a satisfactory parallel efficiency of the solver is shown on a more complex configuration.
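
    One of the common effective-property models mentioned above, a Brooks-Corey-type relative permeability curve, can be sketched in a few lines. All parameter defaults are illustrative, not the toolbox's values.

```python
import numpy as np

def brooks_corey(sw, swc=0.1, sor=0.1, krw_max=0.5, kro_max=1.0,
                 nw=2.0, no=2.0):
    """Brooks-Corey-type relative permeabilities for the wetting (krw) and
    non-wetting (kro) phases as functions of wetting saturation sw, with
    connate saturation swc and residual non-wetting saturation sor."""
    se = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)  # effective saturation
    krw = krw_max * se ** nw
    kro = kro_max * (1.0 - se) ** no
    return krw, kro
```

    At the end points the curves behave as expected: at sw = swc the wetting phase is immobile (krw = 0), and at sw = 1 - sor the non-wetting phase is immobile (kro = 0).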

  20. A toolbox and sample object perception data for equalization of natural images

    Directory of Open Access Journals (Sweden)

    Wilma A. Bainbridge

    2015-12-01

    Full Text Available For psychologists and neuroscientists, careful selection of their stimuli is essential, so that low-level visual features such as color or spatial frequency do not serve as confounds between conditions of interest. Here, we detail the Natural Image Statistical Toolbox, which allows scientists to measure, visualize, and control stimulus sets along a set of low-level visual properties. Additionally, we provide a set of object images varying along several perceptual object properties, including physical size and interaction envelope size (i.e., the space around an object traversed during an interaction), serving as a test-bed for the Natural Image Statistical Toolbox. This stimulus set is also a highly characterized set useful to psychology and neuroscience studies on object perception.

  1. ObsPy: A Python Toolbox for Seismology

    Science.gov (United States)

    Krischer, Lion; Megies, Tobias; Sales de Andrade, Elliott; Barsch, Robert; MacCarthy, Jonathan

    2017-04-01

    In recent years the Python ecosystem has evolved into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It does so by offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, covering waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, …) and event meta-information (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than seven years and is developed and used by scientists around the world, with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases. Additionally we will discuss the road ahead as well as the long-term sustainability of open-source scientific software.

  2. Outline of the earth simulator project

    International Nuclear Information System (INIS)

    Tani, Keiji

    2000-01-01

    The Science and Technology Agency of Japan has proposed a project to promote studies for global change prediction by an integrated three-in-one research and development approach: earth observation, basic research, and computer simulation. As part of the project, we are developing an ultra-fast computer, the 'Earth Simulator', with a sustained speed of more than 5 TFLOPS for an atmospheric circulation code. The 'Earth Simulator' is a MIMD-type distributed-memory parallel system in which 640 processor nodes are connected via a fast single-stage crossbar network. Each node consists of 8 vector-type arithmetic processors which are tightly connected via shared memory. The peak performance of the total system is 40 TFLOPS. As part of the development of the basic software system, we are developing an operation-supporting software system, a so-called 'center routine'. We are going to use an archival system as the main storage for user files. Therefore, the most important function of the center routine is the optimal scheduling not only of submitted batch jobs but also of the user files necessary for them. All the design and R and D work for both the hardware and basic software systems was completed during the last three fiscal years, FY97, 98 and 99. The manufacture of the hardware system and the development of the center routine are underway. Facilities necessary for the Earth Simulator, including buildings, are also under construction. The total system will be completed in the spring of 2002. (author)

  3. Reinforcement Toolbox, a Parametric Reinforcement Modelling Tool for Curved Surface Structures

    NARCIS (Netherlands)

    Lauppe, J.; Rolvink, A.; Coenders, J.L.

    2013-01-01

    This paper presents a computational strategy and parametric modelling toolbox which aim at enhancing the design and production process of reinforcement in freeform curved surface structures. The computational strategy encompasses the necessary steps of raising an architectural curved surface model

  4. Modeling and simulation of chillers with Dymola/Modelica; Modellierung und Simulation von Kaeltemaschinen mit Dymola/Modelica

    Energy Technology Data Exchange (ETDEWEB)

    Rettich, Daniel [Hochschule Biberach (Germany). Inst. fuer Gebaeude- und Energiesysteme (IGE)

    2012-07-01

    Within the contribution under consideration, a chiller was modeled and simulated with the program package Dymola/Modelica using the TIL toolbox. An existing refrigeration technology test bench at the University of Biberach (Federal Republic of Germany) serves as the reference for the chiller represented in the simulation. The aim of the simulation is the future use of the models in a hardware-in-the-loop (HIL) test bench in order to test different controllers with respect to their function and logic under identical framework conditions. Furthermore, the determination of energy efficiency according to the guideline VDMA 24247 is a focus both at the test bench and within the simulation. Following the final completion of the test bench, the models will be validated against it, and the model of the chiller will be connected to a detailed space model. Individual models were taken from the TIL toolbox, adapted for the application and parameterized with the design values of the laboratory chiller. Modifications to the TIL models were necessary in order to reflect the dynamic effects of the chiller in detail. For this purpose, the dynamic behaviour of the various components was investigated. Following the modeling, each model was tested against design values and manufacturer documentation. First simulation studies showed that the simulation in Dymola, including the developed models, provides plausible results. In the course of the modeling and parameterization of these modified models, a component library was developed from which different models can be extracted for future simulation studies.
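    For illustration only (the paper itself works in Dymola/Modelica with the TIL toolbox), the kind of first-order dynamic component behaviour discussed above can be sketched in Python; the time constant and step input are hypothetical values:

```python
# Minimal sketch: first-order lag y' = (u - y)/tau, explicit Euler.
# tau and the step input u are illustrative values, not from the paper.
def simulate_first_order(tau=120.0, u=1.0, dt=1.0, t_end=600.0):
    y, ys = 0.0, []
    for _ in range(int(t_end / dt)):
        y += dt * (u - y) / tau   # approach the input with time constant tau
        ys.append(y)
    return ys

ys = simulate_first_order()
# After 5 time constants (600 s with tau = 120 s) the response is ~99% of u.
print(round(ys[-1], 3))
```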

  5. Implementation of knowledge management approach in project oriented activities. Experience from simulator upgrade project

    International Nuclear Information System (INIS)

    Pironkov, L.

    2010-01-01

    Project specifics: Replacement of analogue process control system with Ovation®-based Distributed Control System; Hybrid solution with simulated I&C logic and stimulated Human Machine Interface; Initial design - "turn-key" project based on standard relationship "single customer - single contractor

  6. The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.

    Science.gov (United States)

    Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A

    2016-04-15

    A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to breaking dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox covers strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, refugia and strategic drenching, mineral/vitamin supplementation, copper oxide wire particles (COWPs), breeding and selection of resistant and resilient individuals, biological control and anthelmintic drugs. Barbervax®, the ground-breaking Haemonchus vaccine developed and currently commercially available on a pilot scale for sheep, is prime for trialling in goats and would be an invaluable inclusion to this Toolbox. The specialised behaviours of goats, specifically their preference for browsing a variety of plants and the accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their value for inclusion in the 'Toolbox' currently, and the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages and identifying and treating individual animals for haemonchosis, in addition to continuous evaluation of strategy effectiveness, is demonstrated using a model farm scenario. Selecting strategies from the Toolbox, with regard to their current availability, feasibility, economical cost

  7. Genetic toolbox for controlled expression of functional proteins in Geobacillus spp.

    Directory of Open Access Journals (Sweden)

    Ivan Pogrebnyakov

    Species of the genus Geobacillus are thermophilic bacteria and play an ever increasing role as hosts for biotechnological applications both in academia and industry. Here we screened a number of Geobacillus strains to determine which industrially relevant carbon sources they can utilize. One of the strains, G. thermoglucosidasius C56-YS93, was then chosen to develop a toolbox for controlled gene expression over a wide range of levels. It includes a library of semi-synthetic constitutive promoters (76-fold difference in expression levels) and an inducible promoter from the xylA gene. A library of synthetic, in silico designed ribosome binding sites was also created for further tuning of translation. The PxylA promoter was further used to successfully express native and heterologous xylanases in G. thermoglucosidasius. This toolbox enables fine-tuning of gene expression in Geobacillus species for metabolic engineering approaches in the production of biochemicals and heterologous proteins.

  8. Simulation of investment returns of toll projects.

    Science.gov (United States)

    2013-08-01

    This research develops a methodological framework to illustrate key stages in applying the simulation of investment returns of toll projects, acting as an example process of helping agencies conduct numerical risk analysis by taking certain uncertain...

  9. Using a Toolbox of Tailored Educational Lessons to Improve Fruit, Vegetable, and Physical Activity Behaviors among African American Women in California

    Science.gov (United States)

    Backman, Desiree; Scruggs, Valarie; Atiedu, Akpene Ama; Bowie, Shene; Bye, Larry; Dennis, Angela; Hall, Melanie; Ossa, Alexandra; Wertlieb, Stacy; Foerster, Susan B.

    2011-01-01

    Objective: Evaluate the effectiveness of the "Fruit, Vegetable, and Physical Activity Toolbox for Community Educators" ("Toolbox"), an intervention originally designed for Spanish- and English-speaking audiences, in changing knowledge, attitudes, and behavior among low-income African American women. Design: Quasi-experimental…

  10. A molecular toolbox for rapid generation of viral vectors to up- or down-regulate in vivo neuronal gene expression

    Directory of Open Access Journals (Sweden)

    Melanie D. White

    2011-07-01

    We introduce a molecular toolbox for manipulation of neuronal gene expression in vivo. The toolbox includes promoters, ion channels, optogenetic tools, fluorescent proteins and intronic artificial microRNAs. The components are easily assembled into adeno-associated virus (AAV) or lentivirus vectors using recombination cloning. We demonstrate assembly of toolbox components into lentivirus and AAV vectors and use these vectors for in vivo expression of inwardly rectifying potassium channels (Kir2.1, Kir3.1 and Kir3.2) and an artificial microRNA targeted against the ion channel HCN1 (HCN1 miR). We show that AAV assembled to express HCN1 miR produces efficacious and specific in vivo knockdown of HCN1 channels. Comparison of in vivo viral transduction using HCN1 miR with mice containing a germ-line deletion of HCN1 reveals similar physiological phenotypes in cerebellar Purkinje cells. The easy assembly and re-usability of the toolbox components, together with the ability to up- or down-regulate neuronal gene expression in vivo, may be useful for applications in many areas of neuroscience.

  11. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap.

    Science.gov (United States)

    Snigdha, Shikha; Milgram, Norton W; Willis, Sherry L; Albert, Marylin; Weintraub, S; Fortin, Norbert J; Cotman, Carl W

    2013-07-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer's disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer's disease. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. a Matlab Toolbox for Basin Scale Fluid Flow Modeling Applied to Hydrology and Geothermal Energy

    Science.gov (United States)

    Alcanie, M.; Lupi, M.; Carrier, A.

    2017-12-01

    Recent boosts in the development of geothermal energy were fostered by the latest oil crises and by the need to reduce CO2 emissions generated by the combustion of fossil fuels. Various numerical codes (e.g. FEHM, CSMP++, HYDROTHERM, TOUGH) have thus been implemented for the simulation and quantification of fluid flow in the upper crust. One possible limitation of such codes is their limited accessibility and complex structure. For this reason, we began to develop a Hydrothermal Fluid Flow Matlab library as part of MRST (Matlab Reservoir Simulation Toolbox). MRST is designed for the simulation of oil and gas problems including carbon capture and storage; however, a geothermal module is still missing. We selected the Geneva Basin as a natural laboratory because of the large amount of data available in the region. The Geneva Basin has been intensely investigated in the past with exploration wells, active seismic and gravity surveys. In addition, the energy strategy of Switzerland promotes the development of geothermal energy, which has led to recent geophysical prospecting. Previous and ongoing projects have shown the geothermal potential of the Geneva Basin, but a consistent fluid flow model assessing the deep circulation in the region is yet to be defined. The first step of the study was to create the basin-scale static model. We integrated available active seismic, gravity inversions and borehole data to describe the principal geologic and tectonic features of the Geneva Basin. Petrophysical parameters were obtained from available and widespread well logs. This required adapting MRST to standard text-format file imports and outlining a new methodology for quick static model creation in an open source environment. We implemented several basin-scale fluid flow models to test the effects of petrophysical properties on the circulation dynamics of deep fluids in the Geneva Basin. Preliminary results allow the identification of preferential fluid flow
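    As a purely illustrative sketch of the physics such toolboxes solve (this is not MRST code, and the permeabilities and geometry are hypothetical values), Darcy flux through rock layers in series can be written as:

```python
# Sketch: Darcy flux through layered rock in series,
# q = A*dp / (mu * sum(L_i/k_i))  (harmonic averaging of permeability).
def series_darcy_flux(k_layers, L_layers, mu, dp, area=1.0):
    """Volumetric flux [m^3/s] through layers in series with fixed
    pressure drop dp [Pa], viscosity mu [Pa*s], thicknesses L [m]."""
    resistance = sum(L / k for k, L in zip(k_layers, L_layers))
    return area * dp / (mu * resistance)

# Two layers: permeable sandstone over a tight marl (illustrative values).
q = series_darcy_flux(k_layers=[1e-12, 1e-15], L_layers=[500.0, 100.0],
                      mu=1e-3, dp=1e6)
print(q)   # the tight layer dominates the total resistance
```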

  13. Numerical simulation of induction heating thick-walled tubes

    Directory of Open Access Journals (Sweden)

    Lenhard Richard

    2018-01-01

    The paper shows the coupling of two toolboxes in an Ansys Workbench solution for induction heating. In Ansys Workbench, the electromagnetics program Maxwell and the flow solver Fluent have been linked. In Maxwell, a simulation of electromagnetic induction was performed, yielding data on the magnetic field distribution in the heated material, which were then transferred to Fluent, in which the induction heating simulation was performed.
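    As background (not taken from the paper), the depth to which induced currents heat the workpiece is governed by the skin depth, delta = sqrt(2*rho/(omega*mu)); the material values below are generic, illustrative figures for a magnetic steel:

```python
import math

# Skin depth delta = sqrt(2*rho/(omega*mu)).
# rho (resistivity), mu_r and f are illustrative, not from the paper.
def skin_depth(rho, mu_r, f):
    mu = 4e-7 * math.pi * mu_r      # absolute permeability [H/m]
    omega = 2 * math.pi * f         # angular frequency [rad/s]
    return math.sqrt(2 * rho / (omega * mu))

delta = skin_depth(rho=1e-6, mu_r=100.0, f=10e3)
print(f"{delta * 1e3:.2f} mm")   # ~0.5 mm: heating concentrates near the surface
```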

  14. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision concerns the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to decide which of the proposed systems or tasks should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology developed to aid management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the approach is illustrated with an example involving a simplified network of the type used to determine the price of silicon solar cells.
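    The flavour of the simulation phase can be sketched as follows; the alternatives and their outcome ranges are hypothetical, and this is not the SIMRAND code itself:

```python
import random

# Sketch of the idea: each candidate system has an uncertain outcome;
# Monte Carlo trials estimate how often each alternative is the best choice.
random.seed(1)

alternatives = {             # hypothetical (low, high) outcome ranges
    "system_A": (0.8, 1.6),
    "system_B": (0.5, 2.0),  # riskier alternative: wider spread
}

trials = 100_000
wins = {name: 0 for name in alternatives}
for _ in range(trials):
    draws = {n: random.uniform(lo, hi) for n, (lo, hi) in alternatives.items()}
    wins[max(draws, key=draws.get)] += 1

for name, w in wins.items():
    print(name, w / trials)   # estimated probability of being the best choice
```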

  15. Particle Swarm Optimization Toolbox

    Science.gov (United States)

    Grant, Michael J.

    2010-01-01

    The Particle Swarm Optimization Toolbox is a library of evolutionary optimization tools developed in the MATLAB environment. The algorithms contained in the library include a genetic algorithm (GA), a single-objective particle swarm optimizer (SOPSO), and a multi-objective particle swarm optimizer (MOPSO). Development focused on both the SOPSO and MOPSO; a GA was included mainly for comparison purposes, and the particle swarm optimizers appeared to perform better for a wide variety of optimization problems. All algorithms are capable of performing unconstrained and constrained optimization. The particle swarm optimizers are capable of performing single- and multi-objective optimization. The SOPSO and MOPSO algorithms are based on swarming theory and bird-flocking patterns to search the trade space for the optimal solution or optimal trade in competing objectives. The MOPSO generates Pareto fronts for objectives that are in competition. A GA, based on Darwinian evolutionary theory, is also included in the library. The GA consists of individuals that form a population in the design space. The population mates to form offspring at new locations in the design space. These offspring contain traits from both parents; the algorithm relies on this combination of traits to provide improved solutions over either parent. As the algorithm progresses, individuals that hold these optimal traits emerge as the optimal solutions. Due to the generic design of all optimization algorithms, each algorithm interfaces with a user-supplied objective function. This function serves as a "black box" to the optimizers: its only purpose is to evaluate solutions provided by the optimizers. Hence, the user-supplied function can be a numerical simulation, an analytical function, etc., since the specific detail of this function is of no concern to the optimizer. These algorithms were originally developed to support entry
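    A minimal single-objective PSO of the kind described can be sketched as follows; the inertia and acceleration weights are common textbook values, not parameters of the toolbox itself, and the objective is supplied as an opaque "black-box" callable exactly as the abstract describes:

```python
import random

# Minimal single-objective PSO sketch (w, c1, c2 are textbook values).
random.seed(0)

def pso(f, dim=2, n=30, iters=200, lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]               # personal best positions
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best

    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])                    # black-box objective evaluation
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

best, best_f = pso(lambda x: sum(v * v for v in x))   # sphere function
print(best_f)   # close to 0
```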

  16. Risk Consideration and Cost Estimation in Construction Projects Using Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Claudius A. Peleskei

    2015-06-01

    Construction projects usually involve high investments. They are, therefore, a risky adventure for companies, as the actual costs of construction projects nearly always exceed the planned ones. This is due to the various risks and the large uncertainty existing within this industry. Determination and quantification of risks and their impact on project costs is described as one of the most difficult areas within the construction industry. This paper analyses how the cost of construction projects can be estimated using Monte Carlo simulation. It investigates whether the different cost elements in a construction project follow a specific probability distribution. The research examines the effect of correlation between different project costs on the result of the Monte Carlo simulation. The paper finds that Monte Carlo simulation can be a helpful tool for risk managers and can be used for cost estimation of construction projects. The research has shown that cost distributions are positively skewed and cost elements seem to have some interdependent relationships.
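    The approach can be sketched as follows; the cost elements and their triangular-distribution parameters are purely illustrative, and the correlations the paper examines are ignored in this minimal version:

```python
import random

# Sketch of a Monte Carlo cost estimate with positively skewed (triangular)
# cost elements; values are in arbitrary currency units, illustrative only.
random.seed(42)

elements = {                       # (optimistic, most likely, pessimistic)
    "structure": (900, 1000, 1400),
    "services":  (250, 300, 500),
    "finishes":  (150, 200, 350),
}

trials = 50_000
totals = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in elements.values())
    for _ in range(trials)
)

mean = sum(totals) / trials
p80 = totals[int(0.8 * trials)]    # 80th-percentile budget incl. contingency
print(round(mean), round(p80))
```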

  17. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    Directory of Open Access Journals (Sweden)

    Lucian A B Purvis

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or at specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria has been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  18. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    Science.gov (United States)

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  19. NEMO: A Stellar Dynamics Toolbox

    Science.gov (United States)

    Barnes, Joshua; Hut, Piet; Teuben, Peter

    2010-10-01

    NEMO is an extendible Stellar Dynamics Toolbox, following an open-source software model. It has various programs to create, integrate, analyze and visualize N-body and SPH-like systems, following the pipe-and-filter architecture. In addition there are various tools to operate on images, tables and orbits, including FITS files to export/import to/from other astronomical data reduction packages. A large and growing fraction of NEMO has been contributed by a growing list of authors. The source code consists of a little over 4,000 files and a little under 1,000,000 lines of code and documentation, mostly C, with some C++ and Fortran. NEMO development was started in 1986 in Princeton (USA) by Barnes, Hut and Teuben. See also ZENO (ascl:1102.027) for the version that Barnes maintains.

  20. Utilization of the simulators in I and C renewal project of Loviisa NPP

    International Nuclear Information System (INIS)

    Porkholm, K.; Ahonen, A.; Tiihonen, O.

    2006-01-01

    There are two VVER-440 type reactors at the Loviisa Nuclear Power Plant. The first unit has been in operation since 1977 and the second since 1980. The availability of the plant as well as the operational experience with the I and C systems are good. However, it is obvious that the lifetime of the original I and C systems is not sufficient to guarantee the good availability of the plant in the future. Due to this fact, a project for the renewal of the existing I and C systems has been started at Loviisa Nuclear Power Plant. In the project the analogue I and C systems will be replaced by digital I and C systems in four phases during 2005-2014. Simulators will be utilized extensively in the project to assure that the renewal of the I and C systems can be realized safely and economically. An engineering simulator will be used in the design and validation of the modifications to the renewed I and C systems. A development simulator is intended for the design, testing and acceptance of the new Man-Machine Interface. A testing simulator will be used for the testing of the new I and C systems and retuning of the controllers, mainly during the Factory Acceptance Tests. A training simulator will be used to train the operators and other technical personnel in the operation of the new monitor-based control room facilities. All the simulators in the renewal project are based on the APROS (Advanced PROcess Simulator) simulation software. Fortum Nuclear Services Ltd and the Technical Research Centre of Finland have developed APROS since 1986. APROS is a good example of truly multifunctional simulation software; i.e. it can be used in process and automation design, safety analysis and training simulator applications. APROS has been used extensively for various analysis and simulation tasks of the Loviisa Nuclear Power Plant in past years. It has also been applied to various nuclear and thermal power plants elsewhere.
First a short overview of Loviisa Nuclear Power

  1. Phasor Simulator for Operator Training Project

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, Jim [Electric Power Group, Llc, Pasadena, CA (United States)

    2016-09-14

    Synchrophasor systems are being deployed in power systems throughout the North American Power Grid and there are plans to integrate this technology and its associated tools into Independent System Operator (ISO)/utility control room operations. A pre-requisite to using synchrophasor technologies in control rooms is for operators to obtain training and understand how to use this technology in real-time situations. The Phasor Simulator for Operator Training (PSOT) project objective was to develop, deploy and demonstrate a pre-commercial training simulator for operators on the use of this technology and to promote acceptance of the technology in utility and ISO/Regional Transmission Owner (RTO) control centers.

  2. Monte Carlo simulation with the Gate software using grid computing

    International Nuclear Information System (INIS)

    Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.

    2009-03-01

    Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations needing many replicates to obtain good statistical results can be easily executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in the generation of the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application thanks to the EGEE Grid (Enabling Grids for E-sciencE), achieving in one week computations that would have taken more than 3 years of processing on a single computer. This work was achieved thanks to a generic object-oriented toolbox called DistMe which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped to ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for the statuses of random number generators. (authors)
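    The 'Multiple Replications In Parallel' idea can be sketched as follows; the per-replicate seeding scheme here is a simplified illustration (DistMe manages real generator statuses via its XML format), with a toy pi estimate standing in for a tomography simulation:

```python
import random

# Sketch of 'Multiple Replications In Parallel': each replicate owns an
# independently seeded generator, so replicates can run on separate workers
# and be reproduced later. The sequential seeding is illustrative only; a
# production setup needs guaranteed-disjoint streams, which is exactly the
# precaution the paper discusses.
def replicate(seed, n=100_000):
    rng = random.Random(seed)          # private stream for this replicate
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    return 4.0 * hits / n              # pi estimate from this replicate

estimates = [replicate(seed) for seed in range(8)]   # 8 independent replicates
pi_hat = sum(estimates) / len(estimates)             # pooled result
print(round(pi_hat, 3))
```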

  3. IV. NIH Toolbox Cognition Battery (CB): measuring language (vocabulary comprehension and reading decoding).

    Science.gov (United States)

    Gershon, Richard C; Slotkin, Jerry; Manly, Jennifer J; Blitz, David L; Beaumont, Jennifer L; Schnipke, Deborah; Wallner-Allen, Kathleen; Golinkoff, Roberta Michnick; Gleason, Jean Berko; Hirsh-Pasek, Kathy; Adams, Marilyn Jager; Weintraub, Sandra

    2013-08-01

    Mastery of language skills is an important predictor of daily functioning and health. Vocabulary comprehension and reading decoding are relatively quick and easy to measure and correlate highly with overall cognitive functioning, as well as with success in school and work. New measures of vocabulary comprehension and reading decoding (in both English and Spanish) were developed for the NIH Toolbox Cognition Battery (CB). In the Toolbox Picture Vocabulary Test (TPVT), participants hear a spoken word while viewing four pictures, and then must choose the picture that best represents the word. This approach tests receptive vocabulary knowledge without the need to read or write, removing the literacy load for children who are developing literacy and for adults who struggle with reading and writing. In the Toolbox Oral Reading Recognition Test (TORRT), participants see a letter or word onscreen and must pronounce or identify it. The examiner determines whether it was pronounced correctly by comparing the response to the pronunciation guide on a separate computer screen. In this chapter, we discuss the importance of language during childhood and the relation of language and brain function. We also review the development of the TPVT and TORRT, including information about the item calibration process and results from a validation study. Finally, the strengths and weaknesses of the measures are discussed. © 2013 The Society for Research in Child Development, Inc.

  4. Improving student comprehension of the interconnectivity of the hydrologic cycle with a novel 'hydrology toolbox', integrated watershed model, and companion textbook

    Science.gov (United States)

    Huning, L. S.; Margulis, S. A.

    2013-12-01

    Concepts in introductory hydrology courses are often taught in the context of process-based modeling that is ultimately integrated into a watershed model. In an effort to reduce the learning curve associated with applying hydrologic concepts to real-world applications, we developed a 'hydrology toolbox', complementing a new companion textbook, and incorporated it into introductory undergraduate hydrology courses. The hydrology toolbox contains the basic building blocks (functions coded in MATLAB) for an integrated, spatially-distributed watershed model that makes hydrologic topics (e.g. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) more user-friendly and accessible for students. The toolbox functions can be used in a modular format so that students can study individual hydrologic processes and become familiar with the hydrology toolbox. This approach allows such courses to emphasize understanding and application of hydrologic concepts rather than computer coding or programming. While topics in introductory hydrology courses are often introduced and taught independently or semi-independently, they are inherently interconnected. The toolbox functions are therefore linked together at the end of the course to reinforce a holistic understanding of how these hydrologic processes are measured, interconnected, and modeled. They are integrated into a spatially-distributed watershed model or numerical laboratory where students can explore a range of topics such as rainfall-runoff modeling, urbanization, deforestation, and watershed response to changes in parameters or forcings. Model output can readily be visualized and analyzed by students to understand watershed response in a real river basin or a simple 'toy' basin. These tools complement the textbook, and both have been well received by students in multiple hydrology courses with various disciplinary backgrounds.
The same governing equations that students have
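    The modular idea can be sketched with a single process function, here a linear-reservoir runoff step with illustrative parameters (this is not the toolbox's MATLAB code):

```python
# Sketch of a modular process function: a linear-reservoir runoff step that
# could later be chained with other process functions into a watershed model.
# The storage coefficient k and the precipitation series are illustrative.
def linear_reservoir(storage, precip, k=0.2):
    """One time step: add precipitation, release a fixed fraction k of
    storage as runoff; returns (new storage, runoff)."""
    storage += precip
    runoff = k * storage
    return storage - runoff, runoff

storage = 0.0
hydrograph = []
for p in [10.0, 20.0, 5.0, 0.0, 0.0, 0.0]:   # precipitation series (mm)
    storage, q = linear_reservoir(storage, p)
    hydrograph.append(round(q, 2))
print(hydrograph)   # runoff rises with the storm, then recedes
```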

  5. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Science.gov (United States)

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
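    The event-interval notion can be sketched with a simple amplitude threshold standing in for DETECT's trained model (the threshold and signal below are illustrative):

```python
# Sketch: find (start, end) index intervals where |signal| exceeds a
# threshold; a fixed threshold stands in for DETECT's trained classifier.
def detect_events(signal, threshold=1.0):
    """Return (start, end) pairs with end exclusive."""
    events, start = [], None
    for i, v in enumerate(signal):
        if abs(v) > threshold:
            if start is None:
                start = i          # event begins
        elif start is not None:
            events.append((start, i))
            start = None           # event ended at the previous sample
    if start is not None:          # event still open at end of record
        events.append((start, len(signal)))
    return events

sig = [0.1, 0.2, 1.5, 1.7, 0.3, -1.2, -1.4, -0.2, 0.0]
print(detect_events(sig))   # [(2, 4), (5, 7)]
```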

  6. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Directory of Open Access Journals (Sweden)

    Vernon Lawhern

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  7. National Institutes of Health Toolbox Emotion Battery for English- and Spanish-speaking adults: normative data and factor-based summary scores.

    Science.gov (United States)

    Babakhanyan, Ida; McKenna, Benjamin S; Casaletto, Kaitlin B; Nowinski, Cindy J; Heaton, Robert K

    2018-01-01

    The National Institutes of Health Toolbox Emotion Battery (NIHTB-EB) is a "common currency", computerized assessment developed to measure the full spectrum of emotional health. Though comprehensive, the NIHTB-EB's 17 scales may be unwieldy for users aiming to capture more global indices of emotional functioning. NIHTB-EB was administered to 1,036 English-speaking and 408 Spanish-speaking adults as a part of the NIH Toolbox norming project. We examined the factor structure of the NIHTB-EB in English- and Spanish-speaking adults and developed factor analysis-based summary scores. Census-weighted norms were presented for English speakers, and sample-weighted norms were presented for Spanish speakers. Exploratory factor analysis for both English- and Spanish-speaking cohorts resulted in the same 3-factor solution: 1) negative affect, 2) social satisfaction, and 3) psychological well-being. Confirmatory factor analysis supported similar factor structures for English- and Spanish-speaking cohorts. Model fit indices fell within the acceptable/good range, and our final solution was optimal compared to other solutions. Summary scores based upon the normative samples appear to be psychometrically supported and should be applied to clinical samples to further validate the factor structures and investigate rates of problematic emotions in medical and psychiatric populations.
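
The idea of reducing many scales to a few factor-based summary scores can be illustrated with a toy principal-components sketch in Python. The synthetic two-trait data and the use of PCA in place of a full exploratory/confirmatory factor analysis are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# two hypothetical latent traits (think "negative affect", "social satisfaction")
f1, f2 = rng.normal(size=(2, n))
# six observed scale scores, each loading mainly on one latent trait plus noise
scales = np.column_stack([
    f1 + 0.3*rng.normal(size=n), f1 + 0.3*rng.normal(size=n), f1 + 0.3*rng.normal(size=n),
    f2 + 0.3*rng.normal(size=n), f2 + 0.3*rng.normal(size=n), f2 + 0.3*rng.normal(size=n),
])

corr = np.corrcoef(scales, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # eigenvalues in ascending order
explained = eigvals[::-1] / eigvals.sum()      # variance share per component

# factor-based summary score: project standardized scales onto leading component
z = (scales - scales.mean(0)) / scales.std(0)
summary = z @ eigvecs[:, -1]
```

With a genuine two-factor structure, the two leading components absorb most of the variance, which is the kind of evidence used to justify a small set of summary scores.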

  8. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    Directory of Open Access Journals (Sweden)

    Rasmus Magnusson

    2017-06-01

Full Text Available Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODEs) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady-state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high-performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2-specific bindings and time-series data together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings. 
In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models
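
LASSIM's first step, fitting an ODE model of core-gene expression to data, can be caricatured in a few lines of Python. The one-gene production/degradation model, Euler integration, and grid search below stand in for LASSIM's actual multi-gene, PyGMO-based optimization and are purely illustrative.

```python
import numpy as np

def simulate_core(a, b, x0, t):
    """Euler integration of a one-gene core model dx/dt = a - b*x
    (constant production a, first-order degradation b)."""
    x = np.empty_like(t)
    x[0] = x0
    dt = t[1] - t[0]
    for i in range(1, len(t)):
        x[i] = x[i-1] + dt * (a - b * x[i-1])
    return x

t = np.linspace(0, 10, 101)
data = simulate_core(a=2.0, b=0.5, x0=0.0, t=t)   # synthetic "expression" time series

# core inference step: search the degradation rate b that best fits the data
bs = np.linspace(0.1, 1.0, 91)
sse = [np.sum((simulate_core(2.0, b, 0.0, t) - data) ** 2) for b in bs]
b_hat = bs[int(np.argmin(sse))]
```

The second LASSIM step would then fit each peripheral gene's parameters independently (and hence in parallel), holding the inferred core trajectories fixed.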

  9. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    Directory of Open Access Journals (Sweden)

    David M. Zalk

    2011-06-01

    Conclusion: The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  10. Numerical simulation of the planar extrudate swell of pseudoplastic and viscoelastic fluids with the streamfunction and the VOF methods

    DEFF Research Database (Denmark)

    Comminal, Raphaël; Pimenta, Francisco; Hattel, Jesper H.

    2018-01-01

    , as well as with numerical simulations performed with the open-source rheoTool toolbox in OpenFOAM®. While the simulations of the generalized Newtonian fluids achieved mesh independence for all the methods tested, the flow simulations of the viscoelastic fluids are more sensitive to mesh refinement...

  11. SU-E-J-253: The Radiomics Toolbox in the Computational Environment for Radiological Research (CERR)

    Energy Technology Data Exchange (ETDEWEB)

    Apte, A; Veeraraghavan, H; Oh, J; Kijewski, P; Deasy, J [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2015-06-15

Purpose: To present an open source and free platform to facilitate radiomics research — the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix-based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and data management are implemented in Matlab for ease of development and readability of code for a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open-source software under a GNU license. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; the analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
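
The first-order statistics and co-occurrence texture features mentioned above can be sketched in a few lines of Python. The directed, horizontal-pair gray-level co-occurrence matrix (GLCM) here is a simplification of the toolbox's actual feature set (which also covers zone-size matrices, shape features, and more offsets).

```python
import numpy as np

def first_order(img):
    """Basic first-order intensity statistics of a region of interest."""
    return {"mean": img.mean(), "std": img.std(), "min": img.min(), "max": img.max()}

def glcm(img, levels):
    """Gray-level co-occurrence matrix for horizontally adjacent pixel pairs,
    normalized to a joint probability (directed pairs, for simplicity)."""
    m = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):
            m[a, b] += 1
    return m / m.sum()

# tiny 4-level "image" standing in for a quantized ROI
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
stats = first_order(img)
P = glcm(img, levels=4)
# Haralick-style contrast: expected squared gray-level difference of neighbors
contrast = sum(P[i, j] * (i - j) ** 2 for i in range(4) for j in range(4))
```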

  12. Travel demand management : a toolbox of strategies to reduce single-occupant vehicle trips and increase alternate mode usage in Arizona.

    Science.gov (United States)

    2012-02-01

    The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The toolbox includes supporting research...

  13. Trillo NPP full scope replica simulator project: The last great NPP simulation challenge in Spain

    International Nuclear Information System (INIS)

    Rivero, N.; Abascal, A.

    2006-01-01

    In the year 2000, Trillo NPP (a Spanish PWR-KWU design nuclear power plant) and Tecnatom came to an agreement to develop a Trillo plant-specific simulator, having as its scope all the plant systems operated either from the main control room or from the emergency panels. Simulator operation is carried out both through a control room replica and through a graphical user interface, the latter based on plant schematics and a soft-panels concept. The Trillo simulator is to be primarily utilized as a pedagogical tool for training the Trillo operational staff. Because of the engineering grade of the mathematical models, it will also have additional uses, such as: - Operation engineering (POE validation, validation of new computerized operator support systems, etc.); - Emergency drills; - Plant design modification assessment. This project has become the largest simulation task Tecnatom has ever undertaken, being structured in three subprojects, namely: - Simulator manufacture; - Simulator acceptance; - Training material production. The most relevant technological innovations the project brings are: highest accuracy in the Nuclear Island models, an advanced configuration management system, an open software architecture, a new human-machine interface design, a latest-design I/O system and an Instructor Station with extended functionality. The Trillo simulator 'Ready for Training' event is due in September 2003, the Factory Acceptance Tests having started in autumn 2002. (author)

  14. The GeantV project: preparing the future of simulation

    International Nuclear Information System (INIS)

    Amadio, G; Bianchini, C; Iope, R L; Apostolakis, J; Brun, R; Carminati, F; Gheata, A; Nikitina, T; Novak, M; Pokorski, W; Shadura, O; Bandieramonte, M; Bhattacharyya, A; Mohanty, A; Seghal, R; Canal, Ph; Elvira, D; Lima, G; Duhem, L; De Fine Licht, J

    2015-01-01

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase, such as FCC, CLIC and ILC, as well as upgraded versions of the existing LHC detectors, will push the simulation requirements further. Since the increase in computing resources is not likely to keep pace with our needs, it is necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported to different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. A set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison. (paper)
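
The basket/SIMD idea, applying the same propagation step to many particles at once instead of one particle at a time, can be illustrated in Python with NumPy arrays standing in for vector lanes. This is only a conceptual sketch of why vectorized transport kernels pay off, not GeantV's C++ template kernels.

```python
import numpy as np

def step_scalar(pos, direc, step):
    """Propagate particles one at a time along straight tracks (Geant4-style loop)."""
    out = np.empty_like(pos)
    for i in range(len(pos)):
        out[i] = pos[i] + step[i] * direc[i]
    return out

def step_vector(pos, direc, step):
    """The same propagation kernel written in array form, so the work maps onto
    SIMD lanes / whole baskets of particles at once."""
    return pos + step[:, None] * direc

rng = np.random.default_rng(3)
pos = rng.normal(size=(1000, 3))              # particle positions
direc = rng.normal(size=(1000, 3))            # (unnormalized) directions
step = rng.uniform(0.1, 1.0, size=1000)       # step lengths to the next boundary
a = step_scalar(pos, direc, step)
b = step_vector(pos, direc, step)             # identical result, one kernel call
```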

  15. What's in Your Teaching Toolbox?

    Science.gov (United States)

    Mormer, Elaine

    2018-02-01

    Educators are faced with an array of tools available to enhance learning outcomes in the classroom and clinic. These tools range from those that are very simple to those that are sufficiently complex to require an investment in learning time. This article summarizes a collection of teaching tools, ordered by the time involved in learning proficient use. Simple tools described include specific online blogs providing support for faculty and student writing and a simple method to capture and download videos from YouTube for classroom use. More complex tools described include a Web-based application for custom-created animated videos and an interactive audience polling system. Readers are encouraged to reflect on the tools most appropriate for use in their own teaching toolbox by considering the requisite time to proficiency and suitability to address desired learner outcomes.

  16. HERMES: towards an integrated toolbox to characterize functional and effective brain connectivity.

    Science.gov (United States)

    Niso, Guiomar; Bruña, Ricardo; Pereda, Ernesto; Gutiérrez, Ricardo; Bajo, Ricardo; Maestú, Fernando; del-Pozo, Francisco

    2013-10-01

    The analysis of the interdependence between time series has become an important field of research in recent years, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, the introduction of concepts such as generalized and phase synchronization, and the application of information theory to time series analysis. In neurophysiology, different analytical tools stemming from these concepts have added to the 'traditional' set of linear methods, which includes the cross-correlation and the coherency function in the time and frequency domain, respectively, as well as more elaborate tools such as Granger causality. This increase in the number of approaches for tackling the existence of functional (FC) or effective connectivity (EC) between two (or among many) neural networks, along with the mathematical complexity of the corresponding time series analysis tools, makes it desirable to arrange them into a unified, easy-to-use software package. The goal is to allow neuroscientists, neurophysiologists and researchers from related fields to easily access and make use of these analysis methods from a single integrated toolbox. Here we present HERMES ( http://hermes.ctb.upm.es ), a toolbox for the Matlab® environment (The Mathworks, Inc), which is designed to study functional and effective brain connectivity from neurophysiological data such as multivariate EEG and/or MEG records. It also includes visualization tools and statistical methods to address the problem of multiple comparisons. We believe that this toolbox will be very helpful to all the researchers working in the emerging field of brain connectivity analysis.
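
One of the 'traditional' linear measures mentioned above, lagged cross-correlation between two channels, can be sketched as follows. The function name and the brute-force lag scan are illustrative only, not HERMES's implementation.

```python
import numpy as np

def max_lagged_correlation(x, y, max_lag):
    """Peak absolute normalized cross-correlation over lags -max_lag..max_lag."""
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:lag], y[-lag:]
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, abs(r))
    return best

# two "channels" carrying the same 1 Hz rhythm, the second delayed by 5 samples
t = np.arange(0, 10, 0.01)
x = np.sin(2 * np.pi * 1.0 * t)
y = np.sin(2 * np.pi * 1.0 * (t - 0.05))
fc = max_lagged_correlation(x, y, max_lag=10)
```

A high peak at a nonzero lag is the simplest hint of a (possibly directed) functional coupling; HERMES complements such measures with phase-synchronization, information-theoretic and Granger-type indices plus multiple-comparison statistics.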

  17. THE AGORA HIGH-RESOLUTION GALAXY SIMULATIONS COMPARISON PROJECT

    International Nuclear Information System (INIS)

    Kim, Ji-hoon; Conroy, Charlie; Goldbaum, Nathan J.; Krumholz, Mark R.; Abel, Tom; Agertz, Oscar; Gnedin, Nickolay Y.; Kravtsov, Andrey V.; Bryan, Greg L.; Ceverino, Daniel; Christensen, Charlotte; Hummels, Cameron B.; Dekel, Avishai; Guedes, Javiera; Hahn, Oliver; Hobbs, Alexander; Hopkins, Philip F.; Iannuzzi, Francesca; Keres, Dusan; Klypin, Anatoly

    2014-01-01

    We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ∼100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies with halo masses M_vir ≅ 10^10, 10^11, 10^12, and 10^13 M_☉ at z = 0 and two different ('violent' and 'quiescent') assembly histories. The numerical techniques and implementations used in this project include the smoothed particle hydrodynamics codes GADGET and GASOLINE, and the adaptive mesh refinement codes ART, ENZO, and RAMSES. The codes share common initial conditions and common astrophysics packages including UV background, metal-dependent radiative cooling, metal and energy yields of supernovae, and stellar initial mass function. These are described in detail in the present paper. Subgrid star formation and feedback prescriptions will be tuned to provide a realistic interstellar and circumgalactic medium using a non-cosmological disk galaxy simulation. Cosmological runs will be systematically compared with each other using a common analysis toolkit and validated against observations to verify that the solutions are robust—i.e., that the astrophysical assumptions are responsible for any success, rather than artifacts of particular implementations. The goals of the AGORA project are, broadly speaking, to raise the realism and predictive power of galaxy simulations and the understanding of the feedback processes that regulate galaxy 'metabolism'. The initial conditions for the AGORA galaxies as well as simulation outputs at various epochs will be made publicly available to the community. The proof-of-concept dark-matter-only test of the formation of a galactic halo with a z = 0 mass of M

  18. The Basic Radar Altimetry Toolbox for Sentinel 3 Users

    Science.gov (United States)

    Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme

    2013-04-01

    The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales). The latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats, the most common entry point being the Graphical User Interface (BratGui). This GUI is a front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. The BratDisplay (graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files - it is distributed with another ESA toolbox (GUT) as the visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to Saral and soon Sentinel-3), quick data visualization/export and simple computation on the data fields. BRAT can be used for importing data and having a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimeter Tutorial, which contains a strong introduction to

  19. Hanford tank waste operation simulator operational waste volume projection verification and validation procedure

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

    The Hanford Tank Waste Operation Simulator is tested to determine whether it can replace the FORTRAN-based Operational Waste Volume Projection computer simulation that has traditionally served to project double-shell tank utilization. Three test cases are used to compare the results of the two simulators; one incorporates the cleanup schedule of the Tri-Party Agreement
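
The simulator-comparison core of such a verification can be sketched as a simple tolerance check between the reference projection and the new simulator's output. The volume figures, variable names, and 5% tolerance below are hypothetical, chosen only to illustrate the pattern.

```python
def compare_projections(ref, new, rel_tol=0.05):
    """Return the indices of time steps where the new simulator deviates from
    the reference projection by more than rel_tol (fractional difference)."""
    mismatches = []
    for i, (r, n) in enumerate(zip(ref, new)):
        if abs(n - r) > rel_tol * abs(r):
            mismatches.append(i)
    return mismatches

# hypothetical tank-volume projections (kgal) from the two simulators
fortran_owvp = [1200, 1350, 1500, 1660, 1800]
hanford_twos = [1210, 1340, 1490, 1900, 1810]
bad = compare_projections(fortran_owvp, hanford_twos)   # flags step 3 only
```

Each flagged step would then be investigated to decide whether the discrepancy reflects a modeling difference, an input difference, or a defect.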

  20. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    Science.gov (United States)

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that are combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
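
The modular-workflow idea, chaining processing nodes so each module's output feeds the next module's input, can be caricatured in Python. The toy read-trimming modules below are invented for illustration and are unrelated to the actual KNIME nodes.

```python
def module(name, func):
    """Wrap a processing step so it can be chained like a workflow node."""
    def run(data):
        return func(data)
    run.name = name
    return run

def run_workflow(modules, data):
    """Execute modules in order, passing each output to the next input."""
    for m in modules:
        data = m(data)
    return data

# toy "reads" pipeline: trim trailing no-call bases, drop short reads, count
trim = module("trim", lambda reads: [r.rstrip("n") for r in reads])
flt = module("filter", lambda reads: [r for r in reads if len(r) >= 3])
count = module("count", lambda reads: len(reads))

n = run_workflow([trim, flt, count], ["ACGTnn", "ACn", "GGGA", "Tn"])
```

Standardizing module interfaces this way is what makes workflows easy to reorder, rerun, and share, which is the property KNIME4NGS builds on.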

  1. The Expanded Capabilities Of The Cementitious Barriers Partnership Software Toolbox Version 2.0 - 14331

    Energy Technology Data Exchange (ETDEWEB)

    Burns, Heather; Flach, Greg; Smith, Frank; Langton, Christine; Brown, Kevin; Kosson, David; Samson, Eric; Mallick, Pramod

    2014-01-10

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The CBP Software Toolbox – “Version 1.0” – was released early in FY2013 and was used to support DOE-EM performance assessments in evaluating various degradation mechanisms, including sulfate attack, carbonation and constituent leaching. The sulfate attack analysis predicted the extent of damage that sulfate ingress will have on concrete vaults over extended times (i.e., > 1000 years), and the carbonation analysis provided concrete degradation predictions from rebar corrosion. The new release “Version 2.0” includes upgraded carbonation software and a new software module to evaluate degradation due to chloride attack. Also included in the newer version is a dual-regime module allowing evaluation of contaminant release in two regimes, fractured and un-fractured. The integrated software package has also been upgraded with new plotting capabilities and many other features that increase the “user-friendliness” of the package. Experimental work has been performed to provide data to calibrate the models, improving the credibility of the analysis and reducing the uncertainty. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox produces and will continue to produce tangible benefits to the working DOE

  2. Rangeland Brush Estimation Toolbox (RaBET): An Approach for Evaluating Brush Management Conservation Efforts in Western Grazing Lands

    Science.gov (United States)

    Holifield Collins, C.; Kautz, M. A.; Skirvin, S. M.; Metz, L. J.

    2016-12-01

    There are over 180 million hectares of rangelands and grazed forests in the central and western United States. Because the loss of perennial grasses and the subsequent increased runoff and erosion can degrade the system, woody cover species cannot be allowed to proliferate unchecked. The USDA Natural Resources Conservation Service (NRCS) has allocated extensive resources to employ brush management (removal) as a conservation practice to control woody species encroachment. The Rangeland Conservation Effects Assessment Project (CEAP) has been tasked with determining how effective the practice has been; however, land managers lack a cost-effective means to conduct these assessments at the necessary scale. An ArcGIS toolbox for generating large-scale, Landsat-based spatial maps of woody cover on grazing lands in the western United States was developed through a collaboration with NRCS Rangeland-CEAP. The toolbox contains two main components of operation, image generation and temporal analysis, and utilizes simple interfaces requiring minimal user inputs. The image generation tool utilizes geographically specific algorithms, developed by combining moderate-resolution (30-m) Landsat imagery and high-resolution (1-m) National Agricultural Imagery Program (NAIP) aerial photography, to produce woody cover scenes at the Major Land Resource Area (MLRA) scale. The temporal analysis tool can be used on these scenes to assess treatment effectiveness and monitor woody cover reemergence. RaBET provides rangeland managers an operational, inexpensive decision support tool to aid in applying brush removal treatments and assessing their effectiveness.

  3. ReTrOS: a MATLAB toolbox for reconstructing transcriptional activity from gene and protein expression data.

    Science.gov (United States)

    Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel

    2017-06-26

    Given the development of high-throughput experimental techniques, an increasing number of whole genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, with appropriate formulation of prior information about a minimum of kinetic parameters, in particular rates of degradation, users are able to infer timings of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time series data and can be incorporated into further modelling approaches or in hypothesis generation.
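
The underlying differential equation model of transcription, translation and degradation can be sketched as a forward simulation. The parameter values and the simple on/off transcription profile below are illustrative assumptions; note that ReTrOS solves the inverse problem (inferring the transcription profile from expression data) rather than this forward simulation.

```python
import numpy as np

def simulate_reporter(tx, k_tl, d_m, d_p, t):
    """Euler simulation of a transcription/translation/degradation model:
       dM/dt = tx(t) - d_m*M   (transcription minus mRNA degradation)
       dP/dt = k_tl*M - d_p*P  (translation minus protein degradation)"""
    dt = t[1] - t[0]
    M = np.zeros_like(t)
    P = np.zeros_like(t)
    for i in range(1, len(t)):
        M[i] = M[i-1] + dt * (tx(t[i-1]) - d_m * M[i-1])
        P[i] = P[i-1] + dt * (k_tl * M[i-1] - d_p * P[i-1])
    return M, P

t = np.linspace(0, 20, 2001)
switch = lambda s: 1.0 if s < 10 else 0.0      # transcription switches off at t = 10
M, P = simulate_reporter(switch, k_tl=2.0, d_m=0.5, d_p=0.2, t=t)
```

The mRNA peaks at the switch-off time while the protein reporter peaks later; this lag is exactly why inferring the timing of transcriptional changes from reporter data requires deconvolving through such a model, with the degradation rates as key prior knowledge.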

  4. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    Science.gov (United States)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response to vertical tectonic loads emplaced on a thin elastic plate overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called the Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of the elastic plate subjected to point or line loads, and analytical solutions for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs, or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, enabling real-time comparison of the model fit with the observed data constraining the flexural deformation and gravity, and facilitating a rapid search for the best-fitting flexural model. TAFI is a useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
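
For the 2-D case, the deflection of a thin elastic plate under a line load has a classical closed-form solution that forward models of this kind build on: w(x) = (V0 α³ / 8D) e^(−|x|/α) (cos(|x|/α) + sin(|x|/α)), with flexural rigidity D and flexural parameter α = (4D / Δρ g)^(1/4). The Python sketch below evaluates it; the parameter values are illustrative, and this is not TAFI's code.

```python
import numpy as np

def line_load_flexure(x, V0, Te, E=70e9, nu=0.25, drho=2300.0, g=9.81):
    """Analytic deflection (m) of a thin elastic plate under a line load V0 (N/m).
    Te: elastic thickness (m); drho: density contrast across the plate (kg/m^3)."""
    D = E * Te**3 / (12 * (1 - nu**2))        # flexural rigidity
    alpha = (4 * D / (drho * g)) ** 0.25      # flexural parameter (~54 km here)
    xa = np.abs(x) / alpha
    return (V0 * alpha**3 / (8 * D)) * np.exp(-xa) * (np.cos(xa) + np.sin(xa))

x = np.linspace(-300e3, 300e3, 601)           # +/- 300 km profile
w = line_load_flexure(x, V0=1e12, Te=20e3)    # 20 km elastic thickness
```

The profile shows the expected maximum subsidence beneath the load and the flanking forebulge (negative deflection) beyond x ≈ 3πα/4, which is the kind of forward model one would compare against observed basin geometry.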

  5. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    Science.gov (United States)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, an algorithm for the automatic creation of masks for veterinary thermographic images is presented.
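
A minimal version of automatic mask creation, separating the warm subject from the cool background in a thermographic image, can be sketched as a threshold-and-multiply operation. The fixed-fraction threshold below is a simplifying assumption for illustration, not CVIPtools' actual algorithm.

```python
import numpy as np

def auto_mask(img, k=0.5):
    """Create a binary mask by thresholding at k times the image's dynamic
    range above the minimum (1 = subject, 0 = background)."""
    thresh = img.min() + k * (img.max() - img.min())
    return (img > thresh).astype(np.uint8)

# toy "thermographic" image: warm subject (high values) on a cool background
img = np.array([[10, 10, 12, 11],
                [10, 80, 85, 12],
                [11, 82, 90, 10],
                [12, 11, 10, 10]], float)
mask = auto_mask(img)
masked = img * mask        # background pixels zeroed out
```

In practice the thresholded mask would be cleaned up with morphological operations before being applied to the image.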

  6. Turbo-Satori: a neurofeedback and brain-computer interface toolbox for real-time functional near-infrared spectroscopy.

    Science.gov (United States)

    Lührs, Michael; Goebel, Rainer

    2017-10-01

    Turbo-Satori is a neurofeedback and brain-computer interface (BCI) toolbox for real-time functional near-infrared spectroscopy (fNIRS). It incorporates multiple pipelines from real-time preprocessing and analysis to neurofeedback and BCI applications. The toolbox is designed with a focus on usability, enabling fast setup and execution of real-time experiments. Turbo-Satori uses an incremental recursive least-squares procedure for real-time general linear model calculation, and support vector machine classifiers for advanced BCI applications. It communicates directly with common NIRx fNIRS hardware and was tested extensively, ensuring that the calculations can be performed in real time without a significant change in calculation times for all sampling intervals during ongoing experiments of up to 6 h of recording. Immediate access to advanced processing features also makes this toolbox suitable for students and nonexperts in the field of fNIRS data acquisition and processing. Flexible network interfaces allow third-party stimulus applications to access the processed data and calculated statistics in real time, so that this information can be easily incorporated into neurofeedback or BCI presentations.
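
The incremental recursive least-squares idea behind real-time GLM fitting can be sketched in Python: each new sample updates the coefficient estimate in constant time, so the model stays current without refitting the whole time series. This generic RLS update is illustrative and is not Turbo-Satori's actual code.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive least-squares step: update GLM coefficients theta and the
    inverse-covariance matrix P with a new design row x and observation y.
    lam is a forgetting factor (1.0 = ordinary least squares)."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)           # gain vector
    theta = theta + (k * (y - x.T @ theta)).ravel()
    P = (P - k @ x.T @ P) / lam
    return theta, P

rng = np.random.default_rng(2)
true_beta = np.array([2.0, -1.0])             # "ground truth" GLM coefficients
theta = np.zeros(2)
P = np.eye(2) * 1e3                           # large initial uncertainty
for _ in range(500):                          # stream of incoming samples
    x = rng.normal(size=2)
    y = x @ true_beta + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, x, y)
```

After a few hundred samples theta converges to the true coefficients, which is what makes per-sample neurofeedback statistics feasible.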

  7. Analysis of Time Delay Simulation in Networked Control System

    OpenAIRE

    Nyan Phyo Aung; Zaw Min Naing; Hla Myo Tun

    2016-01-01

    The paper presents a PD controller for Networked Control Systems (NCS) with delay. The major challenge in an NCS is the delay of data transmission across the communication network. A comparative performance analysis is carried out for different delays in the network medium. In this paper, simulation is carried out on an AC servo motor control system using a CAN bus as the communication network medium. The TrueTime toolbox of MATLAB is used for the simulation to analy...

  8. A Student Project to use Geant4 Simulations for a TMS-PET combination

    International Nuclear Information System (INIS)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-01-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  9. A Student Project to use Geant4 Simulations for a TMS-PET combination

    Science.gov (United States)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Rueda, A.; Solano Salinas, C. J.; Wahl, D.; Zamudio, A.

    2007-10-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  10. A model library for simulation and benchmarking of integrated urban wastewater systems

    DEFF Research Database (Denmark)

    Saagi, R.; Flores Alsina, Xavier; Kroll, J. S.

    2017-01-01

    This paper presents a freely distributed, open-source toolbox to predict the behaviour of urban wastewater systems (UWS). The proposed library is used to develop a system-wide Benchmark Simulation Model (BSM-UWS) for evaluating (local/global) control strategies in urban wastewater systems (UWS...

  11. Two Mechatronic Projects - an Agricultural Robot and a Flight Simulator Platform

    DEFF Research Database (Denmark)

    Sørensen, Torben; Fan, Zhun; Conrad, Finn

    2005-01-01

    and build in a one year Masters Thesis Project by two M.Sc. students (2400 hours in total). • The development of a portable platform for flight simulation has been initiated in a Mid-term project by two students (720 hours in total). In both of these projects the students started from scratch...

  12. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management offers CIOs, CTOs, software development managers and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk and gain insight into the complexities of the software development process. A novel methodology is introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw

  13. M3D project for simulation studies of plasmas

    International Nuclear Information System (INIS)

    Park, W.; Belova, E.V.; Fu, G.Y.; Sugiyama, L.E.

    1998-01-01

    The M3D (Multi-level 3D) project carries out simulation studies of plasmas of various regimes using multi-levels of physics, geometry, and mesh schemes in one code package. This paper and papers by Strauss, Sugiyama, and Belova in this workshop describe the project, and present examples of current applications. The currently available physics models of the M3D project are MHD, two-fluids, gyrokinetic hot particle/MHD hybrid, and gyrokinetic particle ion/two-fluid hybrid models. The code can be run with both structured and unstructured meshes

  14. Simulated Annealing Genetic Algorithm Based Schedule Risk Management of IT Outsourcing Project

    Directory of Open Access Journals (Sweden)

    Fuqiang Lu

    2017-01-01

    Full Text Available IT outsourcing is an effective way to enhance the core competitiveness of many enterprises. But the schedule risk of an IT outsourcing project may cause enormous economic loss to the enterprise. In this paper, Distributed Decision Making (DDM) theory and principal-agent theory are used to build a model for schedule risk management of IT outsourcing projects. In addition, a hybrid algorithm combining simulated annealing (SA) and genetic algorithm (GA) is designed, namely the simulated annealing genetic algorithm (SAGA). The effect of the proposed model on the schedule risk management problem is analyzed in a simulation experiment. Meanwhile, the simulation results for the three algorithms (GA, SA and SAGA) show that SAGA is superior to the other two in terms of stability and convergence. Consequently, this paper provides a scientific, quantitative proposal for decision makers who need to manage the schedule risk of IT outsourcing projects.
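A minimal sketch of such an SA/GA hybrid: a standard genetic loop in which a worse offspring replaces its parent only with the Metropolis probability exp(-delta/T), where T cools each generation. The encoding, operators and toy objective below are illustrative assumptions, not the paper's exact procedure or scheduling objective.

```python
import math
import random

def saga(fitness, n_genes, pop_size=30, generations=200,
         t0=1.0, cooling=0.97, seed=1):
    """Hybrid GA with simulated-annealing acceptance; minimises `fitness`
    over real-valued genes in [0, 1]."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    temp = t0
    best = min(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        for parent in pop:
            mate = rng.choice(pop)
            cut = rng.randrange(n_genes)
            child = parent[:cut] + mate[cut:]          # one-point crossover
            i = rng.randrange(n_genes)                 # point mutation
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            delta = fitness(child) - fitness(parent)
            # SA acceptance: always keep improvements, sometimes keep worse
            if delta <= 0 or rng.random() < math.exp(-delta / temp):
                nxt.append(child)
            else:
                nxt.append(parent)
        pop = nxt
        temp *= cooling                                # cooling schedule
        cand = min(pop, key=fitness)
        if fitness(cand) < fitness(best):
            best = cand
    return best
```

The SA-style acceptance is what gives the hybrid its escape route from local optima early on (high T) while behaving like a greedy GA near the end (low T).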

  15. Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom

    Science.gov (United States)

    Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.

    2013-01-01

    As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…

  16. Risk Assessment in Financial Feasibility of Tanker Project Using Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Muhammad Badrus Zaman

    2017-09-01

    Full Text Available No ship project is free from risk and uncertainty. An inappropriate risk assessment process can have long-term impacts, such as financial loss. Thus, risk and uncertainty analysis is a very important step in determining the financial feasibility of a project. This study analyzes the financial feasibility of a 17,500 LTDW tanker project. Risk and uncertainty are two distinct terms in this study: risk focuses on the operational risk that shipbuilding process nonconformities pose to shipowner finances, while uncertainty focuses on the variable costs that affect project cash flows. There are three funding scenarios in this study, where the percentage of funding from own capital versus bank loans is 100% : 0% in scenario 1, 75% : 25% in scenario 2, and 50% : 50% in scenario 3. The Monte Carlo simulation method was applied to simulate the acceptance criteria: net present value (NPV), internal rate of return (IRR), payback period (PP) and profitability index (PI). The simulation results show that the 17,500 LTDW tanker project is feasible under scenarios 1, 2 and 3, as the probability of meeting each acceptance criterion was greater than 50%. The charter rate is the most sensitive uncertainty among the project's financial feasibility parameters.
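The Monte Carlo step can be sketched as follows: sample the uncertain inputs, compute NPV per trial, and report the fraction of trials with positive NPV. All figures below (20 M$ capital cost, 15-year life, discount rate, triangular distributions for charter revenue and operating cost) are illustrative placeholders, not data from the paper.

```python
import random

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_feasibility(n_trials=10_000, discount=0.08, seed=42):
    """Estimate P(NPV > 0) for a hypothetical tanker project under
    uncertain charter revenue and operating cost."""
    rng = random.Random(seed)
    positives = 0
    for _ in range(n_trials):
        charter = rng.triangular(4.0, 8.0, 6.0)   # M$/yr (low, high, mode)
        opex = rng.triangular(2.0, 4.0, 3.0)      # M$/yr
        flows = [-20.0] + [charter - opex] * 15   # capex, then 15 years
        if npv(discount, flows) > 0.0:
            positives += 1
    return positives / n_trials
```

The same trial loop can collect IRR, payback period and profitability index per sample; NPV alone keeps the sketch short.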

  17. An Educational Model for Hands-On Hydrology Education

    Science.gov (United States)

    AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.

    2014-12-01

    This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis and uncertainty assessment, and to practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary, application-oriented learning environment that introduces hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practice, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics, including uncertainty analysis and ensemble simulation. Both models have been used in a class for in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
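The flavour of such a simplified conceptual model can be conveyed with a single-bucket water balance: precipitation fills the store, evapotranspiration empties it, overflow above capacity and a linear-reservoir baseflow form the runoff. This toy analogue and its parameter values are assumptions for illustration, not the HBV-EDU formulation.

```python
def bucket_model(precip, et_demand, capacity=100.0, k=0.1, s0=20.0):
    """Single-bucket conceptual runoff model (toy analogue of an
    HBV-style teaching model). Units are arbitrary (e.g. mm/step)."""
    s, runoff = s0, []
    for p, et in zip(precip, et_demand):
        s += p                          # precipitation input
        s -= min(et, s)                 # ET limited by available storage
        spill = max(0.0, s - capacity)  # overflow above capacity
        s -= spill
        base = k * s                    # linear-reservoir baseflow
        s -= base
        runoff.append(spill + base)
    return runoff
```

Even this one-bucket version already lets students explore calibration questions (what do `capacity` and `k` do to the hydrograph?) before moving to a multi-store model.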

  18. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation and state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates of invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  19. Interval Coded Scoring: a toolbox for interpretable scoring systems

    Directory of Open Access Journals (Sweden)

    Lieven Billiet

    2018-04-01

    Full Text Available Over the last decades, clinical decision support systems have been gaining importance. They help clinicians make effective use of the overload of available information to obtain correct diagnoses and appropriate treatments. However, their power often comes at the cost of a black-box model which cannot be interpreted easily. This interpretability is of paramount importance in a medical setting with regard to trust and (legal) responsibility. In contrast, existing medical scoring systems are easy to understand and use, but they are often a simplified rule-of-thumb summary of previous medical experience rather than a well-founded system based on available data. Interval Coded Scoring (ICS) connects these two approaches, exploiting the power of sparse optimization to derive scoring systems from training data. The presented toolbox interface makes this theory easily applicable to both small and large datasets. It contains two possible problem formulations, based on linear programming or the elastic net. Both allow the construction of a model for a binary classification problem and establish risk profiles that can be used for future diagnosis. All of this requires only a few lines of code. ICS differs from standard machine learning through its model, which consists of interpretable main effects and interactions. Furthermore, insertion of expert knowledge is possible because the training can be semi-automatic. This allows end users to make a trade-off between complexity and performance based on cross-validation results and expert knowledge. Additionally, the toolbox offers an accessible way to assess classification performance via accuracy and the ROC curve, whereas the calibration of the risk profile can be evaluated via a calibration curve. Finally, the colour-coded model visualization has particular appeal if one wants to apply ICS manually on new observations, as well as for validation by experts in the specific application domains. The validity and applicability ...

  20. Background simulation for the GENIUS project

    International Nuclear Information System (INIS)

    Ponkratenko, O.A.; Tretyak, V.I.; Zdesenko, Yu.G.

    1999-01-01

    The background simulations for the GENIUS experiment were performed with the help of the GEANT 3.21 package and the event generator DECAY 4. Contributions from cosmogenic activity produced in the Ge detectors and from their radioactive impurities, as well as from contamination of the liquid nitrogen and other materials, were calculated. External gamma and neutron backgrounds were also taken into consideration. The results of the calculations clearly show the feasibility of the GENIUS project, which can substantially promote the development of modern astroparticle physics.

  1. Integrated Budget Office Toolbox

    Science.gov (United States)

    Rushing, Douglas A.; Blakeley, Chris; Chapman, Gerry; Robertson, Bill; Horton, Allison; Besser, Thomas; McCarthy, Debbie

    2010-01-01

    The Integrated Budget Office Toolbox (IBOT) combines budgeting, resource allocation, organizational funding, and reporting features in an automated, integrated tool that provides data from a single source for Johnson Space Center (JSC) personnel. Using a common interface, concurrent users can utilize the data without compromising its integrity. IBOT tracks planning changes and updates throughout the year using both phasing and POP-related (program-operating-plan-related) budget information for the current year, and up to six years out. Separating lump-sum funds received from HQ (Headquarters) into separate labor, travel, procurement, Center G&A (general & administrative), and servicepool categories, IBOT creates a script that significantly reduces manual input time. IBOT also manages the movement of travel and procurement funds down to the organizational level and, using its integrated funds management feature, helps better track funding at lower levels. Third-party software is used to create integrated reports in IBOT that can be generated for plans, actuals, funds received, and other combinations of data that are currently maintained in the centralized format. Based on Microsoft SQL, IBOT incorporates generic budget processes, is transportable, and is economical to deploy and support.

  2. Projections of Future Precipitation Extremes Over Europe: A Multimodel Assessment of Climate Simulations

    Science.gov (United States)

    Rajczak, Jan; Schär, Christoph

    2017-10-01

    Projections of precipitation and its extremes over the European continent are analyzed in an extensive multimodel ensemble of 12 and 50 km resolution EURO-CORDEX Regional Climate Models (RCMs) forced by RCP2.6, RCP4.5, and RCP8.5 (Representative Concentration Pathway) aerosol and greenhouse gas emission scenarios. A systematic intercomparison with ENSEMBLES RCMs is carried out, such that in total information is provided for an unprecedentedly large data set of 100 RCM simulations. An evaluation finds very reasonable skill for the EURO-CORDEX models in simulating temporal and geographical variations of (mean and heavy) precipitation at both horizontal resolutions. Heavy and extreme precipitation events are projected to intensify across most of Europe throughout the whole year. All considered models agree on a distinct intensification of extremes by often more than +20% in winter and fall and over central and northern Europe. A reduction of rainy days and mean precipitation in summer is simulated by a large majority of models in the Mediterranean area, but intermodel spread between the simulations is large. In central Europe and France during summer, models project decreases in precipitation but more intense heavy and extreme rainfalls. Comparison to previous RCM projections from ENSEMBLES reveals consistency but slight differences in summer, where reductions in southern European precipitation are not as pronounced as previously projected. The projected changes of the European hydrological cycle may have substantial impact on environmental and anthropogenic systems. In particular, the simulations indicate a rising probability of summertime drought in southern Europe and more frequent and intense heavy rainfall across all of Europe.

  3. A microfluidic toolbox for the development of in-situ product removal strategies in biocatalysis

    DEFF Research Database (Denmark)

    Heintz, Søren; Mitic, Aleksandar; Ringborg, Rolf Hoffmeyer

    2016-01-01

    A microfluidic toolbox for accelerated development of biocatalytic processes has great potential. This is especially the case for the development of advanced biocatalytic process concepts, where reactors and product separation methods are closely linked together to intensify the process performan...

  4. Activity modes selection for project crashing through deterministic simulation

    Directory of Open Access Journals (Sweden)

    Ashok Mohanty

    2011-12-01

    Full Text Available Purpose: The time-cost trade-off problem addressed by CPM-based analytical approaches assumes unlimited resources and the existence of a continuous time-cost function. However, given the discrete nature of most resources, activities can often be crashed only stepwise. Activity crashing for a discrete time-cost function is also known as the activity modes selection problem in project management. This problem is known to be NP-hard. Sophisticated optimization techniques such as Dynamic Programming, Integer Programming, Genetic Algorithms and Ant Colony Optimization have been used to find efficient solutions to the activity modes selection problem. The paper presents a simple method that can provide efficient solutions to the activity modes selection problem for project crashing. Design/methodology/approach: A simulation-based method implemented on an electronic spreadsheet to determine activity modes for project crashing. The method is illustrated with the help of an example. Findings: The paper shows that a simple approach based on a simple heuristic and deterministic simulation can give good results comparable to sophisticated optimization techniques. Research limitations/implications: The simulation-based crashing method presented in this paper is developed to return satisfactory solutions, but not necessarily an optimal solution. Practical implications: The use of spreadsheets for solving Management Science and Operations Research problems makes the techniques more accessible to practitioners. Spreadsheets provide a natural interface for model building, are easy to use in terms of inputs, solutions and report generation, and allow users to perform what-if analysis. Originality/value: The paper presents the application of simulation implemented on a spreadsheet to determine an efficient solution to the discrete time-cost trade-off problem.
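Stepwise crashing with discrete modes can be sketched as a greedy loop: while the CPM duration exceeds the deadline, take the cheapest remaining crash step. This is a simple heuristic in the spirit of the paper's spreadsheet simulation, not the authors' exact procedure; for brevity it ignores criticality when choosing what to crash.

```python
def critical_duration(durations, preds):
    """Project duration via a forward-pass CPM longest path.
    Assumes the activity dict is already in topological order."""
    finish = {}
    for a in durations:
        start = max((finish[p] for p in preds[a]), default=0)
        finish[a] = start + durations[a]
    return max(finish.values())

def greedy_crash(modes, preds, deadline):
    """Greedy stepwise crashing. modes[a] is a list of
    (duration, cumulative_extra_cost) options, longest/cheapest first."""
    choice = {a: 0 for a in modes}     # index of the current mode
    extra_cost = 0
    while True:
        dur = {a: modes[a][choice[a]][0] for a in modes}
        project = critical_duration(dur, preds)
        if project <= deadline:
            return choice, extra_cost, project
        best = None
        for a in modes:
            if choice[a] + 1 >= len(modes[a]):
                continue               # no shorter mode left
            step = modes[a][choice[a] + 1][1] - modes[a][choice[a]][1]
            if best is None or step < best[1]:
                best = (a, step)
        if best is None:               # nothing left to crash
            return choice, extra_cost, project
        choice[best[0]] += 1
        extra_cost += best[1]
```

Because non-critical activities may get crashed needlessly, the result is satisfactory rather than optimal, mirroring the limitation the abstract itself states.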

  5. AN APPROACH TO THE QUALITY IMPROVEMENT OF A MASSIVE INVESTMENT PROJECT BY INTEGRATING ICT AND QMS

    Directory of Open Access Journals (Sweden)

    Tamara Gvozdenovic

    2007-12-01

    Full Text Available This work presents an approach to the quality improvement of an investment project through a change in the concept of project management. The build time of an investment project is a complex factor that needs special attention. It is well known that the PERT method has been applied to long-lasting investment projects, where a large time horizon brings about significant uncertainty of future situations. Microsoft Project 2002 and the MATLAB Neural Network Toolbox are the software tools used for solving the problem of investment project management.
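The PERT machinery mentioned here reduces to two standard formulas: the three-point expected duration (a + 4m + b)/6 with variance ((b - a)/6)^2 per activity, and a normal approximation along the critical path. A minimal sketch:

```python
from math import erf, sqrt

def pert_estimate(a, m, b):
    """PERT expected duration and variance from optimistic (a),
    most-likely (m) and pessimistic (b) estimates."""
    return (a + 4 * m + b) / 6.0, ((b - a) / 6.0) ** 2

def completion_probability(path_estimates, deadline):
    """P(critical-path duration <= deadline) under the usual normal
    approximation: sum the per-activity means and variances."""
    mean = sum(pert_estimate(*e)[0] for e in path_estimates)
    var = sum(pert_estimate(*e)[1] for e in path_estimates)
    z = (deadline - mean) / sqrt(var)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

For example, a two-activity path estimated at (2, 4, 6) and (1, 2, 3) has an expected duration of 6, so a deadline of exactly 6 yields a completion probability of 0.5.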

  6. The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses

    Science.gov (United States)

    Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano

    2017-01-01

    The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…

  7. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial databases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  8. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    Science.gov (United States)

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two-dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • the implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms; • a modular approach, along with lookup tables, that helps avoid the indeterminate results which may occur when attempting to evaluate the transform directly; • the prevention of unnecessary computation of already known transforms, thereby saving memory and processing time.
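The decomposition the abstract alludes to can be checked numerically in the radially symmetric case, where the 2D Fourier transform reduces to a zeroth-order Hankel transform. The NumPy sketch below is a numerical illustration of that identity, not the toolbox's symbolic implementation; it evaluates J0 from its integral form to stay self-contained.

```python
import numpy as np

def _trapz(y, dx):
    """Uniform-grid trapezoidal rule along the last axis."""
    return dx * (y.sum(axis=-1) - 0.5 * (y[..., 0] + y[..., -1]))

def bessel_j0(x, n=1000):
    """J0 via its integral form (1/pi) * int_0^pi cos(x sin t) dt."""
    t = np.linspace(0.0, np.pi, n)
    vals = np.cos(np.multiply.outer(x, np.sin(t)))
    return _trapz(vals, t[1] - t[0]) / np.pi

def hankel0(f, rho, r_max=10.0, n=1000):
    """Zeroth-order Hankel transform
    F(rho) = 2*pi * int_0^inf f(r) J0(rho*r) r dr,
    i.e. the 2D Fourier transform of a radially symmetric f(r)."""
    r = np.linspace(0.0, r_max, n)
    rho = np.atleast_1d(np.asarray(rho, dtype=float))
    kern = bessel_j0(np.multiply.outer(rho, r))
    return 2.0 * np.pi * _trapz(f(r) * r * kern, r[1] - r[0])
```

As a check, the 2D transform of exp(-r^2) is the known closed form pi * exp(-rho^2 / 4), which the quadrature reproduces to a few parts in a thousand.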

  9. Baseline process description for simulating plutonium oxide production for precalc project

    Energy Technology Data Exchange (ETDEWEB)

    Pike, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-26

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  10. Simulation of ODE/PDE models with MATLAB, OCTAVE and SCILAB scientific and engineering applications

    CERN Document Server

    Vande Wouwer, Alain; Vilas, Carlos

    2014-01-01

    Simulation of ODE/PDE Models with MATLAB®, OCTAVE and SCILAB shows the reader how to exploit a fuller array of numerical methods for the analysis of complex scientific and engineering systems than is conventionally employed. The book is dedicated to numerical simulation of distributed parameter systems described by mixed systems of algebraic equations, ordinary differential equations (ODEs) and partial differential equations (PDEs). Special attention is paid to the numerical method of lines (MOL), a popular approach to the solution of time-dependent PDEs, which proceeds in two basic steps: spatial discretization and time integration. Besides conventional finite-difference and -element techniques, more advanced spatial-approximation methods are examined in some detail, including nonoscillatory schemes and adaptive-grid approaches. A MOL toolbox has been developed within MATLAB®/OCTAVE/SCILAB. In addition to a set of spatial approximations and time integrators, this toolbox includes a collection of applicatio...
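The two MOL steps described above (spatial discretization, then time integration) can be sketched for the 1D heat equation. The example uses central differences and plain explicit Euler; the book's toolbox offers far more spatial approximations and time integrators, so this is only the minimal pattern.

```python
import numpy as np

def heat_mol(n=51, t_end=0.1, alpha=1.0):
    """Method of lines for u_t = alpha * u_xx on [0, 1], u = 0 at both ends.

    Step 1 (spatial discretization): central differences turn the PDE
    into a system of ODEs, one per interior grid node.
    Step 2 (time integration): explicit Euler with a step chosen inside
    the stability limit dt <= dx^2 / (2 * alpha).
    """
    x = np.linspace(0.0, 1.0, n)
    dx = x[1] - x[0]
    u = np.sin(np.pi * x)                      # initial condition
    dt = 0.4 * dx ** 2 / alpha
    steps = int(np.ceil(t_end / dt))
    h = t_end / steps
    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2
        u = u + h * alpha * lap                # boundaries stay at zero
    return x, u
```

Against the exact solution sin(pi*x) * exp(-pi^2 * alpha * t), this coarse scheme already agrees to about three decimal places, which is the usual starting point before swapping in the stiffer-friendly integrators the book discusses.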

  11. MMM: A toolbox for integrative structure modeling.

    Science.gov (United States)

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM does not only integrate various types of restraints, but also various existing modeling tools by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  12. Simulation of OSCM Concepts for HQ SACT

    Science.gov (United States)

    2007-06-01

    ... effective method for creating understanding, identifying problems and developing solutions. • Simulation of a goal-driven organization is a cost-effective method to visualize some aspects of the problem space. • Toolbox: the team used Extend™, a COTS product from Imagine That!® (http... (Slide residue, model flows only: Nations flow model; OSCM ATARES flow, batching A/C & pallets; ISAF airbridge flow, flying and unbatching the A/C fleet; create resources; calculate flight ...)

  13. Vision and Displays for Military and Security Applications The Advanced Deployable Day/Night Simulation Project

    CERN Document Server

    Niall, Keith K

    2010-01-01

    Vision and Displays for Military and Security Applications presents recent advances in projection technologies and associated simulation technologies for military and security applications. Specifically, this book covers night vision simulation, semi-automated methods in photogrammetry, and the development and evaluation of high-resolution laser projection technologies for simulation. Topics covered include: advances in high-resolution projection, advances in image generation, geographic modeling, and LIDAR imaging, as well as human factors research for daylight simulation and for night vision devices. This title is ideal for optical engineers, simulator users and manufacturers, geomatics specialists, human factors researchers, and for engineers working with high-resolution display systems. It describes leading-edge methods for human factors research, and it describes the manufacture and evaluation of ultra-high resolution displays to provide unprecedented pixel density in visual simulation.

  14. A part toolbox to tune genetic expression in Bacillus subtilis

    Science.gov (United States)

    Guiziou, Sarah; Sauveplane, Vincent; Chang, Hung-Ju; Clerté, Caroline; Declerck, Nathalie; Jules, Matthieu; Bonnet, Jerome

    2016-01-01

    Libraries of well-characterised components regulating gene expression levels are essential to many synthetic biology applications. While widely available for the Gram-negative model bacterium Escherichia coli, such libraries are lacking for the Gram-positive model Bacillus subtilis, a key organism for basic research and biotechnological applications. Here, we engineered a genetic toolbox comprising libraries of promoters, Ribosome Binding Sites (RBS), and protein degradation tags to precisely tune gene expression in B. subtilis. We first designed a modular Expression Operating Unit (EOU) facilitating parts assembly and modifications and providing a standard genetic context for gene circuits implementation. We then selected native, constitutive promoters of B. subtilis and efficient RBS sequences from which we engineered three promoters and three RBS sequence libraries exhibiting ∼14 000-fold dynamic range in gene expression levels. We also designed a collection of SsrA proteolysis tags of variable strength. Finally, by using fluorescence fluctuation methods coupled with two-photon microscopy, we quantified the absolute concentration of GFP in a subset of strains from the library. Our complete promoters and RBS sequences library comprising over 135 constructs enables tuning of GFP concentration over five orders of magnitude, from 0.05 to 700 μM. This toolbox of regulatory components will support many research and engineering applications in B. subtilis. PMID:27402159

  15. A Statistical Toolbox For Mining And Modeling Spatial Data

    Directory of Open Access Journals (Sweden)

    D’Aubigny Gérard

    2016-12-01

    Full Text Available Most data mining projects in spatial economics start with an evaluation of a set of attribute variables on a sample of spatial entities, looking for the existence and strength of spatial autocorrelation based on Moran's and Geary's coefficients, the adequacy of which is rarely challenged, despite the fact that, when reporting on their properties, many users seem likely to make mistakes and to foster confusion. My paper begins with a critical appraisal of the classical definition and rationale of these indices. I argue that, while intuitively founded, they are plagued by an inconsistency in their conception. I then propose a principled small change leading to corrected spatial autocorrelation coefficients, which strongly simplifies their relationship and opens the way to an augmented toolbox of statistical methods for dimension reduction and data visualization, also useful for modeling purposes. A second section presents a formal framework, adapted from recent work in statistical learning, which gives theoretical support to our definition of corrected spatial autocorrelation coefficients. More specifically, the multivariate data mining methods presented here are easily implementable with existing (free) software and useful for exploiting the proposed corrections in spatial data analysis practice; from a mathematical point of view, their asymptotic behavior, already studied in a series of papers by Belkin & Niyogi, suggests that they possess robustness and a limited sensitivity to the Modifiable Areal Unit Problem (MAUP), valuable properties in exploratory spatial data analysis.
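For reference, the classical Moran's I that the paper starts from is I = (n / S0) * (z' W z) / (z' z), with z the centred attribute values, W the spatial weights and S0 the sum of all weights. A plain NumPy implementation of this classical form (the paper's corrected coefficients are not reproduced here):

```python
import numpy as np

def morans_i(values, w):
    """Classical Moran's I for attribute `values` and a spatial
    weights matrix `w` (n x n, zero diagonal)."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()                  # centre the attribute
    w = np.asarray(w, dtype=float)
    n = len(z)
    s0 = w.sum()                      # total weight
    return (n / s0) * (z @ w @ z) / (z @ z)
```

On a four-node chain with rook adjacency, a monotone attribute gives a positive I while a perfectly alternating attribute gives I = -1, the textbook signs for positive and negative spatial autocorrelation.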

  16. Dem Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox

    Science.gov (United States)

    Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.

    2017-10-01

    Digital elevation models (DEMs) are widely used raster data for terrain-related applications such as flood modelling, viewshed analysis, mining, land development, and engineering design projects. DEMs can be obtained through various methods, including topographic survey, LiDAR, photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one alternative method of producing DEMs, through the processing of images with photogrammetry software. Powerful commercially available photogrammetry software can produce high-accuracy DEMs, but this entails corresponding cost; although some of these packages offer free or demo trials, the trials are limited in their usable features and usage time. One alternative is free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas, such as mining or construction excavations, a relatively inexpensive, fast and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open pit excavation, and was extended with an algorithm converting the generated point cloud data into a usable DEM.
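
    The point-cloud-to-DEM conversion described above amounts to rasterising scattered (x, y, z) points onto a grid. A minimal sketch of such a gridding step, averaging elevations per cell (a hypothetical helper, not the actual PPT extension):

```python
import math

def point_cloud_to_dem(points, cell_size):
    """Grid (x, y, z) points into a DEM by averaging z per cell.
    Returns {(col, row): mean_elevation}; cells with no points are absent."""
    sums, counts = {}, {}
    for x, y, z in points:
        key = (math.floor(x / cell_size), math.floor(y / cell_size))
        sums[key] = sums.get(key, 0.0) + z
        counts[key] = counts.get(key, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}

# two points fall in cell (0, 0), one in cell (1, 0)
pts = [(0.2, 0.3, 10.0), (0.8, 0.1, 12.0), (1.5, 0.5, 20.0)]
dem = point_cloud_to_dem(pts, 1.0)
print(dem)  # {(0, 0): 11.0, (1, 0): 20.0}
```

In practice a follow-up step would interpolate the empty cells, e.g. by inverse-distance weighting from neighbouring occupied cells.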

  17. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    Full Text Available In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides them with a knowledge portal on the one hand, and a system to store their current work on the other. The knowledge portal helps the designer to find the most appropriate sites, experts, tools etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionalities that support design engineering work. One of these functionalities could be to help the designer reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering, and design engineers themselves, to find each other or their work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet been developed to a commercial level could be linked to by C-DET; in this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of future functionalities of C-DET such as balanced comprehension.

  18. 15 MW Hardware-in-the-Loop Grid Simulation Project

    Energy Technology Data Exchange (ETDEWEB)

    Rigas, Nikolaos [Clemson Univ., SC (United States); Fox, John Curtiss [Clemson Univ., SC (United States); Collins, Randy [Clemson Univ., SC (United States); Tuten, James [Clemson Univ., SC (United States); Salem, Thomas [Clemson Univ., SC (United States); McKinney, Mark [Clemson Univ., SC (United States); Hadidi, Ramtin [Clemson Univ., SC (United States); Gislason, Benjamin [Clemson Univ., SC (United States); Boessneck, Eric [Clemson Univ., SC (United States); Leonard, Jesse [Clemson Univ., SC (United States)

    2014-10-31

    The goal of the 15 MW Hardware-in-the-Loop (HIL) Grid Simulator project was to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for multi-megawatt devices, through a 'shared facility' model open to all innovators, to promote the rapid introduction of new technology in the energy market and lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing, and several leading-edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units, capable of providing up to 20 MW of defined power to the research grid. The project has also developed a one-of-a-kind solution for fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge in developing a control system capable of communicating in real time with several different pieces of equipment using different communication protocols. The eGRID team developed a custom fiber optical network that is based upon FPGA

  19. PLDAPS: A Hardware Architecture and Software Toolbox for Neurophysiology Requiring Complex Visual Stimuli and Online Behavioral Control.

    Science.gov (United States)

    Eastman, Kyler M; Huk, Alexander C

    2012-01-01

    Neurophysiological studies in awake, behaving primates (both human and non-human) have focused with increasing scrutiny on the temporal relationship between neural signals and behaviors. Consequently, laboratories are often faced with the problem of developing experimental equipment that can support data recording with high temporal precision and also be flexible enough to accommodate a wide variety of experimental paradigms. To this end, we have developed a MATLAB toolbox that integrates several modern pieces of equipment, but still grants experimenters the flexibility of a high-level programming language. Our toolbox takes advantage of three popular and powerful technologies: the Plexon apparatus for neurophysiological recordings (Plexon, Inc., Dallas, TX, USA), a Datapixx peripheral (Vpixx Technologies, Saint-Bruno, QC, Canada) for control of analog, digital, and video input-output signals, and the Psychtoolbox MATLAB toolbox for stimulus generation (Brainard, 1997; Pelli, 1997; Kleiner et al., 2007). The PLDAPS ("Platypus") system is designed to support the study of the visual systems of awake, behaving primates during multi-electrode neurophysiological recordings, but can be easily applied to other related domains. Despite its wide range of capabilities and support for cutting-edge video displays and neural recording systems, the PLDAPS system is simple enough for someone with basic MATLAB programming skills to design their own experiments.

  20. COMETS2: An advanced MATLAB toolbox for the numerical analysis of electric fields generated by transcranial direct current stimulation.

    Science.gov (United States)

    Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan

    2017-02-01

    Since there is no way to measure the electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate rectangular pad electrodes of any size at any position on the scalp surface. To reduce the large computational burden when repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of electrode size and displacement on the results of the electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time, and proposed a new technique to enhance computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS. Copyright © 2016. Published by Elsevier B.V.
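
    COMETS2 itself is a MATLAB finite-element toolbox; the stiffness-matrix assembly and solve that such solvers repeat for each electrode configuration can be illustrated with a toy 1D Python analogue (an illustration of the method only, unrelated to the COMETS2 code):

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting on copies of A, b."""
    n = len(b)
    A = [row[:] for row in A]; b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))  # pivot row
        A[k], A[p] = A[p], A[k]; b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def assemble_stiffness(n_elem, sigma, h):
    """1D FEM stiffness matrix for -d/dx(sigma dV/dx) = 0 on equal elements
    of length h with conductivity sigma (linear shape functions)."""
    n = n_elem + 1
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elem):
        k = sigma / h
        K[e][e] += k; K[e + 1][e + 1] += k
        K[e][e + 1] -= k; K[e + 1][e] -= k
    return K

# Dirichlet boundary conditions: V = 1 at the "anode" node, V = 0 at the "cathode"
n_elem, sigma, h = 4, 0.33, 0.25
n = n_elem + 1
K = assemble_stiffness(n_elem, sigma, h)
for j in range(n):
    K[0][j] = 0.0; K[n - 1][j] = 0.0  # replace boundary rows
K[0][0] = 1.0; K[n - 1][n - 1] = 1.0
b = [1.0] + [0.0] * (n - 1)
V = solve_linear(K, b)
print(V)  # linear potential drop from 1.0 to 0.0
```

The efficiency idea in the abstract is that only the entries touched by the electrodes change between configurations, so the bulk of this assembly can be reused.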

  1. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often, competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse and uncertain, and are frequently only available in the form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation and state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
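
    The guaranteed-invalidation idea can be illustrated with a one-parameter toy model: intersect the parameter intervals consistent with each uncertain measurement, and an empty intersection is a certificate of invalidity. A sketch of this principle (not ADMIT's actual relaxation-based machinery):

```python
def invalidate_linear_model(data, a_bounds):
    """Guaranteed invalidation sketch for the model y = a*x.
    Each datum is (x, (y_lo, y_hi)), an interval-valued measurement.
    Intersects the admissible interval for `a` across all data; an empty
    intersection certifies that no parameter explains the data."""
    a_lo, a_hi = a_bounds
    for x, (y_lo, y_hi) in data:
        if x == 0:
            if y_lo > 0 or y_hi < 0:
                return None  # no slope can give a nonzero output at x = 0
            continue
        lo, hi = sorted((y_lo / x, y_hi / x))
        a_lo, a_hi = max(a_lo, lo), min(a_hi, hi)
        if a_lo > a_hi:
            return None  # certificate of invalidity
    return (a_lo, a_hi)  # guaranteed outer bound on consistent parameters

consistent = invalidate_linear_model([(1, (1.8, 2.2)), (2, (3.9, 4.3))], (0, 10))
print(consistent)  # (1.95, 2.15): model not invalidated
bad = invalidate_linear_model([(1, (1.8, 2.2)), (2, (5.0, 6.0))], (0, 10))
print(bad)         # None: the data invalidate the model
```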

  2. Toolbox for Research and Exploration (TREX): Investigations of Fine-Grained Materials on Small Bodies

    Science.gov (United States)

    Domingue, D. L.; Allain, J.-P.; Banks, M.; Christoffersen, R.; Cintala, M.; Clark, R.; Cloutis, E.; Graps, A.; Hendrix, A. R.; Hsieh, H.

    2018-01-01

    The Toolbox for Research and Exploration (TREX) is a NASA SSERVI (Solar System Exploration Research Virtual Institute) node. TREX (trex.psi.edu) aims to decrease risk to future missions, specifically to the Moon, the Martian moons, and near-Earth asteroids, by improving mission success and assuring the safety of astronauts, their instruments, and spacecraft. TREX studies focus on the characteristics of the fine grains that cover the surfaces of these target bodies: their spectral characteristics and the potential resources (such as H2O) they may harbor. TREX studies are organized into four Themes (Laboratory Studies, Moon Studies, Small-Bodies Studies, and Field Work). In this presentation, we focus on the work targeted by the Small-Bodies Theme, which delves into several topics, many of which overlap or are synergistic with the other TREX Themes. The main topics include photometry, spectral modeling, laboratory simulations of space weathering processes relevant to asteroids, the assembly of an asteroid regolith database, the dichotomy between nuclear and reflectance spectroscopy, and the dynamical evolution of asteroids and the implications for the retention of volatiles.

  3. Linking Six Sigma to simulation: a new roadmap to improve the quality of patient care.

    Science.gov (United States)

    Celano, Giovanni; Costa, Antonio; Fichera, Sergio; Tringali, Giuseppe

    2012-01-01

    Improving the quality of patient care is a challenge that calls for a multidisciplinary approach, embedding a broad spectrum of knowledge and involving healthcare professionals from diverse backgrounds. The purpose of this paper is to present an innovative approach that implements discrete-event simulation (DES) as a decision-supporting tool in the management of Six Sigma quality improvement projects. A roadmap is designed to assist quality practitioners and health care professionals in the design and successful implementation of simulation models within the define-measure-analyse-design-verify (DMADV) or define-measure-analyse-improve-control (DMAIC) Six Sigma procedures. A case regarding the reorganisation of the flow of emergency patients affected by vertigo symptoms was developed in a large town hospital as a preliminary test of the roadmap. The positive feedback from professionals carrying out the project looks promising and encourages further roadmap testing in other clinical settings. The roadmap is a structured procedure that people involved in quality improvement can implement to manage projects based on the analysis and comparison of alternative scenarios. The role of Six Sigma philosophy in improvement of the quality of healthcare services is recognised both by researchers and by quality practitioners; discrete-event simulation models are commonly used to improve the key performance measures of patient care delivery. The two approaches are seldom referenced and implemented together; however, they could be successfully integrated to carry out quality improvement programs. This paper proposes an innovative approach to bridge the gap and enrich the Six Sigma toolbox of quality improvement procedures with DES.
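
    The DES side of such a roadmap can be as simple as an event-driven queue model for comparing staffing scenarios. A deterministic sketch (illustrative only, not the paper's DMAIC case study):

```python
import heapq

def simulate_clinic(arrivals, service_time, n_servers=1):
    """Minimal discrete-event simulation of a clinic queue: deterministic
    service times, FIFO discipline. Returns each patient's waiting time
    between arrival and the start of service."""
    free_at = [0.0] * n_servers   # times at which each server next becomes idle
    heapq.heapify(free_at)
    waits = []
    for t in arrivals:
        t_free = heapq.heappop(free_at)   # earliest-available server
        start = max(t, t_free)
        waits.append(start - t)
        heapq.heappush(free_at, start + service_time)
    return waits

# five patients arriving every 5 min, 12-min examinations, one doctor
waits = simulate_clinic([0, 5, 10, 15, 20], service_time=12)
print(waits)  # [0, 7, 14, 21, 28] -> the queue grows without bound
```

Re-running with `n_servers=2` shows the queue stabilising, which is exactly the scenario comparison a Six Sigma analyse phase would feed on.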

  4. The BRAT and GUT Couple: Broadview Radar Altimetry and GOCE User Toolboxes

    Science.gov (United States)

    Benveniste, J.; Restano, M.; Ambrózio, A.

    2017-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including Sentinel-3A L1 and L2 products. A tutorial is included providing plenty of use cases. BRAT's next release (4.2.0) is planned for October 2017. Based on the community feedback, the front-end has been further improved and simplified whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain desired data bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields, that can be saved for future uses, either by using embedded formulas including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and the analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as for solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during GRACE and GOCE missions. In the current 3.1 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, the anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Packaged with GUT is also GUT's Variance-Covariance Matrix tool (VCM). BRAT and GUT toolboxes can be freely

  5. Evidence for a common toolbox based on necrotrophy in a fungal lineage spanning necrotrophs, biotrophs, endophytes, host generalists and specialists.

    Directory of Open Access Journals (Sweden)

    Marion Andrew

    Full Text Available The Sclerotiniaceae (Ascomycotina, Leotiomycetes) is a relatively recently evolved lineage of necrotrophic host generalists, and necrotrophic or biotrophic host specialists, some latent or symptomless. We hypothesized that they inherited a basic toolbox of genes for plant symbiosis from their common ancestor. Maintenance and evolutionary diversification of symbiosis could require selection on toolbox genes or on the timing and magnitude of gene expression. The genes studied were chosen because their products have been previously investigated as pathogenicity factors in the Sclerotiniaceae. They encode proteins associated with cell wall degradation: acid protease 1 (acp1), aspartyl protease (asps), and polygalacturonases (pg1, pg3, pg5, pg6); and with the oxalic acid (OA) pathway: a zinc finger transcription factor (pac1) and oxaloacetate acetylhydrolase (oah), the catalyst in OA production, essential for full symptom production in Sclerotinia sclerotiorum. Site-specific likelihood analyses provided evidence for purifying selection in all 8 pathogenicity-related genes. Consistent with an evolutionary arms race model, positive selection was detected in 5 of 8 genes. Only generalists produced large, proliferating disease lesions on excised Arabidopsis thaliana leaves and oxalic acid by 72 hours in vitro. In planta expression of oah was 10-300 times greater among the necrotrophic host generalists than the necrotrophic and biotrophic host specialists; pac1 was not differentially expressed. The ability to amplify 6/8 pathogenicity-related genes and produce oxalic acid in all genera is consistent with the common toolbox hypothesis for this gene sample. That our data did not distinguish biotrophs from necrotrophs is consistent with (1) a common toolbox based on necrotrophy and (2) the most conservative interpretation of the 3-locus housekeeping gene phylogeny: a baseline of necrotrophy from which forms of biotrophy emerged at least twice.
Early oah overexpression likely expands the

  6. CERN Summer Student Project Report – Simulation of the Micromegas Detector

    CERN Document Server

    Soares Ferreira Nunes Teixeira, Sofia Luisa

    2015-01-01

    My project during the Summer Student Programme at CERN consisted of simulations of the Micromegas (MM) detectors, in order to test and characterize them in the presence of contamination of the gas mixture by air. The MM detectors were chosen for the upcoming upgrade of the ATLAS detector. The motivation for this project and the results obtained are presented here. Moreover, the work that should be carried out after this programme, as a continuation of this project, is also outlined. To conclude, final considerations about the project are presented.

  7. Toolbox for the design of LiNbO3-based passive and active integrated quantum circuits

    Science.gov (United States)

    Sharapova, P. R.; Luo, K. H.; Herrmann, H.; Reichelt, M.; Meier, T.; Silberhorn, C.

    2017-12-01

    We present and discuss perspectives of current developments on advanced quantum optical circuits monolithically integrated in the lithium niobate platform. A set of basic components comprising photon pair sources based on parametric down-conversion (PDC), passive routing elements, and active electro-optically controllable switches and polarisation converters are building blocks of a toolbox which is the basis for a broad range of diverse quantum circuits. We review the state of the art of these components and provide models that properly describe their performance in quantum circuits. As an example of applications of these models, we discuss design issues for a circuit providing on-chip two-photon interference. The circuit comprises a PDC section for photon pair generation followed by an actively controllable modified Mach-Zehnder structure for observing Hong-Ou-Mandel interference. The performance of such a chip is simulated theoretically, taking even imperfections of the properties of the individual components into account.

  8. Angra 1 nuclear power plant full scope simulator development project

    Energy Technology Data Exchange (ETDEWEB)

    Selvatici, Edmundo; Castanheira, Luiz Carlos C.; Silva Junior, Nilo Garcia da, E-mail: edsel@eletronuclear.gov.br, E-mail: lccast@eletronuclear.gov.br, E-mail: nilogar@eletronuclear.gov.br [Eletrobras Termonuclear S.A. (SCO/ELETRONUCLEAR), Angra dos Reis, RJ (Brazil). Superintendencia de Coordenacao da Operacao; Zazo, Francisco Javier Lopez; Ruiz, Jose Antonio, E-mail: jlopez@tecnatom.es, E-mail: jaruiz@tecnatom.es [Tecnatom S.A., San Sebastian de los Reyes, Madrid (Spain)

    2015-07-01

    Specific full scope simulators are an essential tool for training NPP control room operators, both in the formation phase and for maintaining their qualifications. In recent years, the availability of a plant-specific simulator has also become a regulatory requirement for nuclear power plant operation. By providing real-time practical training for the operators, the use of a simulator improves operator performance, reducing the number of unplanned shutdowns and enabling more effective response to abnormal and emergency operating conditions. It can also be used, among other purposes, to validate procedures, test proposed plant modifications, perform engineering studies and provide operation training for the technical support staff of the plant. The NPP site, in Angra dos Reis-RJ, Brazil, comprises two units in operation, Unit 1, 640 MWe, Westinghouse PWR, and Unit 2, 1350 MWe, KWU/Areva PWR, and one unit under construction, Unit 3, 1405 MWe, KWU/Areva PWR, of the same design as Angra 2. Angra 2 has had its full scope simulator from the beginning; this was not the case for Angra 1, which had to train its operators abroad for lack of a specific simulator. Eletronuclear participated in all phases of the project, from data supply to commissioning and validation. The Angra 1 full scope simulator encompasses more than 80 systems of the plant, including the primary system, reactor core and associated auxiliary systems, the secondary system and turbo-generator, as well as all the plant operational and safety I&C. The Angra 1 Main Control Room panels were reproduced in the simulator control room, as well as the remote shutdown panels located outside the control room. This paper describes the project for the development of the Angra 1 NPP Full Scope Simulator, supplied by Tecnatom S.A., from February 2012 to February 2015. (author)

  9. Angra 1 nuclear power plant full scope simulator development project

    International Nuclear Information System (INIS)

    Selvatici, Edmundo; Castanheira, Luiz Carlos C.; Silva Junior, Nilo Garcia da

    2015-01-01

    Specific full scope simulators are an essential tool for training NPP control room operators, both in the formation phase and for maintaining their qualifications. In recent years, the availability of a plant-specific simulator has also become a regulatory requirement for nuclear power plant operation. By providing real-time practical training for the operators, the use of a simulator improves operator performance, reducing the number of unplanned shutdowns and enabling more effective response to abnormal and emergency operating conditions. It can also be used, among other purposes, to validate procedures, test proposed plant modifications, perform engineering studies and provide operation training for the technical support staff of the plant. The NPP site, in Angra dos Reis-RJ, Brazil, comprises two units in operation, Unit 1, 640 MWe, Westinghouse PWR, and Unit 2, 1350 MWe, KWU/Areva PWR, and one unit under construction, Unit 3, 1405 MWe, KWU/Areva PWR, of the same design as Angra 2. Angra 2 has had its full scope simulator from the beginning; this was not the case for Angra 1, which had to train its operators abroad for lack of a specific simulator. Eletronuclear participated in all phases of the project, from data supply to commissioning and validation. The Angra 1 full scope simulator encompasses more than 80 systems of the plant, including the primary system, reactor core and associated auxiliary systems, the secondary system and turbo-generator, as well as all the plant operational and safety I&C. The Angra 1 Main Control Room panels were reproduced in the simulator control room, as well as the remote shutdown panels located outside the control room. This paper describes the project for the development of the Angra 1 NPP Full Scope Simulator, supplied by Tecnatom S.A., from February 2012 to February 2015. (author)

  10. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of the maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey was performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties, the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.
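
    A generic first-order leach-and-decay model reproduces the kinds of outputs screened above (total released fraction, maximum release rate). The form below is an assumed textbook sketch, not SOSIM's actual source term equations:

```python
import math

def released_fraction(t, k, lam):
    """Cumulative fraction of the initial inventory released under
    first-order leaching (rate k per year) competing with radioactive
    decay (constant lam per year): k/(k+lam) * (1 - exp(-(k+lam) t))."""
    return k / (k + lam) * (1.0 - math.exp(-(k + lam) * t))

def release_rate(t, k, lam):
    """Instantaneous release rate as a fraction of initial inventory
    per year; for this model it is maximal at t = 0."""
    return k * math.exp(-(k + lam) * t)

k = 1e-4                      # assumed slow leach rate, 1/yr
lam = math.log(2) / 30.0      # decay constant for a 30-yr half-life nuclide
print(release_rate(0.0, k, lam))         # maximum release rate = k at t = 0
print(released_fraction(1000.0, k, lam)) # approaches the cap k/(k + lam)
```

The cap k/(k + lam) shows why short-lived nuclides screen out: most of the inventory decays in place before it can be released.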

  11. FACET - a "Flexible Artifact Correction and Evaluation Toolbox" for concurrently recorded EEG/fMRI data.

    Science.gov (United States)

    Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S

    2013-11-09

    In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts in concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework, allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared across different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three modalities (data analysis, artifact correction, and evaluation and documentation of the results), but also offers an easily extendable framework for the development and evaluation of new approaches.
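
    The AAS method evaluated above builds an artifact template by averaging across artifact epochs and subtracts it from each epoch. A miniature pure-Python rendition of the principle (FACET itself is MATLAB; this is only the core idea, without the pre- and post-processing steps):

```python
def averaged_artifact_subtraction(signal, onsets, epoch_len):
    """Averaged Artifact Subtraction in miniature: average the signal across
    artifact epochs (assumed identical and time-locked to `onsets`) to form
    a template, then subtract the template from each epoch."""
    template = [0.0] * epoch_len
    for o in onsets:
        for i in range(epoch_len):
            template[i] += signal[o + i] / len(onsets)
    cleaned = signal[:]
    for o in onsets:
        for i in range(epoch_len):
            cleaned[o + i] -= template[i]
    return cleaned

# a repeating gradient artifact with no underlying EEG: residual should be ~0
artifact = [5.0, -3.0, 1.0]
eeg = artifact * 3                       # three consecutive artifact epochs
cleaned = averaged_artifact_subtraction(eeg, [0, 3, 6], 3)
print(cleaned)
```

Real gradient artifacts vary slightly between epochs, which is why FASTR adds slice-wise templates and residual removal on top of this averaging step.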

  12. An integrated GIS-MARKAL toolbox for designing a CO2 infrastructure network in the Netherlands

    NARCIS (Netherlands)

    van den Broek, M.A.; Brederode, E.; Ramirez, C.A.; Kramers, K.; van der Kuip, M.; Wildenborg, T.; Faaij, A.P.C.; Turkenburg, W.C.

    2009-01-01

    Large-scale implementation of carbon capture and storage needs a whole new infrastructure to transport and store CO2. Tools that can support planning and designing of such infrastructure require incorporation of both temporal and spatial aspects. Therefore, a toolbox that integrates ArcGIS, a

  13. Exploring International Investment through a Classroom Portfolio Simulation Project

    Science.gov (United States)

    Chen, Xiaoying; Yur-Austin, Jasmine

    2013-01-01

    A rapid integration of financial markets has prevailed during the last three decades. Investors are able to diversify investment beyond national markets to mitigate return volatility of a "pure domestic portfolio." This article discusses a simulation project through which students learn the role of international investment by managing…

  14. Modeling and Simulation of Project Management through the PMBOK® Standard Using Complex Networks

    Directory of Open Access Journals (Sweden)

    Luz Stella Cardona-Meza

    2017-01-01

    Full Text Available Discussion about project management, in both the academic literature and industry, is predominantly based on theories of control, many of which have been developed since the 1950s. However, issues arise when these ideas are applied unilaterally to all types of projects and in all contexts. In complex environments, management problems arise from assuming that results predicted at the start of a project can be sufficiently described and delivered as planned. Thus, once a project reaches a critical size, calendar, and level of ambiguity and interconnection, analysis centered on control does not function adequately. Projects that involve complex situations can be described as complex adaptive systems, consisting of multiple interdependent dynamic components, multiple feedback processes, nonlinear relations, and the management of hard data (process dynamics) and soft data (executive team dynamics). In this study, the dynamic structure of a project and its trajectories are simulated through a complex network, using inference processes. Finally, some numerical simulations are described, leading to a decision-making tool that identifies critical processes, thereby obtaining better project performance outcomes.
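
    Identifying critical processes in a project dependency network can be sketched as delay propagation over a directed acyclic graph. The toy model below (with invented process names) is illustrative only, not the paper's PMBOK-based network:

```python
def delay_impact(dependencies, delays):
    """Propagate durations through a project dependency DAG: an edge
    (u, v) means process v can only finish after u. Earliest finish of
    each process = own duration + longest upstream chain, computed by
    repeated relaxation (|E| passes suffice for a DAG)."""
    finish = dict(delays)                 # start from standalone durations
    for _ in dependencies:
        for u, v in dependencies:
            finish[v] = max(finish[v], finish[u] + delays[v])
    return finish

deps = [("scope", "plan"), ("plan", "execute"), ("risk", "execute")]
delays = {"scope": 2, "plan": 3, "risk": 1, "execute": 4}
finish = delay_impact(deps, delays)
print(finish)  # "execute" finishes at 2 + 3 + 4 = 9: the critical chain
```

Processes whose upstream chain determines the final finish time are the "critical processes" such a tool would flag.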

  15. CFD Simulations of Floating Point Absorber Wave Energy Converter Arrays Subjected to Regular Waves

    Directory of Open Access Journals (Sweden)

    Brecht Devolder

    2018-03-01

    Full Text Available In this paper we use the Computational Fluid Dynamics (CFD) toolbox OpenFOAM to perform numerical simulations of multiple floating point absorber wave energy converters (WECs) arranged in a geometrical array configuration inside a numerical wave tank (NWT). The two-phase Navier-Stokes fluid solver is coupled with a motion solver to simulate the hydrodynamic flow field around the WECs and the wave-induced rigid body heave motion of each WEC within the array. In this study, the numerical simulations of a single WEC unit are extended to multiple WECs and the complexity of modelling individual floating objects close to each other in an array layout is tackled. The NWT is validated for fluid-structure interaction (FSI) simulations by using experimental measurements for an array of two, five and up to nine heaving WECs subjected to regular waves. The validation is achieved by using mathematical models to include frictional forces observed during the experimental tests. For all the simulations presented, a good agreement is found between the numerical and the experimental results for the WECs' heave motions, the surge forces on the WECs and the perturbed wave field around the WECs. As a result, our coupled CFD–motion solver proves to be a suitable and accurate toolbox for the study of fluid-structure interaction problems of WEC arrays.
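
    For quick sanity checks against such CFD results, the wave-induced heave of a single point absorber is often reduced to a forced mass-spring-damper. The sketch below integrates that lumped model with classical RK4; all coefficients are assumed placeholders, not the paper's WEC parameters:

```python
import math

def simulate_heave(m, k, c, F0, omega, dt, n_steps):
    """Single-DOF heave of a point-absorber WEC under regular-wave forcing:
    m z'' + c z' + k z = F0 cos(omega t), with assumed hydrostatic
    stiffness k and linear damping c. Integrated with classical RK4."""
    z, v, t = 0.0, 0.0, 0.0
    history = []
    def acc(t, z, v):
        return (F0 * math.cos(omega * t) - c * v - k * z) / m
    for _ in range(n_steps):
        k1z, k1v = v, acc(t, z, v)
        k2z, k2v = v + 0.5*dt*k1v, acc(t + 0.5*dt, z + 0.5*dt*k1z, v + 0.5*dt*k1v)
        k3z, k3v = v + 0.5*dt*k2v, acc(t + 0.5*dt, z + 0.5*dt*k2z, v + 0.5*dt*k2v)
        k4z, k4v = v + dt*k3v, acc(t + dt, z + dt*k3z, v + dt*k3v)
        z += dt / 6 * (k1z + 2*k2z + 2*k3z + k4z)
        v += dt / 6 * (k1v + 2*k2v + 2*k3v + k4v)
        t += dt
        history.append(z)
    return history

zs = simulate_heave(m=1000.0, k=5000.0, c=300.0, F0=2000.0,
                    omega=1.0, dt=0.01, n_steps=5000)
# steady-state amplitude should approach F0 / sqrt((k - m w^2)^2 + (c w)^2)
print(max(zs))
```

Comparing such a lumped model's steady-state amplitude against the CFD heave response is a cheap first validation step before full FSI runs.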

  16. Toolbox for uncertainty; Introduction of adaptive heuristics as strategies for project decision making

    DEFF Research Database (Denmark)

    Stingl, Verena; Geraldi, Joana

    2017-01-01

This article presents adaptive heuristics as an alternative approach to navigating uncertainty in project decision-making. Adaptive heuristics are a class of simple decision strategies that have received only scant attention in project studies. Yet, they can thrive in contexts of high uncertainty...... they are ‘ecologically rational’. The model builds on the individual definitions of ecological rationality and organizes them according to two types of uncertainty (‘knowable’ and ‘unknowable’). Decision problems and heuristics are furthermore grouped by decision task (choice and judgement). The article discusses...... and limited information, which are typical of project decision contexts. This article develops a conceptual model that supports a systematic connection between adaptive heuristics and project decisions. Individual adaptive heuristics succeed only in specific decision environments, in which...

  17. Performance of Hispanics and Non-Hispanic Whites on the NIH Toolbox Cognition Battery: the roles of ethnicity and language backgrounds.

    Science.gov (United States)

    Flores, Ilse; Casaletto, Kaitlin B; Marquine, Maria J; Umlauf, Anya; Moore, David J; Mungas, Dan; Gershon, Richard C; Beaumont, Jennifer L; Heaton, Robert K

    2017-05-01

    This study examined the influence of Hispanic ethnicity and language/cultural background on performance on the NIH Toolbox Cognition Battery (NIHTB-CB). Participants included healthy, primarily English-speaking Hispanic (n = 93; Hispanic-English), primarily Spanish-speaking Hispanic (n = 93; Hispanic-Spanish), and English speaking Non-Hispanic white (n = 93; NH white) adults matched on age, sex, and education levels. All participants were in the NIH Toolbox national norming project and completed the Fluid and Crystallized components of the NIHTB-CB. T-scores (demographically-unadjusted) were developed based on the current sample and were used in analyses. Spanish-speaking Hispanics performed worse than English-speaking Hispanics and NH whites on demographically unadjusted NIHTB-CB Fluid Composite scores (ps differences on tests of executive inhibitory control (p = .001), processing speed (p = .003), and working memory (p language/cultural backgrounds in the Hispanic-Spanish group: better vocabularies and reading were predicted by being born outside the U.S., having Spanish as a first language, attending school outside the U.S., and speaking more Spanish at home. However, many of these same background factors were associated with worse Fluid Composites within the Hispanic-Spanish group. On tests of Fluid cognition, the Hispanic-Spanish group performed the poorest of all groups. Socio-demographic and linguistic factors were associated with those differences. These findings highlight the importance of considering language/cultural backgrounds when interpreting neuropsychological test performances. Importantly, after applying previously published NIHTB-CB norms with demographic corrections, these language/ethnic group differences are eliminated.
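As a side note on the norming approach, demographically unadjusted T-scores rescale raw scores to mean 50 and SD 10 in the reference sample. The sketch below uses made-up raw scores, not NIHTB-CB data:

```python
import statistics

# Illustrative T-score computation (hypothetical raw scores, not NIHTB-CB data):
# T = 50 + 10 * (x - mean) / sd, so the sample lands at mean 50, SD 10.
raw = [12, 15, 9, 20, 14, 11, 17, 13]
mu = statistics.mean(raw)
sd = statistics.stdev(raw)
t_scores = [50 + 10 * (x - mu) / sd for x in raw]
```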

  18. Modeling, Simulation and Position Control of 3DOF Articulated Manipulator

    Directory of Open Access Journals (Sweden)

    Hossein Sadegh Lafmejani

    2014-08-01

Full Text Available In this paper, the modeling, simulation and control of a 3-degrees-of-freedom (3DOF) articulated robotic manipulator are studied. First, we extracted the kinematics and dynamics equations of the manipulator using the Lagrange method. In order to validate the analytical model of the manipulator, we compared the model simulated in the Matlab simulation environment with the model simulated with the SimMechanics toolbox. A sample path was designed to analyze trajectory tracking. The system was linearized with feedback linearization, and a PID controller was then applied to track a reference trajectory. Finally, the control results were compared with those of a nonlinear PID controller.
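A minimal sketch of the control idea: after exact feedback linearization, each joint reduces to a double integrator, which a PID controller can drive to a reference. The gains and single-joint plant below are illustrative assumptions, not the paper's 3DOF model:

```python
# Illustrative sketch (not the paper's 3DOF model): PID tracking of a step
# reference on one feedback-linearized joint, which behaves as q'' = u.
def simulate_pid(kp=100.0, ki=50.0, kd=20.0, dt=1e-3, t_end=10.0):
    q, qd, integ = 0.0, 0.0, 0.0   # joint angle, velocity, error integral
    for i in range(int(t_end / dt)):
        ref = 1.0                  # step reference (rad)
        err = ref - q
        integ += err * dt
        u = kp * err + ki * integ - kd * qd  # PID law on the linearized plant
        qd += u * dt               # double-integrator dynamics q'' = u
        q += qd * dt
    return q

final = simulate_pid()  # should settle near the 1 rad reference
```

With these (assumed) gains the closed-loop poles are stable and the integral term removes steady-state error, which is the behavior the feedback-linearization-plus-PID scheme relies on.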

  19. A simulation study on the focal plane detector of the LAUE project

    Science.gov (United States)

    Khalil, M.; Frontera, F.; Caroli, E.; Virgilli, E.; Valsan, V.

    2015-06-01

The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long-focal-length (20 m or longer) Laue lens for gamma-ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot, drastically improving the signal-to-noise ratio as well as significantly reducing the required size of the detector. In this paper we present a Monte Carlo simulation study with MEGAlib to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. We then show simulations of the optimized detector, using the SILVACO semiconductor simulation toolkit, to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared: a low-density material (Silicon) and a high-density material (Germanium).
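The capacitance per channel can be estimated to first order with the parallel-plate formula C = ε0·εr·A/d before running full semiconductor simulations. The strip geometry below is a hypothetical example, not the LAUE project's optimized detector:

```python
# First-order parallel-plate estimate of detector capacitance per channel,
# C = eps0 * eps_r * A / d. Strip dimensions are illustrative assumptions,
# not the LAUE project's optimized geometry.
EPS0 = 8.854e-12               # vacuum permittivity (F/m)
eps_r_si, eps_r_ge = 11.7, 16.0  # relative permittivities of Si and Ge

area = 0.5e-3 * 10e-3          # 0.5 mm strip pitch x 10 mm strip length (m^2)
thickness = 2e-3               # 2 mm wafer (m)

c_si = EPS0 * eps_r_si * area / thickness  # per-channel capacitance, Silicon (F)
c_ge = EPS0 * eps_r_ge * area / thickness  # per-channel capacitance, Germanium (F)
```

Because Germanium's permittivity is higher, an identical geometry yields a higher channel capacitance, one of the trade-offs between the low-density and high-density options compared in the study.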

  20. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline

    2013-01-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely...
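The idea of drawing samples from the a posteriori probability density can be sketched with a minimal Metropolis sampler on a one-dimensional toy problem. This is a generic illustration, not SIPPI's Matlab API; SIPPI implements this machinery for realistic priors and forward models:

```python
import math
import random

# Generic illustration (not SIPPI's API): a Metropolis sampler drawing from a
# toy posterior proportional to exp(-0.5*((m-2)/0.5)**2), i.e. N(2, 0.5^2).
def metropolis(n_steps=20000, step=0.5, seed=1):
    random.seed(seed)

    def log_post(m):
        return -0.5 * ((m - 2.0) / 0.5) ** 2

    m, samples = 0.0, []
    for _ in range(n_steps):
        prop = m + random.gauss(0.0, step)        # random-walk proposal
        delta = log_post(prop) - log_post(m)
        if delta >= 0 or random.random() < math.exp(delta):
            m = prop                              # accept
        samples.append(m)
    return samples

samples = metropolis()
burn = samples[5000:]                             # discard burn-in
mean = sum(burn) / len(burn)                      # posterior mean estimate
```

After burn-in the chain's sample mean approaches the posterior mean; for real inverse problems the log-posterior would combine a prior model with a (possibly non-linear) forward model, which is what the toolbox provides.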

  1. Improving Faculty Perceptions of and Intent to Use Simulation: An Intervention Project

    Science.gov (United States)

    Tucker, Charles

    2013-01-01

    Human patient simulation is an innovative teaching strategy that can facilitate practice development and preparation for entry into today's healthcare environment for nursing students. Unfortunately, the use of human patient simulation has been limited due to the perceptions of nursing faculty members. This project sought to explore those…

  2. PredPsych: A toolbox for predictive machine learning-based approach in experimental psychology research.

    Science.gov (United States)

    Koul, Atesh; Becchio, Cristina; Cavallo, Andrea

    2017-12-12

Recent years have seen an increased interest in machine learning-based predictive methods for analyzing quantitative behavioral data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible implementation. The aim of the current work was to build an open-source R toolbox - "PredPsych" - that could make these methods readily available to all psychologists. PredPsych is a user-friendly R toolbox based on machine-learning predictive algorithms. In this paper, we present the framework of PredPsych via the analysis of a recently published multiple-subject motion capture dataset. In addition, we discuss examples of possible research questions that can be addressed with the machine-learning algorithms implemented in PredPsych but cannot be easily addressed with univariate statistical analysis. We anticipate that PredPsych will be of use to researchers with limited programming experience not only in the field of psychology, but also in that of clinical neuroscience, enabling computational assessment of putative bio-behavioral markers for both prognosis and diagnosis.
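The predictive (decoding) approach contrasted here with univariate analysis can be sketched in a few lines: train a classifier on part of the data and measure out-of-sample accuracy. The stdlib-Python nearest-centroid example below is a generic illustration on synthetic data, not PredPsych's R interface:

```python
import random

# Generic illustration (not PredPsych's R API): k-fold cross-validated
# accuracy of a nearest-centroid classifier on two synthetic groups.
random.seed(0)
data = [([random.gauss(0, 1), random.gauss(0, 1)], 0) for _ in range(50)] + \
       [([random.gauss(2, 1), random.gauss(2, 1)], 1) for _ in range(50)]
random.shuffle(data)

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(2)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

correct, k = 0, 5
fold = len(data) // k
for f in range(k):
    test = data[f * fold:(f + 1) * fold]                  # held-out fold
    train = data[:f * fold] + data[(f + 1) * fold:]
    cents = {c: centroid([x for x, y in train if y == c]) for c in (0, 1)}
    for x, y in test:
        pred = min(cents, key=lambda c: dist2(x, cents[c]))
        correct += (pred == y)
accuracy = correct / len(data)                            # out-of-sample accuracy
```

Cross-validated accuracy well above chance is the kind of multivariate evidence of group separability that a univariate test on a single feature may miss.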

  3. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Caroline Looms, Majken

    2013-01-01

    on the solution. The combined states of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods...

  4. Expanding access to rheumatology care: the rheumatology general practice toolbox.

    LENUS (Irish Health Repository)

    Conway, R

    2015-02-01

Management guidelines for many rheumatic diseases are published in specialty rheumatology literature but rarely in general medical journals. Musculoskeletal disorders comprise 14% of all consultations in primary care. Formal post-graduate training in rheumatology is limited or absent for many primary care practitioners. Primary care practitioners can be trained to effectively treat complex diseases and have expressed a preference for interactive educational courses. The Rheumatology General Practice (GP) Toolbox is an intensive one-day course designed to offer primary care practitioners up-to-date information on the latest diagnostic and treatment guidelines for seven common rheumatic diseases. The course structure involves a short lecture on each topic and workshops on arthrocentesis, joint injection and DXA interpretation. Participants evaluated their knowledge and educational experience before, during and after the course. Thirty-two primary care practitioners attended, with a median of 13 (IQR 6.5, 20) years' experience in their specialty. The median number of educational symposia attended in the previous 5 years was 10 (IQR 5, 22.5), with a median of 0 (IQR 0, 1) in rheumatology. All respondents agreed that the course format was appropriate. Numerical improvements were demonstrated in participants' confidence in diagnosing and managing all seven common rheumatologic conditions, with statistically significant improvements (p < 0.05) in 11 of the 14 aspects assessed. The Rheumatology Toolbox is an effective educational method for disseminating current knowledge in rheumatology to primary care physicians and improved participants' self-assessed competence in the diagnosis and management of common rheumatic diseases.

  5. Development of a harmonized risk mitigation toolbox dedicated to environmental risks of pesticides in farmland in Europe: outcome of the MAgPIE workshop

    Directory of Open Access Journals (Sweden)

    Alix, A.

    2015-11-01

    Full Text Available Risk mitigation measures are a key component in designing conditions of use of pesticides in crop protection. A 2-step workshop was organized under the auspices of SETAC and the European Commission and gathered risk assessors and risk managers of 21 European countries, industry, academia and agronomical advisors/extension services, in order to provide European regulatory authorities with a toolbox of risk mitigation measures designed to reduce environmental risks of pesticides used in agriculture, and thus contribute to a better harmonization within Europe in the area. The workshop gathered an inventory of the risk mitigation tools for pesticides being implemented or in development in European countries. The inventory was discussed in order to identify the most promising tools for a harmonized toolbox in the European area. The discussions concerned the level of confidence in the technical data on which the tools identified rely, possible regulatory hurdles, expectations as regards the implementation of these tools by farmers and links with risk assessment. Finally, this workshop was a first step towards a network gathering all stakeholders, i.e. experts from national authorities, research sector, industry and farmers, to share information and further develop this toolbox. This paper presents an outline of the content of the toolbox with an emphasis on spray drift reducing techniques, in line with the discussions ongoing in the SPISE workshop.

  6. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    Directory of Open Access Journals (Sweden)

    M. S. Mizielinski

    2014-08-01

Full Text Available The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985–2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km), as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present-climate simulations, a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single-year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Environmental Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves, and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  7. Contribution of Full-Scope Simulator Development Project to the Dissemination of Nuclear Knowledge within New-Build–Project Teams

    International Nuclear Information System (INIS)

    Gain, P.

    2016-01-01

Full text: In a context where few countries have recently carried out nuclear new-build projects, combined with a very strong need for generation renewal, there is a major stake in the training of the hundreds of engineers involved in the design and commissioning teams of this highly complex industrial facility. The simulator project, which provides the first opportunity to integrate and validate the whole of the design data, to check their coherence, the proper performance of the interface and conformity with the safety and performance requirements, allows a fast and effective rise in competence of all the resources involved in its development. In addition, the phased availability of the design data generally results in several phased versions of the simulator. Each can then be deployed in great numbers for training drills, which also contribute to sharing knowledge of the reference plant design in an optimal way. This context of broader use of these training modules, and of simulation in support of engineering activities, leads to their use by many teams with varied profiles; there too, simulation technologies are remarkably effective for sharing common and stable knowledge. (author)

  8. European Studies and Public Engagement: A Conceptual Toolbox

    Directory of Open Access Journals (Sweden)

    Andreas Müllerleile

    2014-11-01

Full Text Available This article examines public engagement strategies for academics working in the field of European Studies. Should academics engage with the public? What are the most effective outreach strategies? And what are the implications for universities and departments? The article argues that engaging with the public should be considered an integral part for academics working on topics that relate to the European Union or European politics. The article has a theoretical and a practical dimension. The first part of the paper deals with the nature of public engagement, explaining why it is an important issue and how it differs from the mainstream understanding of public engagement. The practical part of the paper presents the idea of building an online presence through which academics can engage with the public debate both during periods of low issue salience and high issue salience. The final section includes a toolbox

  9. Basic Radar Altimetry Toolbox: Tools and Tutorial to Use Cryosat Data

    Science.gov (United States)

    Benveniste, J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.; Niemeijer, S.

    2011-12-01

    Radar altimetry is very much a technique expanding its applications. Even If quite a lot of effort has been invested for oceanography users, the use of Altimetry data for cryosphere application, especially with the new ESA CryoSat-2 mission data is still somehow tedious for new Altimetry data products users. ESA and CNES therfore developed the Basic Radar Altimetry Toolbox a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason- 2, CryoSat, the future Saral missions and is ready for adaptation to Sentinel-3 products - to perform some processing, data editing and statistic, - and to visualize the results. It can be used at several levels/several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and a quick-look tool, with the graphical user interface As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007, and had been demonstrated during training courses and scientific meetings. About 2000 people downloaded it (Summer 2011), with many "newcomers" to altimetry among them

  10. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eye-tracking experiments

    NARCIS (Netherlands)

    Dalmaijer, E.S.; Mathôt, S.; van der Stigchel, S.

    2014-01-01

The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eye-tracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and

  11. The AGORA High-resolution Galaxy Simulations Comparison Project

    OpenAIRE

    Kim Ji-hoon; Abel Tom; Agertz Oscar; Bryan Greg L.; Ceverino Daniel; Christensen Charlotte; Conroy Charlie; Dekel Avishai; Gnedin Nickolay Y.; Goldbaum Nathan J.; Guedes Javiera; Hahn Oliver; Hobbs Alexander; Hopkins Philip F.; Hummels Cameron B.

    2014-01-01

    The Astrophysical Journal Supplement Series 210.1 (2014): 14 reproduced by permission of the AAS We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ∼100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle o...

  12. Assessing the health status of managed honeybee colonies (HEALTHY-B): a toolbox to facilitate harmonised data collection

    DEFF Research Database (Denmark)

    Nielsen, Søren Saxmose

    2016-01-01

    Tools are provided to assess the health status of managed honeybee colonies by facilitating further harmonisation of data collection and reporting, design of field surveys across the European Union (EU) and analysis of data on bee health. The toolbox is based on characteristics of a healthy managed...... is very important when assessing its health status, but tools are currently lacking that could be used at apiary level in field surveys across the EU. Data on ‘beekeeping management practices’ and ‘environmental drivers’ can be collected via questionnaires and available databases, respectively....... Integrating multiple attributes of honeybee health, for instance, via a Health Status Index, is required to support a holistic assessment. Examples are provided on how the toolbox could be used by different stakeholders. Continued interaction between the Member State organisations, the EU Reference Laboratory...

  13. Par@Graph - a parallel toolbox for the construction and analysis of large complex climate networks

    NARCIS (Netherlands)

    Tantet, A.J.J.

    2015-01-01

In this paper, we present Par@Graph, a software toolbox to reconstruct and analyze complex climate networks having a large number of nodes (up to at least 10^6) and edges (up to at least 10^12). The key innovation is an efficient set of parallel software tools designed to leverage the inherited hybrid
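A climate network of the kind Par@Graph builds links grid points whose time series are strongly correlated. The serial, stdlib-only sketch below illustrates the construction on synthetic data; Par@Graph's contribution is doing this in parallel at million-node scale:

```python
import math
import random

# Serial, stdlib-only illustration (not Par@Graph's parallel tools): build a
# climate-style network by linking node pairs whose series correlate strongly.
random.seed(3)

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

# Two clusters of nodes driven by two independent "climate modes" plus noise.
mode1 = [random.gauss(0, 1) for _ in range(200)]
mode2 = [random.gauss(0, 1) for _ in range(200)]
nodes = [[m + random.gauss(0, 0.3) for m in mode1] for _ in range(5)] + \
        [[m + random.gauss(0, 0.3) for m in mode2] for _ in range(5)]

# Edge wherever |correlation| exceeds the threshold.
edges = [(i, j) for i in range(len(nodes)) for j in range(i + 1, len(nodes))
         if abs(corr(nodes[i], nodes[j])) > 0.8]
```

With 10 nodes this pairwise scan is trivial; at 10^6 nodes the O(n^2) correlation step is what demands the parallel tools the paper describes.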

  14. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    Science.gov (United States)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data generating systems or processes.
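A minimal example of an early-warning algorithm of the kind ACCEPT benchmarks: an exponentially weighted moving average (EWMA) that raises an alarm when it drifts past a threshold. The detector, signal, and parameters are illustrative assumptions, not part of ACCEPT:

```python
import math

# Illustrative early-warning detector (not part of ACCEPT): an EWMA statistic
# that raises an alarm when it crosses a fixed threshold.
def ewma_alarm(series, alpha=0.2, threshold=1.5):
    ewma = series[0]
    for t, x in enumerate(series):
        ewma = alpha * x + (1 - alpha) * ewma  # exponential smoothing
        if abs(ewma) > threshold:
            return t                           # index of first alarm
    return None

# Nominal oscillation for t < 60, then a ramping fault from t = 60 onwards.
series = [0.3 * math.sin(0.5 * t) for t in range(60)] + \
         [0.3 * math.sin(0.5 * t) + 0.1 * (t - 60) for t in range(60, 100)]
alarm_at = ewma_alarm(series)
```

The smoothing suppresses alarms during the nominal phase but flags the fault shortly after its onset; comparing many such detectors on common data is exactly the benchmarking role ACCEPT plays.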

  15. SinCHet: a MATLAB toolbox for single cell heterogeneity analysis in cancer.

    Science.gov (United States)

    Li, Jiannong; Smalley, Inna; Schell, Michael J; Smalley, Keiran S M; Chen, Y Ann

    2017-09-15

Single-cell technologies allow characterization of transcriptomes and epigenomes for individual cells under different conditions and provide unprecedented resolution for researchers to investigate cellular heterogeneity in cancer. The SinCHet (SINgle Cell HETerogeneity) toolbox is developed in MATLAB and has a graphical user interface (GUI) for visualization and user interaction. It analyzes both continuous (e.g. mRNA expression) and binary omics data (e.g. discretized methylation data). The toolbox not only quantifies cellular heterogeneity using the Shannon Profile (SP) at different clonal resolutions but also detects heterogeneity differences using a D statistic between two populations, defined as the area under the Profile of Shannon Difference (PSD). This flexible tool provides a default clonal resolution using the change point of the PSD detected by a multivariate adaptive regression splines model; it also allows user-defined clonal resolutions for further investigation. This tool provides insights into emerging or disappearing clones between conditions, and enables the prioritization of biomarkers for follow-up experiments based on heterogeneity or marker differences between and/or within cell populations. The SinCHet software is freely available for non-profit academic use. The source code, example datasets, and the compiled package are available at http://labpages2.moffitt.org/chen/software/ . ann.chen@moffitt.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
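The Shannon Profile is built from the Shannon diversity index of clone proportions at a given clonal resolution. The sketch below computes the index and its between-population difference at a single resolution, with made-up clone fractions; SinCHet's D statistic integrates such differences across resolutions:

```python
import math

# Illustrative computation (made-up clone fractions, one clonal resolution):
# the Shannon diversity index of clone proportions, and its difference
# between two populations. SinCHet's D statistic is the area under the
# Profile of Shannon Difference across resolutions.
def shannon(proportions):
    return -sum(p * math.log(p) for p in proportions if p > 0)

pre  = [0.70, 0.20, 0.10]        # clone fractions before treatment
post = [0.25, 0.25, 0.25, 0.25]  # more heterogeneous clonal mix after treatment
diff = shannon(post) - shannon(pre)  # positive: heterogeneity increased
```

A uniform four-clone mix attains the maximum index log(4), so the positive difference here flags an increase in clonal heterogeneity between the two conditions.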

  16. The Triticeae Toolbox: Combining Phenotype and Genotype Data to Advance Small-Grains Breeding

    Directory of Open Access Journals (Sweden)

    Victoria C. Blake

    2016-07-01

Full Text Available The Triticeae Toolbox (T3) is the database schema enabling plant breeders and researchers to combine, visualize, and interrogate the wealth of phenotype and genotype data generated by the Triticeae Coordinated Agricultural Project (TCAP). T3 enables users to define specific data sets for download in formats compatible with the external tools TASSEL, Flapjack, and R, or for use by software residing on the T3 server for operations such as Genome-Wide Association and Genomic Prediction. New T3 tools to assist plant breeders include a Selection Index Generator, analytical tools to compare phenotype trials using common or user-defined indices, and a histogram generator for nursery reports, with applications using the Android OS and a Field Plot Layout Designer in development. Researchers using T3 will soon enjoy the ability to design training sets, define core germplasm sets, and perform multivariate analysis. An increased collaboration with GrainGenes and integration with the small grains reference sequence resources will place T3 in a pivotal role for on-the-fly data analysis, with instant access to the knowledge databases for wheat and barley. T3 software is available under the GNU General Public License and is freely downloadable.

  17. An optimization algorithm for simulation-based planning of low-income housing projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2010-10-01

    Full Text Available Construction of low-income housing projects is a replicated process and is associated with uncertainties that arise from the unavailability of resources. Government agencies and/or contractors have to select a construction system that meets low-income housing projects constraints including project conditions, technical, financial and time constraints. This research presents a framework, using computer simulation, which aids government authorities and contractors in the planning of low-income housing projects. The proposed framework estimates the time and cost required for the construction of low-income housing using pre-cast hollow core with hollow blocks bearing walls. Five main components constitute the proposed framework: a network builder module, a construction alternative selection module, a simulation module, an optimization module and a reporting module. An optimization module utilizing a genetic algorithm enables the defining of different options and ranges of parameters associated with low-income housing projects that influence the duration and total cost of the pre-cast hollow core with hollow blocks bearing walls method. A computer prototype, named LIHouse_Sim, was developed in MS Visual Basic 6.0 as proof of concept for the proposed framework. A numerical example is presented to demonstrate the use of the developed framework and to illustrate its essential features.
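The optimization module's genetic algorithm can be sketched as selection, crossover and mutation over candidate resource assignments. The toy objective and all parameters below are illustrative assumptions, not LIHouse_Sim's model:

```python
import random

# Illustrative genetic algorithm (toy objective, not LIHouse_Sim's model):
# search crew-size / shift-count options minimizing duration plus direct cost
# for a repetitive construction process.
random.seed(42)

def objective(genes):
    crews, shifts = genes
    duration = 100.0 / (crews * shifts)  # more resources -> shorter duration
    cost = 5.0 * crews + 8.0 * shifts    # but higher direct cost
    return duration + cost

def evolve(pop_size=30, generations=40):
    pop = [(random.randint(1, 10), random.randint(1, 3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[:pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(g) for g in zip(a, b)]  # uniform crossover
            if random.random() < 0.2:                      # mutation on crew size
                child[0] = random.randint(1, 10)
            children.append(tuple(child))
        pop = parents + children
    return min(pop, key=objective)

best = evolve()  # near-optimal (crews, shifts) pair
```

In the paper's framework the objective would instead be evaluated by the simulation module over the pre-cast hollow core construction network, but the evolutionary loop has this same shape.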

  18. A Simulation Platform To Model, Optimize And Design Wind Turbines. The Matlab/Simulink Toolbox

    Directory of Open Access Journals (Sweden)

    Anca Daniela HANSEN

    2002-12-01

Full Text Available In recent years, Matlab/Simulink® has become the most widely used software for modeling and simulation of dynamic systems. Wind energy conversion systems are one example of such systems, containing subsystems with different ranges of time constants: wind, turbine, generator, power electronics, transformer and grid. The electrical generator and the power converter need the smallest simulation step and therefore these blocks determine the simulation speed. This paper presents a new and integrated simulation platform for modeling, optimizing and designing wind turbines. The platform contains different simulation tools: Matlab/Simulink, used as the basic modeling tool, HAWC, DIgSILENT and Saber.

  19. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions

    Directory of Open Access Journals (Sweden)

    Brian G. Danaher

    2015-03-01

Full Text Available mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be able to better achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.

  20. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions.

    Science.gov (United States)

    Danaher, Brian G; Brendryen, Håvar; Seeley, John R; Tyler, Milagra S; Woolley, Tim

    2015-03-01

    mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be better able to achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.

  1. SPP Toolbox: Supporting Sustainable Public Procurement in the Context of Socio-Technical Transitions

    Directory of Open Access Journals (Sweden)

    Paula Cayolla Trindade

    2017-12-01

Full Text Available Public procurement can shape production and consumption trends and represents a stimulus for both innovation and diversification in products and services, through a direct increase in demand. In recent years, the interest in demand-side policies has grown and several approaches have emerged, such as Green Public Procurement (GPP), Sustainable Public Procurement (SPP) and Public Procurement of Innovation (PPI), representing strategic goals to be achieved through public procurement. In this context, there is a need to guide and support public organizations in the uptake of GPP, SPP and PPI practices. To respond to the challenges raised by the operationalization of such strategies, this paper proposes a new tool—the SPP Toolbox—for guiding public organizations as they re-think the procurement process, raising their ambitions and broadening their vision, thus changing the organizational approach towards culture, strategies, structures and practices. This toolbox integrates insights from GPP, SPP and PPI objectives and practices, in the context of the emergence of socio-technical transitions. The toolbox coherently links GPP, SPP and PPI, allowing flexibility in terms of goals, yet promoting an increasing complexity of institutionalized practices and skills—from GPP to SPP and then from SPP to PPI, organized in a framework fully integrated into the organizational strategy.

  2. Integration of Lead Discovery Tactics and the Evolution of the Lead Discovery Toolbox.

    Science.gov (United States)

    Leveridge, Melanie; Chung, Chun-Wa; Gross, Jeffrey W; Phelps, Christopher B; Green, Darren

    2018-06-01

    There has been much debate around the success rates of various screening strategies to identify starting points for drug discovery. Although high-throughput target-based and phenotypic screening have been the focus of this debate, techniques such as fragment screening, virtual screening, and DNA-encoded library screening are also increasingly reported as a source of new chemical equity. Here, we provide examples in which integration of more than one screening approach has improved the campaign outcome and discuss how strengths and weaknesses of various methods can be used to build a complementary toolbox of approaches, giving researchers the greatest probability of successfully identifying leads. Among others, we highlight case studies for receptor-interacting serine/threonine-protein kinase 1 and the bromo- and extra-terminal domain family of bromodomains. In each example, the unique insight or chemistries individual approaches provided are described, emphasizing the synergy of information obtained from the various tactics employed and the particular question each tactic was employed to answer. We conclude with a short prospective discussing how screening strategies are evolving, what this screening toolbox might look like in the future, how to maximize success through integration of multiple tactics, and scenarios that drive selection of one combination of tactics over another.

  3. Process Evaluation of a Toolbox-training Program for Construction Foremen in Denmark

    DEFF Research Database (Denmark)

    Jeschke, Katharina Christiane; Kines, Pete; Rasmussen, Liselotte

    2017-01-01

    Daily dialogue between leaders and workers on traditional construction sites is primarily focused on production, quality and time issues, and rarely involves occupational safety and health (OSH) issues. A leadership training program entitled 'Toolbox-training' was developed to improve construction… …for the majority of the foremen, who experienced positive changes in their daily work methods and interactions with their crews, colleagues, leaders, customers and other construction professions. The program is a unique contribution to leadership training in the construction industry, and can potentially…

  4. Modeling, Simulation and Position Control of 3 Degree of Freedom Articulated Manipulator

    Directory of Open Access Journals (Sweden)

    Hossein Sadegh Lafmejani

    2013-09-01

    Full Text Available In this paper, the modeling, simulation and control of a 3 degree of freedom articulated robotic manipulator have been studied. First, we extracted the kinematics and dynamics equations of the mentioned manipulator by using the Lagrange method. In order to validate the analytical model of the manipulator, we compared the model simulated in the simulation environment of Matlab with the model simulated using the SimMechanics toolbox. A sample path has been designed for analyzing the tracking subject. The system has been linearized with feedback linearization and then a PID controller was applied to track a reference trajectory. Finally, the control results have been compared with a nonlinear PID controller.
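    The scheme described above (feedback linearization followed by a linear tracking controller) can be sketched outside MATLAB as well. The Python snippet below is an illustrative toy, not the paper's model: it uses a double integrator as a stand-in for one feedback-linearized joint, arbitrary example gains, and simple Euler integration.

```python
# Illustrative sketch: PID-style tracking control of one feedback-linearized
# joint, reduced to a double integrator q'' = v.  Gains, time step and the
# sinusoidal reference are arbitrary example values, not from the paper.

def simulate(kp=100.0, kd=20.0, ki=0.0, dt=1e-3, t_end=5.0):
    import math
    q, qd, ei = 0.0, 0.0, 0.0      # joint position, velocity, error integral
    t = 0.0
    while t < t_end:
        r, rd = math.sin(t), math.cos(t)   # reference trajectory and its rate
        e = r - q
        ei += e * dt
        v = kp * e + kd * (rd - qd) + ki * ei  # PID law on the linearized plant
        qd += v * dt                # Euler integration of q'' = v
        q += qd * dt
        t += dt
    return abs(math.sin(t_end) - q)  # final tracking error
```

    With the default gains the closed loop is well damped and the final tracking error stays on the order of one percent of the reference amplitude.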

  5. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers…

  6. Exploring Students' Computational Thinking Skills in Modeling and Simulation Projects: : A Pilot Study

    NARCIS (Netherlands)

    Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan

    2015-01-01

    Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other

  7. Towards the petaflop for Lattice QCD simulations the PetaQCD project

    International Nuclear Information System (INIS)

    D'Auriac, Jean-Christian Angles; Carbonell, Jaume; Barthou, Denis

    2010-01-01

    The study and design of a very ambitious petaflop cluster exclusively dedicated to Lattice QCD simulations started in early '08 among a consortium of 7 laboratories (IN2P3, CNRS, INRIA, CEA) and 2 SMEs. This consortium received a grant from the French ANR agency in July '08, and the PetaQCD project kickoff took place in January '09. Building upon several years of fruitful collaborative studies in this area, the aim of this project is to demonstrate that the simulation of a 256 x 128³ lattice can be achieved through the HMC/ETMC software, using a machine with efficient speed/cost/reliability/power consumption ratios. It is expected that this machine can be built out of a rather limited number of processors (e.g. between 1000 and 4000), although capable of a sustained petaflop CPU performance. The proof-of-concept should be a mock-up cluster built as much as possible with off-the-shelf components, and two particularly attractive axes will be mainly investigated, in addition to fast all-purpose multi-core processors: the use of the new brand of IBM-Cell processors (with on-chip accelerators) and the very recent Nvidia GP-GPUs (off-chip co-processors). This cluster will obviously be massively parallel, and heterogeneous. Communication issues between processors, implied by the Physics of the simulation and the lattice partitioning, will certainly be a major key to the project.
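    The communication issue raised at the end of the abstract lends itself to a back-of-envelope estimate. The Python sketch below (not from the project) computes the local sublattice volume and halo (communication surface) per node for one hypothetical even partitioning of the 256 x 128³ lattice across 1024 nodes; the processor-grid shape is an assumption for illustration.

```python
# Back-of-envelope sketch: local volume and halo sites per node when a
# 4-D lattice is split evenly across a processor grid.  The default grid
# (8 x 8 x 4 x 4 = 1024 nodes) is an invented example, not PetaQCD's layout.

def halo_ratio(lattice=(256, 128, 128, 128), grid=(8, 8, 4, 4)):
    local = [l // g for l, g in zip(lattice, grid)]  # local sublattice extents
    volume = 1
    for d in local:
        volume *= d
    # halo sites: one boundary slice per face, two faces per dimension
    halo = sum(2 * volume // d for d in local)
    return volume, halo, halo / volume
```

    For this partitioning each node holds 32 x 16 x 32 x 32 = 524288 sites and must exchange roughly 31% of that volume as halo data per update, which illustrates why interconnect bandwidth dominates the design.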

  8. Visualizing flow fields using acoustic Doppler current profilers and the Velocity Mapping Toolbox

    Science.gov (United States)

    Jackson, P. Ryan

    2013-01-01

    The purpose of this fact sheet is to provide examples of how the U.S. Geological Survey is using acoustic Doppler current profilers (ADCPs) for much more than routine discharge measurements. These instruments are capable of mapping complex three-dimensional flow fields within rivers, lakes, and estuaries. Using the Velocity Mapping Toolbox to process the ADCP data allows detailed visualization of the data, providing valuable information for a range of studies and applications.

  9. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Science.gov (United States)

    Swales, Dustin J.; Pincus, Robert; Bodas-Salcedo, Alejandro

    2018-01-01

    The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.
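    The satellite-simulator idea that COSP packages, translating gridded model cloud properties into synthetic observations, can be illustrated with a toy subcolumn sampler. The snippet below is a deliberately simplified sketch assuming random overlap between layers; it is not COSP's actual algorithm, and the cloud fractions used are invented.

```python
import random

# Toy sketch of the satellite-simulator idea: sample sub-grid subcolumns
# from per-layer cloud fractions and diagnose the total cloud cover a
# downward-looking instrument would report, assuming random overlap.

def subcolumn_cloud_cover(layer_cf, n_subcolumns=10000, seed=42):
    rng = random.Random(seed)   # fixed seed for reproducibility
    cloudy = 0
    for _ in range(n_subcolumns):
        # a subcolumn counts as cloudy if any layer is cloudy in this sample
        if any(rng.random() < cf for cf in layer_cf):
            cloudy += 1
    return cloudy / n_subcolumns
```

    For layer fractions [0.2, 0.3, 0.1] the random-overlap expectation is 1 - 0.8 x 0.7 x 0.9 = 0.496, and the sampled estimate converges to that value as the number of subcolumns grows.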

  10. Modelling and Simulation of National Electronic Product Code Network Demonstrator Project

    Science.gov (United States)

    Mo, John P. T.

    The National Electronic Product Code (EPC) Network Demonstrator Project (NDP) was the first large scale consumer goods track and trace investigation in the world using the full EPC protocol system for applying RFID technology in supply chains. The NDP demonstrated the methods of sharing information securely using the EPC Network, providing authentication to interacting parties, and enhancing the ability to track and trace movement of goods within the entire supply chain involving transactions among multiple enterprises. Due to project constraints, the actual run of the NDP lasted only 3 months and could not deliver consolidated quantitative results. This paper discusses the modelling and simulation of activities in the NDP in a discrete event simulation environment and provides an estimation of the potential benefits that could have been derived from the NDP had it continued for one whole year.
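    A minimal discrete-event kernel of the kind such a study relies on can be sketched in a few lines of Python. The stage structure, delays and dispatch schedule below are invented for illustration and are not taken from the NDP.

```python
import heapq

# Minimal discrete-event sketch: pallets flow through a chain of supply
# stages with fixed transit delays; we count how many clear every stage
# within the simulation horizon.  All parameters are example values.

def run(n_pallets=100, stage_delays=(2.0, 5.0, 1.0), horizon=300.0):
    events = []  # priority queue of (time, pallet_id, next_stage_index)
    for pid in range(n_pallets):
        heapq.heappush(events, (pid * 1.0, pid, 0))  # staggered dispatch
    delivered = 0
    while events:
        t, pid, stage = heapq.heappop(events)
        if t > horizon:
            break                     # events are popped in time order
        if stage == len(stage_delays):
            delivered += 1            # pallet has cleared every stage
        else:
            heapq.heappush(events, (t + stage_delays[stage], pid, stage + 1))
    return delivered
```

    With the defaults every pallet takes 8 time units end to end, so all 100 are delivered well inside the horizon; shrinking the horizon truncates the count, which is the kind of throughput question a longer NDP run could have answered with real data.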

  11. DYNAMIC SYSTEM ANALYSIS WITH pplane8.m (MATLAB® toolbox)

    Directory of Open Access Journals (Sweden)

    Alejandro Regalado-Méndez

    2013-12-01

    Full Text Available In this work, four dynamic systems were analyzed (physical, chemical, ecological and economical), represented by autonomous systems of two ordinary differential equations. The main objective is to prove that pplane8.m is an adequate and efficient computer tool. The analysis of the autonomous systems was given by characterization of their critical points according to Coughanowr & LeBlanc (2009), with the MATLAB® toolbox pplane8.m. The main results are that pplane8.m (Polking, 2009) can quickly and precisely draw the trajectories of each phase plane, easily computes each critical point, and correctly characterizes each equilibrium point of all the autonomous systems studied. Finally, we can say that pplane8.m is a powerful tool to help the teaching-learning process for engineering students.
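    The characterization of critical points that pplane8.m automates rests on the eigenvalues of the Jacobian at each equilibrium. The Python sketch below is an illustration of that idea, not pplane8 itself: it classifies a planar equilibrium from the trace and determinant of a finite-difference Jacobian (degenerate borderline cases are not handled).

```python
# Sketch: classify a critical point of a planar autonomous system
# x' = f(x, y), y' = g(x, y) from the trace and determinant of a
# numerical Jacobian, as in a phase-plane analysis.

def classify(f, g, x0, y0, h=1e-6):
    # central-difference Jacobian at the critical point (x0, y0)
    a = (f(x0 + h, y0) - f(x0 - h, y0)) / (2 * h)
    b = (f(x0, y0 + h) - f(x0, y0 - h)) / (2 * h)
    c = (g(x0 + h, y0) - g(x0 - h, y0)) / (2 * h)
    d = (g(x0, y0 + h) - g(x0, y0 - h)) / (2 * h)
    tr, det = a + d, a * d - b * c
    if det < 0:
        return "saddle"
    disc = tr * tr - 4 * det
    if disc >= 0:                    # real eigenvalues
        return "stable node" if tr < 0 else "unstable node"
    return "stable spiral" if tr < 0 else "unstable spiral"
```

    For example, the origin of x' = y, y' = x is a saddle, while adding damping, x' = y, y' = -x - 0.5y, turns it into a stable spiral.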

  12. Peptide chemistry toolbox - Transforming natural peptides into peptide therapeutics.

    Science.gov (United States)

    Erak, Miloš; Bellmann-Sickert, Kathrin; Els-Heindl, Sylvia; Beck-Sickinger, Annette G

    2018-06-01

    The development of solid phase peptide synthesis has released tremendous opportunities for using synthetic peptides in medicinal applications. In the last decades, peptide therapeutics became an emerging market in pharmaceutical industry. The need for synthetic strategies in order to improve peptidic properties, such as longer half-life, higher bioavailability, increased potency and efficiency is accordingly rising. In this mini-review, we present a toolbox of modifications in peptide chemistry for overcoming the main drawbacks during the transition from natural peptides to peptide therapeutics. Modifications at the level of the peptide backbone, amino acid side chains and higher orders of structures are described. Furthermore, we are discussing the future of peptide therapeutics development and their impact on the pharmaceutical market. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Development of a broad-host synthetic biology toolbox for Ralstonia eutropha and its application to engineering hydrocarbon biofuel production.

    Science.gov (United States)

    Bi, Changhao; Su, Peter; Müller, Jana; Yeh, Yi-Chun; Chhabra, Swapnil R; Beller, Harry R; Singer, Steven W; Hillson, Nathan J

    2013-11-13

    The chemoautotrophic bacterium Ralstonia eutropha can utilize H2/CO2 for growth under aerobic conditions. While this microbial host has great potential to be engineered to produce desired compounds (beyond polyhydroxybutyrate) directly from CO2, little work has been done to develop genetic part libraries to enable such endeavors. We report the development of a toolbox for the metabolic engineering of Ralstonia eutropha H16. We have constructed a set of broad-host-range plasmids bearing a variety of origins of replication, promoters, 5' mRNA stem-loop structures, and ribosomal binding sites. Specifically, we analyzed the origins of replication pCM62 (IncP), pBBR1, pKT (IncQ), and their variants. We tested the promoters P(BAD), T7, P(xyls/PM), P(lacUV5), and variants thereof for inducible expression. We also evaluated a T7 mRNA stem-loop structure sequence and compared a set of ribosomal binding site (RBS) sequences derived from Escherichia coli, R. eutropha, and a computational RBS design tool. Finally, we employed the toolbox to optimize hydrocarbon production in R. eutropha and demonstrated a 6-fold titer improvement using the appropriate combination of parts. We constructed and evaluated a versatile synthetic biology toolbox for Ralstonia eutropha metabolic engineering that could apply to other microbial hosts as well.

  14. Multimodal Imaging of Brain Connectivity Using the MIBCA Toolbox: Preliminary Application to Alzheimer's Disease

    Science.gov (United States)

    Ribeiro, André Santos; Lacerda, Luís Miguel; Silva, Nuno André da; Ferreira, Hugo Alexandre

    2015-06-01

    The Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox is a fully automated all-in-one connectivity analysis toolbox that offers both pre-processing, connectivity, and graph theory analysis of multimodal images such as anatomical, diffusion, and functional MRI, and PET. In this work, the MIBCA functionalities were used to study Alzheimer's Disease (AD) in a multimodal MR/PET approach. Materials and Methods: Data from 12 healthy controls, and 36 patients with EMCI, LMCI and AD (12 patients for each group) were obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database (adni.loni.usc.edu), including T1-weighted (T1-w), Diffusion Tensor Imaging (DTI) data, and 18F-AV-45 (florbetapir) dynamic PET data from 40-60 min post injection (4x5 min). Both MR and PET data were automatically pre-processed for all subjects using MIBCA. T1-w data was parcellated into cortical and subcortical regions-of-interest (ROIs), and the corresponding thicknesses and volumes were calculated. DTI data was used to compute structural connectivity matrices based on fibers connecting pairs of ROIs. Lastly, dynamic PET images were summed, and the relative Standard Uptake Values calculated for each ROI. Results: An overall higher uptake of 18F-AV-45, consistent with an increased deposition of beta-amyloid, was observed for the AD group. Additionally, patients showed significant cortical atrophy (thickness and volume) especially in the entorhinal cortex and temporal areas, and a significant increase in Mean Diffusivity (MD) in the hippocampus, amygdala and temporal areas. Furthermore, patients showed a reduction of fiber connectivity with the progression of the disease, especially for intra-hemispherical connections. Conclusion: This work shows the potential of the MIBCA toolbox for the study of AD, as findings were shown to be in agreement with the literature. Here, only structural changes and beta-amyloid accumulation were considered. Yet, MIBCA is further able to

  15. Changing pattern of landslide risk in Europe - The SafeLand project

    Science.gov (United States)

    Nadim, F.; Kalsnes, B.

    2012-04-01

    The need to protect people and property with a changing pattern of landslide hazard and risk caused by climate change and changes in demography, and the reality for societies in Europe to live with the risk associated with natural hazards, were the motives for the project SafeLand: "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies." SafeLand is a large, integrating research project under the European Commission's 7th Framework Programme (FP7). The project started on 1 May 2009 and will end on 30 April 2012. It involves 27 partners from 12 European countries, and has international collaborators and advisers from China, India, USA, Japan and Hong Kong. SafeLand also involves 25 End-Users from 11 countries. SafeLand is coordinated by the International Centre for Geohazards (ICG) at Norwegian Geotechnical Institute in Norway. Further information on the SafeLand project can be found at its web site http://safeland-fp7.eu/. Main results achieved in SafeLand include: - Various guidelines related to landslide triggering processes and run-out modelling. - Development and testing of several empirical methods for predicting the characteristics of threshold rainfall events for triggering of precipitation-induced landslides, and development of an empirical model for assessing the changes in landslide frequency (hazard) as a function of changes in the demography and population density. - Guideline for landslide susceptibility, hazard and risk assessment and zoning. - New methodologies for physical and societal vulnerability assessment. - Identification of landslide hazard and risk hotspots for Europe. The results show clearly where areas with the largest landslide risk are located in Europe and the objective approach allows a ranking of the countries by exposed area and population. - Different regional and local climate model simulations over selected regions of Europe at spatial resolutions of 10x10 km and 2.8x2.8 km

  16. Load forecasting using different architectures of neural networks with the assistance of the MATLAB toolboxes; Previsao de cargas eletricas utilizando diferentes arquiteturas de redes neurais artificiais com o auxilio das toolboxes do MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Nose Filho, Kenji; Araujo, Klayton A.M.; Maeda, Jorge L.Y.; Lotufo, Anna Diva P. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Ilha Solteira, SP (Brazil)], Emails: kenjinose@yahoo.com.br, klayton_ama@hotmail.com, jorge-maeda@hotmail.com, annadiva@dee.feis.unesp.br

    2009-07-01

    This paper presents a development and implementation of a program to electrical load forecasting with data from a Brazilian electrical company, using four different architectures of neural networks of the MATLAB toolboxes: multilayer backpropagation gradient descendent with momentum, multilayer backpropagation Levenberg-Marquardt, adaptive network based fuzzy inference system and general regression neural network. The program presented a satisfactory performance, guaranteeing very good results. (author)
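    Of the four MATLAB architectures named above, the general regression neural network is the simplest to sketch: it is essentially a kernel-weighted average of the training targets. The Python toy below uses invented load data and an arbitrary smoothing factor, purely to illustrate the mechanism.

```python
import math

# Minimal sketch of a general regression neural network (GRNN): the
# prediction is a Gaussian-kernel-weighted average of training targets.
# Data and the smoothing factor sigma are toy values for illustration.

def grnn_predict(x, train_x, train_y, sigma=0.5):
    weights = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * yi for w, yi in zip(weights, train_y)) / sum(weights)
```

    On a linear toy series of hourly loads, e.g. train_x = [0, 1, 2, 3] with train_y = [100, 120, 140, 160], the prediction at x = 1.5 is exactly 130 by symmetry, which shows the interpolation behaviour that makes GRNNs attractive for quick load-forecasting baselines.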

  17. EEGVIS: A MATLAB toolbox for browsing, exploring, and viewing large datasets

    Directory of Open Access Journals (Sweden)

    Kay A Robbins

    2012-05-01

    Full Text Available Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.

  18. EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.

    Science.gov (United States)

    Robbins, Kay A

    2012-01-01

    Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.
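    The summary views that drive the drill-down workflow can be illustrated with a windowed statistic that flags segments worth inspecting. The Python sketch below is illustrative only (it is not EEGVIS code), and the window length and variance threshold are arbitrary example values.

```python
# Illustrative sketch of a drill-down summary view: partition a long
# signal into fixed windows and flag the start index of any window whose
# variance exceeds a threshold, marking it for detailed inspection.

def flag_windows(signal, window=4, threshold=1.0):
    flagged = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        mean = sum(chunk) / window
        var = sum((s - mean) ** 2 for s in chunk) / window
        if var > threshold:
            flagged.append(start)   # candidate segment for drill-down
    return flagged
```

    A flat signal produces no flags, while a transient spike flags only the window containing it, which is exactly the overview-then-zoom pattern the toolbox supports interactively.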

  19. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    Science.gov (United States)

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  20. Virtual reality in decommissioning projects: experiences, lessons learned and future plans

    International Nuclear Information System (INIS)

    Rindahl, G.; Mark, N.K.F.; Meyer, G.

    2006-01-01

    The work on Virtual Reality (VR) tools for decommissioning planning, dose estimation and work management started at the Norwegian Institute for Energy Technology (IFE) in 1999 in the VR dose project with the Japan Nuclear Cycle Development Institute (JNC), now JAEA. The main aim of this effort has been to help minimize workers' radiation exposure, as well as help to achieve more efficient use of human resources. VR dose is now used in the decommissioning of one of JNC's reactors, the Fugen Nuclear Power Station. This VR decommissioning project has later resulted in a series of projects and applications. In addition to decommissioning, IFE also put great focus on two other branches of VR tools, namely tools for knowledge management, training and education in operating facilities and tools for control room design. During the last years, this work, beginning at different ends, has been converging more and more towards VR technology for use throughout the life cycle of a facility. A VR training simulator for a refuelling machine of the Leningrad NPP (LNPP) developed in cooperation with the Russian Research Centre Kurchatov Institute (RRC KI) is now planned to be used in connection with the decommissioning of the three intact reactors at Chernobyl in Ukraine. In this paper we describe experiences from use of VR in decommissioning processes, as well as results from bringing the VR technology initially developed for planned or productive facilities into the decommissioning toolbox. (author)

  1. High performance simulation for the Silva project using the tera computer

    International Nuclear Information System (INIS)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F.; Boulet, M.; Scheurer, B.; Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A.

    2003-01-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues for optimizing the parallelization of the PRODIGE code on TERA. Thus, we discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to optimize the code in three respects: memory allocation, MPI communications and interconnection network bandwidth usage. We stress the interest of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we shall illustrate our developments. We indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of the laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  2. High performance simulation for the Silva project using the tera computer

    Energy Technology Data Exchange (ETDEWEB)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F. [CS Communication and Systemes, 92 - Clamart (France); Boulet, M.; Scheurer, B. [CEA Bruyeres-le-Chatel, 91 - Bruyeres-le-Chatel (France); Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A. [CEA Saclay, 91 - Gif sur Yvette (France)

    2003-07-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues for optimizing the parallelization of the PRODIGE code on TERA. Thus, we discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to optimize the code in three respects: memory allocation, MPI communications and interconnection network bandwidth usage. We stress the interest of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we shall illustrate our developments. We indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of the laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  3. Java simulations of embedded control systems.

    Science.gov (United States)

    Farias, Gonzalo; Cervin, Anton; Arzén, Karl-Erik; Dormido, Sebastián; Esquembre, Francisco

    2010-01-01

    This paper introduces a new Open Source Java library suited for the simulation of embedded control systems. The library is based on the ideas and architecture of TrueTime, a Matlab toolbox devoted to this topic, and allows Java programmers to simulate the performance of control processes which run in a real-time environment. Such simulations can considerably improve the learning and design of multitasking real-time systems. The choice of Java considerably increases the usability of our library, both because many educators already program in this language and because the library can be easily used by Easy Java Simulations (EJS), a popular modeling and authoring tool that is increasingly used in the field of Control Education. EJS allows instructors, students, and researchers with less programming experience to create advanced interactive simulations in Java. The paper describes the ideas, implementation, and sample use of the new library both for pure Java programmers and for EJS users. The JTT library and some examples are available online at http://lab.dia.uned.es/jtt.
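    The kind of experiment such a simulator supports, checking whether a set of periodic control tasks meets its deadlines under a given scheduling policy, can be sketched compactly. The Python toy below (the JTT library itself is Java) simulates rate-monotonic scheduling on one CPU tick by tick; the task parameters in the examples are invented.

```python
# Toy sketch of a multitasking real-time experiment: periodic tasks
# scheduled rate-monotonically (shortest period = highest priority) on a
# single CPU, counting deadline misses over a given horizon of ticks.

def deadline_misses(tasks, horizon):
    # tasks: list of (period, wcet); implicit deadline = period
    jobs = []     # active jobs as [period, wcet, release_time, remaining]
    misses = 0
    for t in range(horizon):
        for period, wcet in tasks:          # release new jobs
            if t % period == 0:
                jobs.append([period, wcet, t, wcet])
        live = []                           # detect and drop missed jobs
        for j in jobs:
            if t >= j[2] + j[0] and j[3] > 0:
                misses += 1
            else:
                live.append(j)
        jobs = live
        pending = [j for j in jobs if j[3] > 0]
        if pending:                         # run highest-priority job one tick
            pending.sort(key=lambda j: j[0])
            pending[0][3] -= 1
        jobs = [j for j in jobs if j[3] > 0]  # discard completed jobs
    return misses
```

    A feasible task set such as periods/WCETs (4, 1) and (5, 2) runs without misses, while an overloaded set like (2, 2) and (3, 1) starves the long-period task, the kind of effect these simulators make visible to students.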

  4. Collaborative Management of Complex Major Construction Projects: AnyLogic-Based Simulation Modelling

    Directory of Open Access Journals (Sweden)

    Na Zhao

    2016-01-01

    Full Text Available Collaborative management of the complex supply chain system of a major construction project effectively integrates the project's different participants. This paper establishes a simulation model based on AnyLogic to reveal the collaborative elements in the complex supply chain management system, their modes of action, and the transmission of intent information, thereby promoting the development of the participants into a coordinated, co-evolving whole. This study can help improve the efficiency and management of the complex systems of major construction projects.

  5. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Directory of Open Access Journals (Sweden)

    D. J. Swales

    2018-01-01

    Full Text Available The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  6. Extreme climate in China. Facts, simulation and projection

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hui-Jun; Sun, Jian-Qi; Chen, Huo-Po; Zhu, Ya-Li; Zhang, Ying; Jiang, Da-Bang; Lang, Xian-Mei; Fan, Ke; Yu, En-Tao [Chinese Academy of Sciences, Beijing (China). Inst. of Atmospheric Physics; Yang, Song [NOAA Climate Prediction Center, Camp Springs, MD (United States)

    2012-06-15

    In this paper, studies on extreme climate in China including extreme temperature and precipitation, dust weather activity, tropical cyclone activity, intense snowfall and cold surge activity, floods, and droughts are reviewed based on the peer-reviewed publications in recent decades. The review is focused first on the climatological features, variability, and trends in the past half century and then on simulations and projections based on global and regional climate models. As the annual mean surface air temperature (SAT) increased throughout China, heat wave intensity and frequency overall increased in the past half century, with a large rate after the 1980s. The daily or yearly minimum SAT increased more significantly than the mean or maximum SAT. The long-term change in precipitation is predominantly characterized by the so-called southern flood and northern drought pattern in eastern China and by the overall increase over Northwest China. The interdecadal variation of the monsoon, represented by the monsoon weakening at the end of the 1970s, is largely responsible for this change in mean precipitation. Precipitation-related extreme events (e.g., heavy rainfall and intense snowfall) have become more frequent and intense generally over China in the recent years, with large spatial features. Dust weather activity, however, has become less frequent over northern China in the recent years, as a result of weakened cold surge activity, reinforced precipitation, and improved vegetation condition. State-of-the-art climate models are capable of reproducing some features of the mean climate and extreme climate events. However, discrepancies among models in simulating and projecting the mean and extreme climate are also demonstrated by many recent studies. Regional models with higher resolutions often perform better than global models. To predict and project climate variations and extremes, many new approaches and schemes based on dynamical models, statistical methods, or their

  7. UAV Flight Control Based on RTX System Simulation Platform

    Directory of Open Access Journals (Sweden)

    Xiaojun Duan

    2014-03-01

    Full Text Available This paper proposes a UAV flight control system simulation platform based on RTX and Matlab, weighing the advantages and disadvantages of Windows against the real-time extension RTX. Within the platform, the RTW toolbox configuration is set and grt_main.c is modified so that the platform supports online parameter adjustment and fault injection. An operator interface for the platform is developed with CVI, making it effective and promising in application. To achieve real-time performance, current real-time simulation computers mostly rely on a real-time operating system to solve the simulation model, together with a dual-machine framework of host and target; such systems are complex, costly, and generally used for control and semi-physical (hardware-in-the-loop) simulation of practical systems. Control system designers, however, expect to design control laws and run real-time simulations on a single Windows-based computer. The proposed RTX/Matlab simulation platform for UAV flight control addresses this demand.

  8. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis.

    Science.gov (United States)

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of the community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without an advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/.

  9. A Toolbox of Genetically Encoded FRET-Based Biosensors for Rapid l-Lysine Analysis

    Directory of Open Access Journals (Sweden)

    Victoria Steffen

    2016-09-01

    Full Text Available Background: The fast development of microbial production strains for basic and fine chemicals is increasingly carried out in small-scale cultivation systems to allow for higher throughput. Such parallelized systems create a need for new rapid online detection systems to quantify the respective target compound. In this regard, biosensors, especially genetically encoded Förster resonance energy transfer (FRET)-based biosensors, offer tremendous opportunities. As a proof of concept, we have created a toolbox of FRET-based biosensors for the ratiometric determination of l-lysine in fermentation broth. Methods: The sensor toolbox was constructed based on a sensor that consists of an optimized central lysine-/arginine-/ornithine-binding protein (LAO-BP) flanked by two fluorescent proteins (enhanced cyan fluorescent protein (ECFP) and Citrine). Further sensor variants with altered affinity and sensitivity were obtained by circular permutation of the binding protein as well as the introduction of flexible and rigid linkers between the fluorescent proteins and the LAO-BP, respectively. Results: The sensor prototype was applied to monitor the extracellular l-lysine concentration of the l-lysine-producing Corynebacterium glutamicum (C. glutamicum) strain DM1933 in a BioLector® microscale cultivation device. The results matched well with data obtained by HPLC analysis and the ninhydrin assay, demonstrating the high potential of FRET-based biosensors for high-throughput microbial bioprocess optimization.
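
The ratiometric readout described above can be sketched in a few lines. The following is an illustrative Python example, not code from the paper: the binding constant kd and the ratio bounds r_min and r_max are hypothetical calibration values, and a simple single-site binding isotherm is assumed.

```python
# Hedged sketch: invert a single-site binding isotherm,
#   r = r_min + (r_max - r_min) * L / (kd + L),
# to estimate the ligand concentration L from a FRET ratio r.
# r_min, r_max, kd are hypothetical calibration constants.

def lysine_from_ratio(r, r_min=0.5, r_max=2.0, kd=80.0):
    """Estimate [l-lysine] (illustrative units) from the FRET
    ratio r = I_acceptor / I_donor."""
    if not (r_min < r < r_max):
        raise ValueError("FRET ratio outside the calibrated range")
    return kd * (r - r_min) / (r_max - r)
```

At the half-saturation ratio (here 1.25) the estimate equals kd, a quick sanity check on the inversion.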

  10. Brief introduction to project management of full scope simulator for Qinshan 300 MW Nuclear Power Unit

    International Nuclear Information System (INIS)

    Chen Jie

    1996-01-01

    The key points in the development and engineering project management of the full scope simulator for the Qinshan 300 MW Nuclear Power Unit are briefly introduced. The Gantt chart and some project management methods and experience are presented. The analysis of key points along the project procedure will be useful for similar projects

  11. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    Science.gov (United States)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink (The MathWorks, Inc.) environment. These elements, along with a Newton-Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with the Cantera integration was developed using native MATLAB M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
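
The Newton-Raphson solver mentioned above is the generic workhorse for balancing such component networks. The following is a hedged sketch of that idea in Python (T-MATS itself ships MATLAB/Simulink code; nothing here is its actual API): a Newton iteration with a finite-difference Jacobian driving a residual vector to zero.

```python
import numpy as np

def newton_raphson(residual, x0, tol=1e-10, max_iter=50, h=1e-7):
    """Generic Newton-Raphson for square systems, in the spirit of the
    solver T-MATS couples to its block library. 'residual' maps a state
    vector to a residual vector of the same length."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = np.asarray(residual(x), dtype=float)
        if np.linalg.norm(f) < tol:
            return x
        # Finite-difference Jacobian, one column per state variable.
        J = np.empty((f.size, x.size))
        for j in range(x.size):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (np.asarray(residual(xp), dtype=float) - f) / h
        x = x - np.linalg.solve(J, f)
    raise RuntimeError("Newton-Raphson did not converge")
```

In a thermodynamic model the residual would encode the mass and energy balances across the connected blocks; a toy algebraic system stands in here.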

  12. Final results of the supra project : Improved Simulation of Upset Recovery

    NARCIS (Netherlands)

    Fucke, L.; Groen, E.; Goman, M.; Abramov, N.; Wentink, M.; Nooij, S.; Zaichik, L.E.; Khrabrov, A.

    2012-01-01

    The objective of the European research project SUPRA (Simulation of Upset Recovery in Aviation) is to develop technologies that eventually contribute to a reduction of risk of Loss of control - in flight (LOC-I) accidents, today's major cause of fatal accidents in commercial aviation. To this end

  13. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    Full Text Available The Large Atmospheric Computation on the Earth Simulator (LACES) project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high-resolution simulation of Hurricane Earl (1998). The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2) model is shown to parallelize effectively on the Japanese Earth Simulator (ES) supercomputer; however, even using the extensive computing resources of the ES Center (ESC), the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high-resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  14. Evolution and experience with the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00389536; The ATLAS collaboration; Brasolin, Franco; Kouba, Tomas; Schovancova, Jaroslava; Fazio, Daniel; Di Girolamo, Alessandro; Scannicchio, Diana; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander; Lee, Christopher

    2017-01-01

    The Simulation at Point1 project is successfully running standard ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We present our experience with using the Event Service that provides the event-level granularity of computations. We show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources is also presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  15. Evolution and experience with the ATLAS simulation at Point1 project

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Fazio, Daniel; Di Girolamo, Alessandro; Kouba, Tomas; Lee, Christopher; Scannicchio, Diana; Schovancova, Jaroslava; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2016-01-01

    The Simulation at Point1 project is successfully running traditional ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We will present our experience with using the Event Service that provides the event-level granularity of computations. We will show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources will also be presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  16. Evaluation of an Advanced Harmonic Filter for Adjustable Speed Drives using a Toolbox Approach

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian; Hansen, Steffan; Blaabjerg, Frede

    2004-01-01

    A large diversity of solutions exists to reduce the harmonic emission of the 6-pulse Adjustable Speed Drive in order to fulfill the requirements of the international harmonic standards. Among them, new types of advanced harmonic filters have recently gained increased attention due to their good...... a combination of a pre-stored database and new interpolation techniques the toolbox can provide the harmonic data on real applications allowing comparisons between different mitigation solutions....

  17. PredPsych: A toolbox for predictive machine learning based approach in experimental psychology research

    OpenAIRE

    Cavallo, Andrea; Becchio, Cristina; Koul, Atesh

    2016-01-01

    Recent years have seen an increased interest in machine learning based predictive methods for analysing quantitative behavioural data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible software framework. The goal of this work was to build an open-source toolbox – “PredPsych” – that could make these methods readily available to all psychologists. PredPsych is a...

  18. ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.

    Science.gov (United States)

    Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra

    2018-05-08

    Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualize transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
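
Module 1 above quantifies anharmonicity through higher-order statistics. One minimal stand-in for such a statistic is windowed excess kurtosis, which is near zero for Gaussian (harmonic) fluctuations and deviates during anharmonic events; the sketch below is illustrative Python, not ANCA's implementation.

```python
import numpy as np

def windowed_excess_kurtosis(x, window):
    """Fourth-order statistic per non-overlapping window of a 1D
    coordinate trajectory; ~0 for Gaussian (harmonic) fluctuations,
    nonzero when the window contains anharmonic motion. Windows are
    assumed non-constant (nonzero variance)."""
    x = np.asarray(x, dtype=float)
    out = []
    for i in range(0, x.size - window + 1, window):
        w = x[i:i + window]
        m, s = w.mean(), w.std()
        out.append(((w - m) ** 4).mean() / s ** 4 - 3.0)
    return np.array(out)
```

A purely sinusoidal coordinate, for instance, has excess kurtosis of -1.5, well away from the Gaussian value of 0.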

  19. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB

    KAUST Repository

    Klingbeil, G.; Erban, R.; Giles, M.; Maini, P. K.

    2011-01-01

    Motivation: The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new

  20. Switch of Sensitivity Dynamics Revealed with DyGloSA Toolbox for Dynamical Global Sensitivity Analysis as an Early Warning for System's Critical Transition

    Science.gov (United States)

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt, irreversible, and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, and ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation, and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA, a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in the dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC, and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (version R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on 32- and 64-bit Windows and Linux systems. PMID:24367574
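
Of the four GPSA methods named, PRCC (partial rank correlation coefficient) is compact enough to sketch. The Python below is an illustrative re-implementation of the textbook definition, not DyGloSA's MATLAB code: rank-transform the sampled parameters and the model output, regress the other parameters out of both, and correlate the residuals.

```python
import numpy as np

def prcc(params, output):
    """Partial rank correlation of each parameter column in 'params'
    (n_samples x n_params) with 'output' (n_samples,), controlling for
    the remaining parameters."""
    def ranks(a):
        # Rank transform along axis 0 (ties assumed absent in sampled data).
        return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)
    X = ranks(np.asarray(params, dtype=float))
    y = ranks(np.asarray(output, dtype=float).reshape(-1, 1)).ravel()
    n, k = X.shape
    coeffs = []
    for j in range(k):
        # Regress the other (ranked) parameters out of column j and of y.
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
        ry = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
        coeffs.append(float(np.corrcoef(rx, ry)[0, 1]))
    return np.array(coeffs)
```

A parameter that monotonically raises the output yields a PRCC near +1; one that lowers it, near -1.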

  2. Spectral imaging toolbox: segmentation, hyperstack reconstruction, and batch processing of spectral images for the determination of cell and model membrane lipid order.

    Science.gov (United States)

    Aron, Miles; Browning, Richard; Carugo, Dario; Sezgin, Erdinc; Bernardino de la Serna, Jorge; Eggeling, Christian; Stride, Eleanor

    2017-05-12

    Spectral imaging with polarity-sensitive fluorescent probes enables the quantification of cell and model membrane physical properties, including local hydration, fluidity, and lateral lipid packing, usually characterized by the generalized polarization (GP) parameter. With the development of commercial microscopes equipped with spectral detectors, spectral imaging has become a convenient and powerful technique for measuring GP and other membrane properties. The existing tools for spectral image processing, however, are insufficient for processing the large data sets afforded by this technological advancement, and are unsuitable for processing images acquired with rapidly internalized fluorescent probes. Here we present a MATLAB spectral imaging toolbox with the aim of overcoming these limitations. In addition to common operations, such as the calculation of distributions of GP values, generation of pseudo-colored GP maps, and spectral analysis, a key highlight of this tool is reliable membrane segmentation for probes that are rapidly internalized. Furthermore, handling for hyperstacks, 3D reconstruction, and batch processing facilitates analysis of data sets generated by time series, z-stack, and area scan microscope operations. Finally, the object size distribution is determined, which can provide insight into the mechanisms underlying changes in membrane properties and is desirable for, e.g., studies involving model membranes and surfactant-coated particles. Analysis is demonstrated for cell membranes, cell-derived vesicles, model membranes, and microbubbles with the environmentally sensitive probes Laurdan, carboxyl-modified Laurdan (C-Laurdan), Di-4-ANEPPDHQ, and Di-4-AN(F)EPPTEA (FE), for quantification of the local lateral density of lipids or lipid packing. The Spectral Imaging Toolbox is a powerful tool for the segmentation and processing of large spectral imaging datasets, with a reliable method for membrane segmentation, and requires no programming ability.
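
The GP parameter mentioned above has a standard two-channel form, GP = (I_ord - I_dis)/(I_ord + I_dis), where the two intensities come from spectral bands sensitive to ordered and disordered lipid environments (for Laurdan, roughly 440 nm and 490 nm). A per-pixel sketch in Python (the toolbox itself is MATLAB; channel names here are illustrative):

```python
import numpy as np

def gp_map(i_ordered, i_disordered):
    """Per-pixel generalized polarization,
    GP = (I_ord - I_dis) / (I_ord + I_dis), in [-1, 1].
    Pixels with zero total intensity are returned as NaN."""
    i_o = np.asarray(i_ordered, dtype=float)
    i_d = np.asarray(i_disordered, dtype=float)
    total = i_o + i_d
    gp = np.full_like(total, np.nan)
    np.divide(i_o - i_d, total, out=gp, where=total > 0)
    return gp
```

Higher GP indicates a more ordered, tightly packed membrane environment; the NaN masking keeps background pixels out of downstream GP distributions.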

  3. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). 

  4. A Computational Framework for Efficient Low Temperature Plasma Simulations

    Science.gov (United States)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, including electronics, nanomaterial synthesis, and metamaterials. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems involving LTPs. Salient features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is assessed through numerical benchmarks for problems in microdischarge devices, evaluating both accuracy and efficiency. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  5. A New Method to Simulate Free Surface Flows for Viscoelastic Fluid

    Directory of Open Access Journals (Sweden)

    Yu Cao

    2015-01-01

    Full Text Available Free surface flows arise in a variety of engineering applications. To predict the dynamic characteristics of such problems, specific numerical methods are required to accurately capture the shape of the free surface. This paper proposes a new method which combines the Arbitrary Lagrangian-Eulerian (ALE) technique with the Finite Volume Method (FVM) to simulate time-dependent viscoelastic free surface flows. Based on an open source CFD toolbox called OpenFOAM, we designed an ALE-FVM free surface simulation platform. In the meantime, the die-swell flow has been investigated with our proposed platform to make a further analysis of the free surface phenomenon. The results validated the correctness and effectiveness of the proposed method for free surface simulation of both Newtonian and viscoelastic fluids.

  6. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
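
The Monte Carlo combination step can be caricatured in a few lines of Python: each alternative path is represented by a sampler of the preference measure, and alternatives are ranked by mean utility. This is a hedged sketch with hypothetical inputs, not the SIMRAND I program.

```python
import random

def rank_alternatives(alternatives, utility, n_trials=2000, seed=1):
    """Monte Carlo ranking sketch: 'alternatives' maps a name to a
    sampler (rng -> preference measure); each alternative is scored by
    the mean utility of its sampled outcomes, and the best is returned."""
    rng = random.Random(seed)
    scores = {}
    for name, sampler in alternatives.items():
        scores[name] = sum(utility(sampler(rng))
                           for _ in range(n_trials)) / n_trials
    best = max(scores, key=scores.get)
    return best, scores
```

In SIMRAND terms, each sampler would draw the path's variables from the expert-assessed cumulative distribution functions and evaluate the path equations; here a plain uniform sampler stands in.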

  7. A simulation study on the focal plane detector of the LAUE project

    International Nuclear Information System (INIS)

    Khalil, M.; Frontera, F.; Caroli, E.; Virgilli, E.; Valsan, V.

    2015-01-01

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot drastically improving the signal to noise ratio as well as reducing the required size of the detector significantly. In this paper we present a Monte-Carlo simulation study with MEGALIB to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. Then we will show simulations, using the SILVACO semiconductor simulation toolkit, on the optimized detector to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared; a low density material (Silicon) and a high density material (Germanium).

  8. A simulation study on the focal plane detector of the LAUE project

    Energy Technology Data Exchange (ETDEWEB)

    Khalil, M., E-mail: mkhalil@in2p3.fr [APC Laboratory, 10 rue Alice Domon et Léonie Duquet, 75205 Paris Cedex 13 (France); Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy); Frontera, F. [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy); INAF/IASF-Bologna, Via P. Gobetti 101, Bologna (Italy); Caroli, E. [INAF/IASF-Bologna, Via P. Gobetti 101, Bologna (Italy); Virgilli, E.; Valsan, V. [Department of Physics and Earth Sciences, University of Ferrara, Via Saragat, 1, 44100 Ferrara (Italy)

    2015-06-21

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot drastically improving the signal to noise ratio as well as reducing the required size of the detector significantly. In this paper we present a Monte-Carlo simulation study with MEGALIB to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. Then we will show simulations, using the SILVACO semiconductor simulation toolkit, on the optimized detector to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared; a low density material (Silicon) and a high density material (Germanium).

  9. Politician2.0 on Facebook: Information Behavior and Dissemination on Social Networking Sites – Gaps and Best-Practice. Evaluation Results of a novel eParticipation toolbox to let politicians engage with citizens online.

    Directory of Open Access Journals (Sweden)

    Timo Wandhoefer

    2012-01-01

    Full Text Available This article covers our findings on the information behavior and dissemination practices of parliamentary decision-makers using Social Networking Sites like Facebook. The article investigates why politicians use those technologies and integrate them more and more into their everyday workflow. In addition to the purpose of social network usage, the focus of our paper is also on best practices and how to deal with challenges like the authenticity of politicians’ online profiles. The results presented within the remit of this paper are the outcome of 16 semi-structured interviews that took place as part of an evaluation effort within the EU research project WeGov [1]. The overall aim of the project is to develop a toolbox that enriches the dialogue between citizens and politicians on the web.

  10. Numerical solution of multi-group point kinetics equations by the Simulink toolbox of Matlab software

    International Nuclear Information System (INIS)

    Hadad, K.; Mohamadi, A.; Sabet, H.; Ayobian, N.; Khani, M.

    2004-01-01

    The Simulink toolbox of the Matlab software was employed to solve the point kinetics equations with six groups of delayed neutrons. The Adams-Bashforth method showed good convergence in solving the system of simultaneous equations, and the obtained results showed good agreement with other numerical schemes. The flexibility of the package in changing the system parameters and the user-friendly interface make this approach a reliable educational package in revealing the effects of reactivity changes on power excursions
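
The equations being solved are dn/dt = ((rho - beta)/Lambda) n + sum_i lambda_i C_i and dC_i/dt = (beta_i/Lambda) n - lambda_i C_i for i = 1..6. As an illustrative cross-check in Python (using fixed-step RK4 rather than the paper's Adams-Bashforth scheme; the delayed-neutron constants below are the commonly tabulated U-235 six-group values, and the generation time Lambda is an assumed figure):

```python
import numpy as np

# Six-group delayed-neutron fractions and decay constants (1/s) for
# thermal fission of U-235 (commonly tabulated values); the prompt
# generation time is an assumed illustrative figure.
BETA = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
LAM = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])
GEN_TIME = 1e-4  # s

def point_kinetics(rho, t_end, dt=1e-4):
    """Integrate the six-group point kinetics equations with RK4 from
    steady state at n = 1; returns the neutron density n at t_end."""
    beta = BETA.sum()
    n = 1.0
    c = BETA / (LAM * GEN_TIME)  # equilibrium precursor concentrations

    def deriv(n, c):
        dn = (rho - beta) / GEN_TIME * n + np.dot(LAM, c)
        dc = BETA / GEN_TIME * n - LAM * c
        return dn, dc

    for _ in range(int(round(t_end / dt))):
        k1n, k1c = deriv(n, c)
        k2n, k2c = deriv(n + 0.5 * dt * k1n, c + 0.5 * dt * k1c)
        k3n, k3c = deriv(n + 0.5 * dt * k2n, c + 0.5 * dt * k2c)
        k4n, k4c = deriv(n + dt * k3n, c + dt * k3c)
        n += dt * (k1n + 2 * k2n + 2 * k3n + k4n) / 6
        c += dt * (k1c + 2 * k2c + 2 * k3c + k4c) / 6
    return n
```

With zero reactivity the steady state persists, and a small positive step (below prompt critical) produces the expected prompt jump followed by a slow rise.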

  11. Grid integrated distributed PV (GridPV).

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Matthew J.; Coogan, Kyle

    2013-08-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  12. Regensim – Matlab toolbox for renewable energy sources modelling and simulation

    Directory of Open Access Journals (Sweden)

    Cristian Dragoş Dumitru

    2011-12-01

    Full Text Available This paper deals with the implementation and development of a Matlab Simulink library named RegenSim designed for the modeling, simulation and analysis of real hybrid solar-wind-hydro systems connected to local grids. Blocks like wind generators, hydro generators, solar photovoltaic modules and accumulators are implemented. The main objective is the study of the hybrid power system behavior, which allows employing renewable and time-varying energy sources while providing a continuous supply.

  13. Basic Radar Altimetry Toolbox: Tools and Tutorial To Use Radar Altimetry For Cryosphere

    Science.gov (United States)

    Benveniste, J. J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.

    2010-12-01

    Radar altimetry is very much a technique expanding its applications. While considerable effort has been made for oceanography users (including easy-to-use data), the use of those data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious, especially for new altimetry data product users. ESA and CNES thus had the Basic Radar Altimetry Toolbox developed a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future SARAL missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels/in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and a quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007, and has been demonstrated during training courses and scientific meetings.
About 1200 people downloaded it (Summer 2010), with many "newcomers" to altimetry among them, including teachers

  14. Simulations of Aperture Synthesis Imaging Radar for the EISCAT_3D Project

    Science.gov (United States)

    La Hoz, C.; Belyey, V.

    2012-12-01

    EISCAT_3D is a project to build the next generation of incoherent scatter radars endowed with multiple 3-dimensional capabilities that will replace the current EISCAT radars in Northern Scandinavia. Aperture Synthesis Imaging Radar (ASIR) is one of the technologies adopted by the EISCAT_3D project to endow it with imaging capabilities in 3 dimensions, including sub-beam resolution. Complemented by pulse compression, it will provide 3-dimensional images of certain types of incoherent scatter radar targets resolved to about 100 metres at 100 km range, depending on the signal-to-noise ratio. This ability will open new research opportunities to map small structures associated with non-homogeneous, unstable processes such as aurora, summer and winter polar radar echoes (PMSE and PMWE), Naturally Enhanced Ion Acoustic Lines (NEIALs), structures excited by HF ionospheric heating, meteors, space debris, and others. To demonstrate the feasibility of the antenna configurations and the imaging inversion algorithms, a simulation of synthetic incoherent scattering data has been performed. The simulation algorithm incorporates the ability to control the background plasma parameters with non-homogeneous, non-stationary components over an extended 3-dimensional space. Control over the positions of a number of separated receiving antennas, their signal-to-noise ratios and arriving phases allows realistic simulation of a multi-baseline interferometric imaging radar system. The resulting simulated data is fed into various inversion algorithms. This simulation package is a powerful tool to evaluate various antenna configurations and inversion algorithms. Results applied to realistic design alternatives of EISCAT_3D will be described.

  15. PREVIEW behavior modification intervention toolbox (PREMIT: a study protocol for a psychological element of a multicenter project

    Directory of Open Access Journals (Sweden)

    Daniela Kahlert

    2016-08-01

    Full Text Available Background: Losing excess body weight and preventing weight regain by changing lifestyle is a challenging but promising task to prevent the incidence of type-2 diabetes. To be successful, it is necessary to use evidence-based and theory-driven interventions, which also contribute to the science of behavior modification by providing a deeper understanding of successful intervention components. Objective: To develop a physical activity and dietary behavior modification intervention toolbox (PREMIT) that fulfills current requirements of being theory-driven and evidence-based, comprehensively described and feasible to evaluate. PREMIT is part of an intervention trial, which aims to prevent the onset of type-2 diabetes in pre-diabetics in eight clinical centers across the world by guiding them in changing their physical activity and dietary behavior through a group counselling approach. Methods: The program development took five progressive steps, in line with the Public Health Action Cycle: (1) summing up the intervention goal(s), target group and the setting, (2) uncovering the generative psychological mechanisms, (3) identifying behavior change techniques and tools, (4) preparing for evaluation and (5) implementing the intervention and assuring quality. Results: PREMIT is based on a trans-theoretical approach referring to valid behavior modification theories, models and approaches. A major ‘product’ of PREMIT is a matrix, constructed for use by onsite-instructors. The matrix includes objectives, tasks and activities ordered by periods. PREMIT is constructed to help instructors guide participants’ behavior change. To ensure high fidelity and adherence of program-implementation across the eight intervention centers standardized operational procedures were defined and train-the-trainer workshops were held. In summary PREMIT is a theory-driven, evidence-based program carefully developed to change physical activity and dietary behaviors in pre

  16. PREVIEW Behavior Modification Intervention Toolbox (PREMIT): A Study Protocol for a Psychological Element of a Multicenter Project.

    Science.gov (United States)

    Kahlert, Daniela; Unyi-Reicherz, Annelie; Stratton, Gareth; Meinert Larsen, Thomas; Fogelholm, Mikael; Raben, Anne; Schlicht, Wolfgang

    2016-01-01

    Losing excess body weight and preventing weight regain by changing lifestyle is a challenging but promising task to prevent the incidence of type-2 diabetes. To be successful, it is necessary to use evidence-based and theory-driven interventions, which also contribute to the science of behavior modification by providing a deeper understanding of successful intervention components. To develop a physical activity and dietary behavior modification intervention toolbox (PREMIT) that fulfills current requirements of being theory-driven and evidence-based, comprehensively described and feasible to evaluate. PREMIT is part of an intervention trial, which aims to prevent the onset of type-2 diabetes in pre-diabetics in eight clinical centers across the world by guiding them in changing their physical activity and dietary behavior through a group counseling approach. The program development took five progressive steps, in line with the Public Health Action Cycle: (1) Summing-up the intervention goal(s), target group and the setting, (2) uncovering the generative psychological mechanisms, (3) identifying behavior change techniques and tools, (4) preparing for evaluation and (5) implementing the intervention and assuring quality. PREMIT is based on a trans-theoretical approach referring to valid behavior modification theories, models and approaches. A major "product" of PREMIT is a matrix, constructed for use by onsite-instructors. The matrix includes objectives, tasks and activities ordered by periods. PREMIT is constructed to help instructors guide participants' behavior change. To ensure high fidelity and adherence of program-implementation across the eight intervention centers standardized operational procedures were defined and "train-the-trainer" workshops were held. In summary PREMIT is a theory-driven, evidence-based program carefully developed to change physical activity and dietary behaviors in pre-diabetic people.

  17. Solution and implementation of project ''Simulator of WWER-440 nuclear power plant''

    International Nuclear Information System (INIS)

    Koukal, V.

    1985-01-01

    The time data are given of the development and construction of the simulator of a WWER-440 nuclear power plant unit. The individual tasks are summed up which are related to the project implementation, and cooperating institutions and enterprises are listed. (J.C.)

  18. An Emulator Toolbox to Approximate Radiative Transfer Models with Statistical Learning

    Directory of Open Access Journals (Sweden)

    Juan Pablo Rivera

    2015-07-01

    Full Text Available Physically-based radiative transfer models (RTMs) help in understanding the processes occurring on the Earth’s surface and their interactions with vegetation and atmosphere. When it comes to studying vegetation properties, RTMs allow us to study light interception by plant canopies and are used in the retrieval of biophysical variables through model inversion. However, advanced RTMs can take a long computational time, which makes them unfeasible in many real applications. To overcome this problem, it has been proposed to substitute RTMs with so-called emulators. Emulators are statistical models that approximate the functioning of RTMs. Emulators are advantageous in real practice because of their computational efficiency and excellent accuracy and flexibility for extrapolation. We hereby present an “Emulator toolbox” that enables analysing multi-output machine learning regression algorithms (MO-MLRAs) on their ability to approximate an RTM. The toolbox is included in the free-access ARTMO MATLAB suite for parameter retrieval and model inversion and currently contains both linear and non-linear MO-MLRAs, namely partial least squares regression (PLSR), kernel ridge regression (KRR) and neural networks (NN). These MO-MLRAs have been evaluated on their precision and speed to approximate the soil vegetation atmosphere transfer model SCOPE (Soil Canopy Observation, Photochemistry and Energy balance). SCOPE generates, amongst others, sun-induced chlorophyll fluorescence as the output signal. KRR and NN were found capable of reconstructing fluorescence spectra with great precision. Relative errors fell below 0.5% when trained with 500 or more samples using cross-validation and principal component analysis to alleviate the underdetermination problem. Moreover, NN reconstructed fluorescence spectra about 50 times faster and KRR about 800 times faster than SCOPE. The Emulator toolbox is foreseen to open new opportunities in the use of advanced
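    The emulation idea, training a fast statistical model on input-output pairs of an expensive simulator, can be sketched with kernel ridge regression and a toy stand-in for the RTM. The Gaussian "spectrum" generator below is an invented surrogate, not SCOPE, and the kernel width and regularization are illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for an expensive RTM: maps one scalar input to a
    # 50-band spectrum-like output (invented surrogate, not SCOPE)
    def toy_rtm(x):
        wl = np.linspace(0.0, 1.0, 50)
        return np.exp(-((wl[None, :] - x[:, None]) ** 2) / 0.02)

    def rbf_kernel(a, b, gamma=10.0):
        return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

    # Train a multi-output kernel ridge emulator on 200 simulator runs
    x_train = rng.uniform(0.2, 0.8, 200)
    Y_train = toy_rtm(x_train)
    lam = 1e-6  # ridge regularization
    K = rbf_kernel(x_train, x_train)
    weights = np.linalg.solve(K + lam * np.eye(x_train.size), Y_train)

    def emulate(x_new):
        """Predict the full spectrum for new inputs in one matrix product."""
        return rbf_kernel(x_new, x_train) @ weights

    x_test = np.array([0.5])
    err = np.abs(emulate(x_test) - toy_rtm(x_test)).max()
    print(err)
    ```

    Once the weights are solved for, each prediction is a single kernel evaluation and matrix product, which is the source of the large speed-ups over running the simulator itself.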

  19. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States)

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS was focused on the development of a multiphysics, parallel framework application that could enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked and synchronized at regular points in space and time, is the de facto approach to high performance simulation of multiphysics

  20. PROJECTED PRECIPITATION CHANGES IN CENTRAL/EASTERN EUROPE ON THE BASIS OF ENSEMBLE SIMULATIONS

    Directory of Open Access Journals (Sweden)

    Erika Miklos

    2012-03-01

    Full Text Available Projected precipitation changes in Central/Eastern Europe on the basis of ENSEMBLES simulations. For building appropriate local/national adaptation and mitigation strategies, detailed analysis of regional climate change is essential. In order to estimate the climate change for the 21st century, both global and regional models may be used. However, due to their coarse horizontal resolution, global climate models are not appropriate to describe regional-scale climate processes. On the other hand, regional climate models (RCMs) provide more realistic regional climate scenarios. A wide range of RCM experiments was accomplished in the frame of the ENSEMBLES project funded by the EU FP6 program, one of the largest climate change research projects ever completed. All the RCM experiments used 25 km horizontal resolution and the A1B emission scenario, according to which CO2 concentration by 2100 is estimated to exceed 700 ppm, i.e., more than twice the preindustrial level. The 25 km spatial resolution is fine enough to estimate the future hydrology-related conditions in different parts of Europe, from which we separated and analyzed simulated climate data sets for the Central/Eastern European region. Precipitation is an especially important climatological variable because of agricultural aspects and flood-related natural hazards, which may seriously affect all the countries in the evaluated region. On the basis of our results, the different RCM simulations generally project drier summers and wetter winters (compared to the recent decades). The southern countries are more likely to suffer more intense warming, especially in summer, and also more intense drought events due to the stronger Mediterranean impact.

  1. Design of a Toolbox of RNA Thermometers.

    Science.gov (United States)

    Sen, Shaunak; Apurva, Divyansh; Satija, Rohit; Siegal, Dan; Murray, Richard M

    2017-08-18

    Biomolecular temperature sensors can be used for efficient control of large-volume bioreactors, for spatiotemporal imaging and control of gene expression, and to engineer robustness to temperature in biomolecular circuit design. Although RNA-based sensors, called "thermometers", have been investigated in both natural and synthetic contexts, an important challenge is to design diverse responses to temperature differing in sensitivity and threshold. We address this issue by constructing a library of RNA thermometers based on thermodynamic computations and experimentally measuring their activities in cell-free biomolecular "breadboards". Using free energies of the minimum free energy structures as well as melt profile computations, we estimated that a diverse set of temperature responses were possible. We experimentally found a wide range of responses to temperature in the range 29-37 °C with fold-changes varying over 3-fold around the starting thermometer. The sensitivities of these responses ranged over 10-fold around the starting thermometer. We correlated these measurements with computational expectations, finding that although there was no strong correlation for the individual thermometers, overall trends of diversity, fold-changes, and sensitivities were similar. These results present a toolbox of RNA-based circuit elements with diverse temperature responses.
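    The thermodynamic intuition behind such thermometers can be sketched with a two-state model, where the fraction of hairpins in the open (translation-permissive) state follows a Boltzmann factor in the unfolding free energy. The enthalpy and entropy values below are invented for illustration and are not taken from the paper's library.

    ```python
    import numpy as np

    R = 8.314  # gas constant [J/(mol*K)]

    # Illustrative hairpin unfolding thermodynamics (assumed values;
    # chosen to put the melting midpoint just above 37 C)
    DH = 300e3  # unfolding enthalpy [J/mol]
    DS = 960.0  # unfolding entropy [J/(mol*K)]

    def open_fraction(t_celsius):
        """Two-state open (unfolded) fraction at a given temperature."""
        T = t_celsius + 273.15
        dG = DH - T * DS  # free energy of unfolding
        return 1.0 / (1.0 + np.exp(dG / (R * T)))

    for t in (29, 33, 37):
        print(t, open_fraction(t))
    ```

    Shifting DH and DS moves the threshold and steepens or flattens the response, which is one simple way to picture how a library of thermometers with diverse sensitivities and thresholds can arise.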

  2. Comparison of projection skills of deterministic ensemble methods using pseudo-simulation data generated from multivariate Gaussian distribution

    Science.gov (United States)

    Oh, Seok-Geun; Suh, Myoung-Seok

    2017-07-01

    The projection skills of five ensemble methods were analyzed according to simulation skills, training period, and ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation assuming the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were as follows: equal weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square errors and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods generally improved compared with the best member for each category. However, their projection skills are significantly affected by the simulation skills of the ensemble members. The weighted ensemble methods showed better projection skills than non-weighted methods, in particular for the PSD categories having systematic biases and various correlation coefficients. The EWA_NBC showed considerably lower projection skills than the other methods, in particular for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skills, it showed strong sensitivity to the PSD categories, training periods, and number of members. On the other hand, WEA_Tay and WEA_RAC showed relatively superior skills in both accuracy and reliability for all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
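    The contrast between equal-weight averaging with and without bias correction, and skill-weighted averaging, can be sketched on synthetic data. The member biases, noise levels, and inverse-RMSE weighting below are invented stand-ins for the paper's 198 PSD sets and its WEA_RAC scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "truth" and three pseudo-members with different systematic
    # biases and noise levels (invented, mimicking pseudo-simulation data)
    truth = 15.0 + 5.0 * np.sin(np.linspace(0, 4 * np.pi, 120))
    members = np.stack([
        truth + 2.0 + rng.normal(0, 0.5, truth.size),  # warm bias, low noise
        truth - 1.0 + rng.normal(0, 1.0, truth.size),  # cold bias, medium noise
        truth + 0.5 + rng.normal(0, 2.0, truth.size),  # small bias, high noise
    ])
    train = slice(0, 60)    # "training period"
    test = slice(60, 120)   # "projection period"

    def rmse(a, b):
        return np.sqrt(np.mean((a - b) ** 2))

    # Equal-weight average, no bias correction (EWA_NBC-like)
    ewa_nbc = members[:, test].mean(axis=0)

    # Equal-weight average after removing each member's training bias (EWA_WBC-like)
    bias = (members[:, train] - truth[train]).mean(axis=1, keepdims=True)
    corrected = members - bias
    ewa_wbc = corrected[:, test].mean(axis=0)

    # Weight bias-corrected members by inverse training-period RMSE (WEA-like)
    w = np.array([1.0 / rmse(m, truth[train]) for m in corrected[:, train]])
    w /= w.sum()
    wea = np.tensordot(w, corrected[:, test], axes=1)

    print(rmse(ewa_nbc, truth[test]), rmse(ewa_wbc, truth[test]), rmse(wea, truth[test]))
    ```

    On this toy setup the bias-corrected average beats the uncorrected one, and down-weighting the noisy member helps further, mirroring the qualitative ranking reported in the abstract.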

  3. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    Science.gov (United States)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

    NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy, and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for a non-flat sail provided by L'Garde was also tested and compared to a flat plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.
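    The flat-plate baseline used in such comparisons can be sketched with the ideal-sail force law F = (1 + r) P A cos^2(theta). The sail area below is a placeholder, not the L'Garde design value, and the model ignores wrinkles, billow, and non-specular reflection.

    ```python
    import math

    # Solar radiation pressure on a perfectly absorbing surface at 1 AU,
    # from ~1361 W/m^2 solar irradiance divided by the speed of light
    P_1AU = 4.54e-6  # N/m^2

    def sail_thrust(area_m2, theta_rad, reflectivity=1.0):
        """Ideal flat-plate sail thrust [N] tilted theta from the sun line.

        A perfect reflector (reflectivity = 1) doubles the absorbed-photon
        pressure; tilting reduces both intercepted flux and the normal
        momentum component, hence the cos^2 factor.
        """
        return (1.0 + reflectivity) * P_1AU * area_m2 * math.cos(theta_rad) ** 2

    # Placeholder 1200 m^2 sail, sun-facing
    print(sail_thrust(1200.0, 0.0))
    ```

    Even a large sail produces only ~10 mN, which is why the validation requirement is framed as demonstrating thrust over a 30-day arc rather than instantaneously.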

  4. Grid Integrated Distributed PV (GridPV) Version 2.

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Matthew J.; Coogan, Kyle

    2014-12-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  5. FOLDER: A numerical tool to simulate the development of structures in layered media

    Science.gov (United States)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2015-04-01

    FOLDER is a numerical toolbox for modelling deformation in layered media during layer-parallel shortening or extension in two dimensions. FOLDER builds on MILAMIN [1], a finite element method based mechanical solver, with a range of utilities included from the MUTILS package [2]. The numerical mesh is generated using the Triangle software [3]. The toolbox includes features that allow for: 1) designing complex structures such as multi-layer stacks, 2) accurately simulating large-strain deformation of linear and non-linear viscous materials, 3) post-processing of various physical fields such as velocity (total and perturbing), rate of deformation, finite strain, stress, deviatoric stress, pressure, and apparent viscosity. FOLDER is designed to ensure maximum flexibility to configure model geometry, define material parameters, specify the range of numerical parameters in simulations and choose the plotting options. FOLDER is an open-source MATLAB application and comes with a user-friendly graphical interface. The toolbox additionally comprises an educational application that illustrates various analytical solutions of growth rates calculated for the cases of folding and necking of a single layer with interfaces perturbed with a single sinusoidal waveform. We further derive two novel analytical expressions for the growth rate in the cases of folding and necking of a linear viscous layer embedded in a linear viscous medium of finite thickness. We use FOLDER to test the accuracy of single-layer folding simulations using various 1) spatial and temporal resolutions, 2) time integration schemes, and 3) iterative algorithms for non-linear materials. The accuracy of the numerical results is quantified by: 1) comparing them to an analytical solution, if available, or 2) running convergence tests. As a result, we provide a map of the most optimal choice of grid size, time step, and number of iterations to keep the results of the numerical simulations below a given error for a given time
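    One of the classical single-layer folding results such toolboxes compare against is Biot's dominant wavelength for a stiff viscous layer in a weaker matrix. Note the paper's novel expressions treat a matrix of finite thickness; the sketch below is the classical infinite-matrix formula.

    ```python
    import math

    def dominant_wavelength(h, mu_layer, mu_matrix):
        """Biot's dominant fold wavelength for a linear viscous layer of
        thickness h and viscosity mu_layer in an infinite matrix of
        viscosity mu_matrix: L_d = 2*pi*h * (mu_layer / (6*mu_matrix))**(1/3).
        """
        return 2.0 * math.pi * h * (mu_layer / (6.0 * mu_matrix)) ** (1.0 / 3.0)

    # Example: unit-thickness layer with a viscosity contrast of 100
    print(dominant_wavelength(1.0, 100.0, 1.0))
    ```

    The cube-root dependence means even large viscosity contrasts give dominant wavelengths of only a few tens of layer thicknesses, a useful sanity check on simulated fold trains.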

  6. ParTIES: a toolbox for Paramecium interspersed DNA elimination studies.

    Science.gov (United States)

    Denby Wilkes, Cyril; Arnaiz, Olivier; Sperling, Linda

    2016-02-15

    Developmental DNA elimination occurs in a wide variety of multicellular organisms, but ciliates are the only single-celled eukaryotes in which this phenomenon has been reported. Despite considerable interest in ciliates as models for DNA elimination, no standard methods for identification and characterization of the eliminated sequences are currently available. We present the Paramecium Toolbox for Interspersed DNA Elimination Studies (ParTIES), designed for Paramecium species, that (i) identifies eliminated sequences, (ii) measures their presence in a sequencing sample and (iii) detects rare elimination polymorphisms. ParTIES is multi-threaded Perl software available at https://github.com/oarnaiz/ParTIES. ParTIES is distributed under the GNU General Public Licence v3.

  7. The influence of industrial applications on a control system toolbox

    International Nuclear Information System (INIS)

    Clout, P.

    1992-01-01

    Vsystem is an open, advanced software application toolbox for rapidly creating fast, efficient and cost-effective control and data-acquisition systems. Vsystem's modular architecture is designed for single computers, networked computers and workstations running under VAX/VMS or VAX/ELN. At the heart of Vsystem lies Vaccess, a user-extendible real-time database and library of access routines. The application database provides the link to the hardware of the application and can be organized as one database or as separate databases installed on different computers on the network. Vsystem has found application in charged-particle accelerator control, tokamak control, and industrial research, as well as its more recent industrial applications. This paper describes the broad features of Vsystem and the influence that recent industrial applications have had on the software. (author)

  8. Technical skills assessment toolbox: a review using the unitary framework of validity.

    Science.gov (United States)

    Ghaderi, Iman; Manji, Farouq; Park, Yoon Soo; Juul, Dorthea; Ott, Michael; Harris, Ilene; Farrell, Timothy M

    2015-02-01

    The purpose of this study was to create a technical skills assessment toolbox for 35 basic and advanced skills/procedures that comprise the American College of Surgeons (ACS)/Association of Program Directors in Surgery (APDS) surgical skills curriculum and to provide a critical appraisal of the included tools, using the contemporary framework of validity. Competency-based training has become the predominant model in surgical education, and assessment of performance is an essential component. Assessment methods must produce valid results to accurately determine the level of competency. A search was performed, using PubMed and Google Scholar, to identify tools that have been developed for assessment of the targeted technical skills. A total of 23 assessment tools for the 35 ACS/APDS skills modules were identified. Some tools, such as the Objective Structured Assessment of Technical Skill (OSATS) and the Operative Performance Rating System (OPRS), have been tested for more than one procedure. Therefore, 30 modules had at least one assessment tool, with some common surgical procedures being addressed by several tools. Five modules had none. Only 3 studies used Messick's framework to design their validity studies. The remaining studies used an outdated framework on the basis of "types of validity." When analyzed using the contemporary framework, few of these studies demonstrated validity for content, internal structure, and relationship to other variables. This study provides an assessment toolbox for common surgical skills/procedures. Our review shows that few authors have used the contemporary unitary concept of validity for development of their assessment tools. As we progress toward competency-based training, future studies should provide evidence for various sources of validity using the contemporary framework.

  9. 3D numerical simulation of projection welding of square nuts to sheets

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Zhang, W.; Martins, P. A. F.

    2015-01-01

    The challenge of developing a three-dimensional finite element computer program for electro-thermo-mechanical industrial modeling of resistance welding is presented, and the program is applied to the simulation of projection welding of square nuts to sheets. Results are compared with experimental … formulation in order to model the frictional sliding between the square nut projections and the sheets during the welding process. It is proved that the implementation of friction increases the accuracy of the simulations, and the dynamic influence of friction on the process is explained.

  10. Visual NNet: An Educational ANN's Simulation Environment Reusing Matlab Neural Networks Toolbox

    Science.gov (United States)

    Garcia-Roselló, Emilio; González-Dacosta, Jacinto; Lado, Maria J.; Méndez, Arturo J.; Garcia Pérez-Schofield, Baltasar; Ferrer, Fátima

    2011-01-01

    Artificial Neural Networks (ANN's) are nowadays a common subject in different curricula of graduate and postgraduate studies. Due to the complex algorithms involved and the dynamic nature of ANN's, simulation software has been commonly used to teach this subject. This software has usually been developed specifically for learning purposes, because…

  11. Electroacoustical simulation of listening room acoustics for project ARCHIMEDES

    DEFF Research Database (Denmark)

    Bech, Søren

    1989-01-01

    ARCHIMEDES is a psychoacoustics research project, funded under the European EUREKA scheme. Three partners share the work involved: The Acoustics Laboratory of The Technical University of Denmark; Bang and Olufsen of Denmark; and KEF Electronics of England. Its primary objective is to quantify the influence of listening room acoustics on the timbre of reproduced sound. For simulation of the acoustics of a standard listening room, an electroacoustic setup has been built in an anechoic chamber. The setup is based on a computer model of the listening room, and it consists of a number of loudspeakers...

  12. Summer Student Project: GEM Simulation and Gas Mixture Characterization

    CERN Document Server

    Oviedo Perhavec, Juan Felipe

    2013-01-01

    This project is a numerical simulation approach to the design of Gas Electron Multiplier (GEM) detectors. GEMs are a type of gaseous ionization detector that has been proposed as an upgrade for the CMS muon endcap. The main advantages of this technology are high spatial and time resolution and outstanding aging resistance. In this context, the fundamental physical behavior of a GEM is analyzed using coupled ANSYS and Garfield++ software. Essential electron transport properties for several gas mixtures were computed as a function of varying electric and magnetic fields using Garfield++ and Magboltz.

  13. A spike sorting toolbox for up to thousands of electrodes validated with ground truth recordings in vitro and in vivo

    Science.gov (United States)

    Lefebvre, Baptiste; Deny, Stéphane; Gardella, Christophe; Stimberg, Marcel; Jetter, Florian; Zeck, Guenther; Picaud, Serge; Duebel, Jens

    2018-01-01

    In recent years, multielectrode arrays and large silicon probes have been developed to record simultaneously from hundreds to thousands of densely packed electrodes. However, they require novel methods to extract the spiking activity of large ensembles of neurons. Here, we developed a new toolbox to sort spikes from these large-scale extracellular data. To validate our method, we performed simultaneous extracellular and loose patch recordings in rodents to obtain ‘ground truth’ data, where the solution to this sorting problem is known for one cell. The performance of our algorithm was always close to the best expected performance, over a broad range of signal-to-noise ratios, in vitro and in vivo. The algorithm is entirely parallelized and has been successfully tested on recordings with up to 4225 electrodes. Our toolbox thus offers a generic solution to accurately sort spikes from up to thousands of electrodes. PMID:29557782
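
The detection step underlying such spike-sorting pipelines can be illustrated with a minimal threshold-crossing sketch on synthetic data. All amplitudes, spike positions, and the 6×MAD threshold below are illustrative choices, not the published toolbox's actual algorithm (which also clusters and matches templates):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic extracellular trace: Gaussian noise plus large negative spikes
# at known, well-separated positions (the "ground truth").
n_samples = 30_000
trace = rng.normal(0.0, 1.0, n_samples)
true_times = np.arange(500, n_samples - 500, 1400)
for t in true_times:
    trace[t] -= 12.0  # spike amplitude far above the noise floor

# Robust noise estimate via the median absolute deviation (MAD), a common
# choice in spike sorting; threshold at a conservative 6x the noise level.
mad = np.median(np.abs(trace - np.median(trace))) / 0.6745
threshold = -6.0 * mad

# Detect negative threshold crossings, keeping one sample per event
# (each event must be the local minimum of its surrounding window).
below = np.flatnonzero(trace < threshold)
detected = [t for t in below if trace[t] == trace[max(0, t - 30):t + 30].min()]

print(f"inserted {len(true_times)} spikes, detected {len(detected)}")
```

A real pipeline would then extract waveforms around each detected time and cluster them to assign spikes to neurons.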

  14. The APOSTLE project: Local Group kinematic mass constraints and simulation candidate selection

    Science.gov (United States)

    Fattahi, Azadeh; Navarro, Julio F.; Sawala, Till; Frenk, Carlos S.; Oman, Kyle A.; Crain, Robert A.; Furlong, Michelle; Schaller, Matthieu; Schaye, Joop; Theuns, Tom; Jenkins, Adrian

    2016-03-01

    We use a large sample of isolated dark matter halo pairs drawn from cosmological N-body simulations to identify candidate systems whose kinematics match that of the Local Group (LG) of galaxies. We find, in agreement with the `timing argument' and earlier work, that the separation and approach velocity of the Milky Way (MW) and Andromeda (M31) galaxies favour a total mass for the pair of ˜5 × 10¹² M⊙. A mass this large, however, is difficult to reconcile with the small relative tangential velocity of the pair, as well as with the small deceleration from the Hubble flow observed for the most distant LG members. Halo pairs that match these three criteria have average masses a factor of ˜2 times smaller than suggested by the timing argument, but with large dispersion. Guided by these results, we have selected 12 halo pairs with total mass in the range 1.6-3.6 × 10¹² M⊙ for the APOSTLE project (A Project Of Simulating The Local Environment), a suite of hydrodynamical resimulations at various numerical resolution levels (reaching up to ˜10⁴ M⊙ per gas particle) that use the subgrid physics developed for the EAGLE project. These simulations reproduce, by construction, the main kinematics of the MW-M31 pair, and produce satellite populations whose overall number, luminosities, and kinematics are in good agreement with observations of the MW and M31 companions. The APOSTLE candidate systems thus provide an excellent testbed to confront directly many of the predictions of the Λ cold dark matter cosmology with observations of our local Universe.
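
The `timing argument' mentioned above treats the MW and M31 as a two-body radial Kepler orbit started at the Big Bang. A short sketch, using commonly quoted values for the separation, approach velocity, and age of the Universe (assumptions here, not the paper's exact inputs), recovers a total mass of a few × 10¹² M⊙:

```python
import math

# Commonly quoted MW-M31 values (illustrative assumptions):
G     = 6.674e-11          # m^3 kg^-1 s^-2
M_SUN = 1.989e30           # kg
KPC   = 3.086e19           # m
r     = 770 * KPC          # current separation
v_rad = -110e3             # m/s; negative = approaching
t0    = 13.8e9 * 3.156e7   # s; age of the Universe

# Radial Kepler orbit: r = a(1 - cos x), t = sqrt(a^3/GM)(x - sin x),
# which combine into v*t/r = sin(x)(x - sin x)/(1 - cos x)^2.
target = v_rad * t0 / r

def f(x):
    return math.sin(x) * (x - math.sin(x)) / (1.0 - math.cos(x)) ** 2

# f decreases from 0 toward -inf on (pi, 2*pi); bisect for f(x) = target.
lo, hi = math.pi + 1e-6, 2 * math.pi - 1e-3
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if f(mid) > target:
        lo = mid
    else:
        hi = mid
x = 0.5 * (lo + hi)

a = r / (1.0 - math.cos(x))                   # semi-major axis
M = a**3 / G * ((x - math.sin(x)) / t0) ** 2  # total pair mass
print(f"timing-argument mass ~ {M / M_SUN:.1e} Msun")
```

With these inputs the result lands in the few × 10¹² M⊙ range quoted in the abstract; the exact figure shifts with the assumed separation and velocity.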

  15. ALEA: a toolbox for allele-specific epigenomics analysis.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Heravi-Moussavi, Alireza; Cheng, Jeffrey B; Costello, Joseph F; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2014-04-15

    The assessment of expression and epigenomic status using sequencing-based methods provides an unprecedented opportunity to identify and correlate allelic differences with epigenomic status. We present ALEA, a computational toolbox for allele-specific epigenomics analysis, which incorporates allelic variation data within existing resources, allowing for the identification of significant associations between epigenetic modifications and specific allelic variants in human and mouse cells. ALEA provides a customizable pipeline of command line tools for allele-specific analysis of next-generation sequencing data (ChIP-seq, RNA-seq, etc.) that takes the raw sequencing data and produces separate allelic tracks ready to be viewed on genome browsers. The pipeline has been validated using human and hybrid mouse ChIP-seq and RNA-seq data. The package, test data and usage instructions are available online at http://www.bcgsc.ca/platform/bioinfo/software/alea. Contact: mkarimi1@interchange.ubc.ca or sjones@bcgsc.ca. Supplementary information: Supplementary data are available at Bioinformatics online.

  16. Rainfall and Extratropical Transition of Tropical Cyclones: Simulation, Prediction, and Projection

    Science.gov (United States)

    Liu, Maofeng

    Rainfall and associated flood hazards are one of the major threats of tropical cyclones (TCs) to coastal and inland regions. The interaction of TCs with extratropical systems can lead to enhanced precipitation over enlarged areas through extratropical transition (ET). To achieve a comprehensive understanding of rainfall and ET associated with TCs, this thesis conducts weather-scale analyses focusing on individual storms and climate-scale analyses focusing on seasonal predictability and the changing properties of climatology under global warming. The temporal and spatial rainfall evolution of individual storms, including Hurricane Irene (2011), Hurricane Hanna (2008), and Hurricane Sandy (2012), is explored using the Weather Research and Forecasting (WRF) model and a variety of hydrometeorological datasets. ET and orographic mechanisms were two key drivers of the rainfall distribution of Irene over the regions that experienced the most severe flooding. The change of TC rainfall under global warming is explored with the Forecast-oriented Low Ocean Resolution (FLOR) climate model under the representative concentration pathway (RCP) 4.5 scenario. Despite decreased TC frequency, FLOR projects increased landfalling TC rainfall over most regions of the eastern United States, highlighting the risk of increased flood hazards. An increased storm rain rate is an important contributor to the increased landfalling TC rainfall. A higher-atmospheric-resolution version of FLOR (HiFLOR) projects increased TC rainfall at global scales. The increase of TC intensity and of environmental water vapor content, scaled by the Clausius-Clapeyron relation, are two key factors that explain the projected increase of TC rainfall. Analyses of the simulation, prediction, and projection of ET activity with FLOR are conducted in the North Atlantic. The FLOR model exhibits good skill in simulating many aspects of the present-day ET climatology. The 21st-century projection under the RCP4.5 scenario demonstrates the dominant role of ET

  17. SU-E-T-58: A Novel Monte Carlo Photon Transport Simulation Scheme and Its Application in Cone Beam CT Projection Simulation

    International Nuclear Information System (INIS)

    Xu, Y; Tian, Z; Jiang, S; Jia, X; Zhou, L

    2015-01-01

    Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion; the lack of control over particle trajectories is a main cause of low efficiency in some applications. Take cone beam CT (CBCT) projection simulation as an example: a significant amount of computation is wasted on transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path that starts at the x-ray source and, after a set of interactions, ends at the detector. In the proposed scheme, we sampled an entire photon path each time. The Metropolis-Hastings algorithm was employed to accept or reject a sampled path based on a calculated acceptance probability, in order to maintain the correct relative probabilities among different paths, which are governed by photon transport physics. We developed a GPU package, gMMC, implementing this new scheme. The performance of gMMC was tested in a sample problem of CBCT projection simulation for a homogeneous object. The results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered photon signals in gMMC agreed with those from gMCDRR to within a relative difference of 3%. It took 3.1 hr for gMCDRR to simulate 7.8e11 photons and 246.5 sec for gMMC to simulate 1.4e10 paths. Under this setting, both results attained the same ∼2% statistical uncertainty. Hence, a speed-up factor of ∼45.3 was achieved by this new path-by-path simulation scheme, in which all the computation is spent on photons that contribute to the detector signal. Conclusion: We innovatively proposed a novel path-by-path simulation scheme that enabled a significant efficiency enhancement for MC particle

  18. SU-E-T-58: A Novel Monte Carlo Photon Transport Simulation Scheme and Its Application in Cone Beam CT Projection Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Y [UT Southwestern Medical Center, Dallas, TX (United States); Southern Medical University, Guangzhou (China); Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Zhou, L [Southern Medical University, Guangzhou (China)

    2015-06-15

    Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion; the lack of control over particle trajectories is a main cause of low efficiency in some applications. Take cone beam CT (CBCT) projection simulation as an example: a significant amount of computation is wasted on transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path that starts at the x-ray source and, after a set of interactions, ends at the detector. In the proposed scheme, we sampled an entire photon path each time. The Metropolis-Hastings algorithm was employed to accept or reject a sampled path based on a calculated acceptance probability, in order to maintain the correct relative probabilities among different paths, which are governed by photon transport physics. We developed a GPU package, gMMC, implementing this new scheme. The performance of gMMC was tested in a sample problem of CBCT projection simulation for a homogeneous object. The results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered photon signals in gMMC agreed with those from gMCDRR to within a relative difference of 3%. It took 3.1 hr for gMCDRR to simulate 7.8e11 photons and 246.5 sec for gMMC to simulate 1.4e10 paths. Under this setting, both results attained the same ∼2% statistical uncertainty. Hence, a speed-up factor of ∼45.3 was achieved by this new path-by-path simulation scheme, in which all the computation is spent on photons that contribute to the detector signal. Conclusion: We innovatively proposed a novel path-by-path simulation scheme that enabled a significant efficiency enhancement for MC particle
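
The accept/reject step at the heart of the path-by-path scheme is the standard Metropolis-Hastings rule. A minimal sketch on a toy one-dimensional target (a stand-in for the path probability; no photon physics is modeled here) shows the mechanics:

```python
import math
import random

random.seed(42)

# Toy unnormalized target density: a standard Gaussian stands in for the
# path probability assigned by photon transport physics in the abstract.
def target(x):
    return math.exp(-0.5 * x * x)

# Metropolis-Hastings with a symmetric random-walk proposal: a proposed
# state is accepted with probability min(1, target(new)/target(old)),
# which preserves the correct relative probabilities among states -- the
# same principle used to accept or reject whole photon paths.
x, samples = 0.0, []
for _ in range(100_000):
    proposal = x + random.uniform(-1.0, 1.0)
    if random.random() < min(1.0, target(proposal) / target(x)):
        x = proposal          # accept the proposed state
    samples.append(x)         # on rejection, the old state is repeated

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"sample mean {mean:.3f}, sample variance {var:.3f}")
```

The sample mean and variance converge toward the target's 0 and 1, confirming that the accept/reject rule reproduces the intended distribution without ever normalizing it.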

  19. Recent advances of molecular toolbox construction expand Pichia pastoris in synthetic biology applications.

    Science.gov (United States)

    Kang, Zhen; Huang, Hao; Zhang, Yunfeng; Du, Guocheng; Chen, Jian

    2017-01-01

    Pichia pastoris (reclassified as Komagataella phaffii), a methylotrophic yeast, has been widely used for heterologous protein production because of its unique advantages, such as readily achievable high-density fermentation, tractable genetic modification and typical eukaryotic post-translational modifications. More recently, P. pastoris has also gained much attention as a metabolic pathway engineering platform. In this mini-review, we address recent advances in the molecular toolboxes established for P. pastoris, including synthetic promoters, signal peptides and genome engineering tools. Furthermore, the applications of P. pastoris in synthetic biology are also discussed and prospected, especially in the context of genome-scale metabolic pathway analysis.

  20. The Symbiose project: an integrated framework for performing environmental radiological risk assessment

    International Nuclear Information System (INIS)

    Gonze, M.A.; Mourlon, C.; Garcia-Sanchez, L.; Beaugelin, K.; Chen, T.; Le Dizes, S.

    2004-01-01

    Human health and ecological risk assessments usually require the integration of a wide range of environmental data and modelling approaches, with a level of detail dependent on the management objectives, the complexity of the site and the degree of ignorance about the pollutant's behaviour and toxicity. Like many scientists and assessors in recent years, we recognized the need for developing comprehensive, integrated and flexible approaches to risk assessment. To meet these needs, IRSN launched the Symbiose project (2002-2006), which aims, first, at designing a framework for integrating and managing data, methods and knowledge relevant to studies assessing radiological risk to humans and biota and, second, at implementing this framework in an information management, modelling and calculation platform. Feasibility developments (now completed) led to the specification of a fully integrated, object-oriented and hierarchical approach for describing the fate, transport and effects of radionuclides in spatially distributed environmental systems. This innovative approach has then been implemented in a platform prototype, the main components of which are a user-friendly and modular simulation environment (e.g. using the GoldSim toolbox) and a hierarchical object-oriented biosphere database. Both the conceptual and the technical developments will be presented here. (author)

  1. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    CERN Document Server

    Abbrescia, M; Aiola, S; Antolini, R; Avanzini, C; Baldini Ferroli, R; Bencivenni, G; Bossini, E; Bressan, E; Chiavassa, A; Cicalò, C; Cifarelli, L; Coccia, E; De Gruttola, D; De Pasquale, S; Di Giovanni, A; D'Incecco, M; Dreucci, M; Fabbri, F L; Frolov, V; Garbini, M; Gemme, G; Gnesi, I; Gustavino, C; Hatzifotiadou, D; La Rocca, P; Li, S; Librizzi, F; Maggiora, A; Massai, M; Miozzi, S; Panareo, M; Paoletti, R; Perasso, L; Pilo, F; Piragino, G; Regano, A; Riggi, F; Righini, G C; Sartorelli, G; Scapparone, E; Scribano, A; Selvi, M; Serci, S; Siddi, E; Spandre, G; Squarcia, S; Taiuti, M; Tosello, F; Votano, L; Williams, M C S; Yánez, G; Zichichi, A; Zuyeuski, R

    2014-01-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian high schools. Each tracking detector, called an EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far-away coincidences over a total area of about 3 × 10⁵ km². In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  2. BlueSky ATC Simulator Project : An Open Data and Open Source Approach

    NARCIS (Netherlands)

    Hoekstra, J.M.; Ellerbroek, J.

    2016-01-01

    To advance ATM research as a science, ATM research results should be made more comparable. A possible way to do this is to share tools and data. This paper presents a project that investigates the feasibility of a fully open-source and open-data approach to air traffic simulation. Here, the first of

  3. Evaluating IPCC AR4 cool-season precipitation simulations and projections for impacts assessment over North America

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, Stephanie A. [The University of Arizona, Department of Geosciences, Tucson, AZ (United States); The Wilderness Society, Anchorage, AK (United States); Russell, Joellen L.; Goodman, Paul J. [The University of Arizona, Department of Geosciences, Tucson, AZ (United States)

    2011-12-15

    General circulation models (GCMs) have demonstrated success in simulating global climate, and they are critical tools for producing regional climate projections consistent with global changes in radiative forcing. GCM output is currently being used in a variety of ways for regional impacts projection. However, more work is required to assess model bias and evaluate whether assumptions about the independence of model projections and error are valid. This is particularly important where models do not display offsetting errors. Comparing simulated 300-hPa zonal winds and precipitation for the late 20th century with reanalysis and gridded precipitation data shows statistically significant and physically plausible associations between positive precipitation biases across all models and a marked increase in zonal wind speed around 30°N, as well as distortions in rain shadow patterns. Over the western United States, GCMs project drier conditions to the south and increasing precipitation to the north. There is a high degree of agreement between models, and many studies have made strong statements about implications for water resources and about ecosystem change on that basis. However, since one of the mechanisms driving changes in winter precipitation patterns appears to be associated with a source of error in simulating mean precipitation in the present, it suggests that greater caution should be used in interpreting impacts related to precipitation projections in this region and that standard assumptions underlying bias correction methods should be scrutinized. (orig.)

  4. Matlab Stability and Control Toolbox: Trim and Static Stability Module

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.

    2006-01-01

    This paper presents the technical background of the Trim and Static module of the Matlab Stability and Control Toolbox. This module performs a low-fidelity stability and control assessment of an aircraft model for a set of flight critical conditions. This is attained by determining if the control authority available for trim is sufficient and if the static stability characteristics are adequate. These conditions can be selected from a prescribed set or can be specified to meet particular requirements. The prescribed set of conditions includes horizontal flight, take-off rotation, landing flare, steady roll, steady turn and pull-up/ push-over flight, for which several operating conditions can be specified. A mathematical model was developed allowing for six-dimensional trim, adjustable inertial properties, asymmetric vehicle layouts, arbitrary number of engines, multi-axial thrust vectoring, engine(s)-out conditions, crosswind and gyroscopic effects.
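
The trim assessment described above amounts to solving for control settings that zero the net forces and moments. A toy longitudinal example, with all aerodynamic coefficients invented for illustration (the actual toolbox handles six-dimensional trim), solves for the angle of attack and elevator deflection:

```python
import numpy as np

# Hypothetical longitudinal coefficients (per radian); all values are
# invented for illustration, not taken from the toolbox.
CL0, CLa, CLde = 0.30, 5.5, 0.40     # lift:   CL = CL0 + CLa*a + CLde*de
Cm0, Cma, Cmde = 0.05, -0.60, -1.20  # moment: Cm = Cm0 + Cma*a + Cmde*de

CL_req = 0.80  # lift coefficient needed to support the weight at this speed

# Trim = solve CL(a, de) = CL_req and Cm(a, de) = 0 for (a, de).
A = np.array([[CLa, CLde],
              [Cma, Cmde]])
b = np.array([CL_req - CL0, -Cm0])
alpha, delta_e = np.linalg.solve(A, b)

# Residual check: both trim conditions should now be met.
CL = CL0 + CLa * alpha + CLde * delta_e
Cm = Cm0 + Cma * alpha + Cmde * delta_e
print(f"alpha={alpha:.4f} rad, delta_e={delta_e:.4f} rad, CL={CL:.3f}, Cm={Cm:.1e}")
```

If the resulting delta_e exceeded the elevator's deflection limit, the control authority available for trim would be judged insufficient at that flight condition, which is exactly the kind of check the module performs.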

  5. Integrated Vehicle Health Management Project-Modeling and Simulation for Wireless Sensor Applications

    Science.gov (United States)

    Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.

    2009-01-01

    This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.

  6. project SENSE : multimodal simulation with full-body real-time verbal and nonverbal interactions

    NARCIS (Netherlands)

    Miri, Hossein; Kolkmeier, Jan; Taylor, Paul Jonathon; Poppe, Ronald; Heylen, Dirk; Poppe, Ronald; Meyer, John-Jules; Veltkamp, Remco; Dastani, Mehdi

    2016-01-01

    This paper presents a multimodal simulation system, project-SENSE, that combines virtual reality and full-body motion capture technologies with real-time verbal and nonverbal communication. We introduce the technical setup and employed hardware and software of a first prototype. We discuss the

  7. An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms.

    Science.gov (United States)

    Howard, Steven J; Melhuish, Edward

    2017-06-01

    Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children's emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children's abilities to enable research and educational applications.

  8. Various Solution Methods, Accompanied by Dynamic Investigation, for the Same Problem as a Means for Enriching the Mathematical Toolbox

    Science.gov (United States)

    Oxman, Victor; Stupel, Moshe

    2018-01-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  9. Process, cost modeling and simulations for integrated project development of biomass for fuel and protein

    International Nuclear Information System (INIS)

    Pannir Selvam, P.V.; Wolff, D.M.B.; Souza Melo, H.N.

    1998-01-01

    The construction of models for biomass project development is described. These models, first constructed using the QPRO electronic spreadsheet for Windows, are now being developed with the aid of visual and object-oriented programming tools, using DELPHI V.1 for Windows and the process simulator SUPERPRO V.2.7 (Intelligent Inc.). These models allow process development problems with economic objectives to be solved very rapidly. The preliminary analyses of cost and investments of biomass utilisation projects included in this study cover: steam, ammonia, carbon dioxide and alkali pretreatment processes; methane production using the anaerobic digestion process; aerobic composting; ethanol fermentation and distillation; effluent treatment using high-rate algae production; and cogeneration of energy for drying. The main projects under development are biomass valuation projects with elephant (Napier) grass, sugar cane bagasse and microalgae, using models for mass balance, equipment and production cost. Sensitivity analyses are carried out to account for stochastic variation of the process yield, production volume and prices, using the Monte Carlo method. These models allow the identification of economic and scale-up problems of the technology. Results obtained from a few preliminary case studies of integrated project development for fuel and protein, using the process and cost simulation models, are reported. (author)

  10. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  11. Direct simulation Monte Carlo method for gas flows in micro-channels with bends with added curvature

    Directory of Open Access Journals (Sweden)

    Tisovský Tomáš

    2017-01-01

    Gas flows in micro-channels are simulated using dsmcFOAM, an open-source Direct Simulation Monte Carlo (DSMC) code for general application to rarefied gas flows, written within the framework of the open-source C++ toolbox OpenFOAM. The aim of this paper is to investigate the flow in a micro-channel bend with added curvature. Results are compared with flows in a channel without added curvature and in an equivalent straight channel. The effects of micro-channel bends were already thoroughly investigated by White et al.; the geometry proposed by White is also used here for reference.
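
DSMC codes represent the gas by simulator particles whose velocities are drawn from the Maxwell-Boltzmann distribution. A minimal sketch (illustrative argon parameters; not dsmcFOAM itself) checks the sampled mean speed against the analytic value sqrt(8kT/πm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Argon near room temperature (illustrative values).
kB, T, m = 1.380649e-23, 300.0, 6.63e-26   # J/K, K, kg

# Each velocity component of a Maxwell-Boltzmann gas is Gaussian with
# standard deviation sqrt(kB*T/m); DSMC codes draw particle velocities
# from this distribution when initializing or injecting particles.
sigma = np.sqrt(kB * T / m)
v = rng.normal(0.0, sigma, size=(400_000, 3))
mean_speed = np.linalg.norm(v, axis=1).mean()

analytic = np.sqrt(8.0 * kB * T / (np.pi * m))
print(f"sampled {mean_speed:.1f} m/s vs analytic {analytic:.1f} m/s")
```

The full DSMC algorithm adds streaming, boundary interactions, and stochastic binary collisions on top of this kinetic sampling; the sketch only verifies the velocity distribution itself.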

  12. Evaluation of an educational "toolbox" for improving nursing staff competence and psychosocial work environment in elderly care: results of a prospective, non-randomized controlled intervention.

    Science.gov (United States)

    Arnetz, J E; Hasson, H

    2007-07-01

    Lack of professional development opportunities among nursing staff is a major concern in elderly care and has been associated with work dissatisfaction and staff turnover. There is a lack of prospective, controlled studies evaluating the effects of educational interventions on nursing competence and work satisfaction. The aim of this study was to evaluate the possible effects of an educational "toolbox" intervention on nursing staff ratings of their competence, psychosocial work environment and overall work satisfaction. The study was a prospective, non-randomized, controlled intervention involving nursing staff in two municipal elderly care organizations in western Sweden. In an initial questionnaire survey, nursing staff in the intervention municipality described several areas in which they felt a need for competence development. Measurement instruments and educational materials for improving staff knowledge and work practices were then collated by researchers and managers into a "toolbox." Nursing staff ratings of their competence and work were measured pre- and post-intervention by questionnaire. Staff ratings in the intervention municipality were compared to staff ratings in the reference municipality, where no toolbox was introduced. Nursing staff ratings of their competence and psychosocial work environment, including overall work satisfaction, improved significantly over time in the intervention municipality compared to the reference group. Both competence and work environment ratings were largely unchanged among reference municipality staff. Multivariate analysis revealed a significant interaction effect between municipalities over time for nursing staff ratings of participation, leadership, performance feedback and skills development. Staff ratings for these four scales improved significantly in the intervention municipality as compared to the reference municipality. Compared to a reference municipality, nursing staff ratings of their competence and the

  13. Hanford Waste Simulants Created to Support the Research and Development on the River Protection Project - Waste Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Eibling, R.E.

    2001-07-26

    The development of nonradioactive waste simulants to support bench- and pilot-scale testing for the River Protection Project - Waste Treatment Plant is crucial to the design of the facility. This report documents the simulant development performed to support the SRTC programs and the strategies used to produce the simulants.

  14. Prognostic simulation of reinjection-research project geothermal site Neustadt-Glewe/Germany

    Energy Technology Data Exchange (ETDEWEB)

    Poppei, J. [Geothermie Neubrandenburg GmbH (Germany)

    1995-03-01

    For the first time since the political and economic changes in Germany, a hydrothermal site was put into operation in December 1994. Due to prevailing conditions that are extraordinary for Central Europe (reservoir temperature 99°C; salinity 220 g/l), the Neustadt-Glewe project is supported by a comprehensive research program. The wells concerned (a doublet with an internal distance of 1,400 m) open the porous sandstone aquifer, with an average thickness of about 53 m, at a depth of 2,240 m. One point of interest was the pressure and temperature behavior over a period of 10 years, considering the fluid viscosity changes due to variable injection temperature. For reservoir simulation and prognosis of the injection behavior, the simulator code TOUGH2 was used.

  15. Online Simulation of Radiation Track Structure Project

    Science.gov (United States)

    Plante, Ianik

    2015-01-01

    Space radiation comprises protons, helium nuclei and high charge and energy (HZE) particles. High-energy particles are a concern for human space flight because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and the positions of the radiolytic species, called the radiation track structure, is highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regard to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.

  16. NPV risk simulation of an open pit gold mine project under the O'Hara cost model by using GAs

    Institute of Scientific and Technical Information of China (English)

    Franco-Sepulveda Giovanni; Campuzano Carlos; Pineda Cindy

    2017-01-01

    This paper analyzes an open pit gold mine project based on the O'Hara cost model. Hypothetical data are proposed based on different authors that have studied open pit gold projects, and variations are introduced according to the probability distributions associated with key variables affecting the NPV, such as production level, ore grade, and price of ore, to support what-if analysis for a gold open pit mine project of 3,000 metric tons of ore per day. Two scenarios were analyzed to simulate the NPV: one where only low-certainty data are available, and another where the available information is of high certainty. Results of genetic algorithm metaheuristic simulations, which combine Monte Carlo simulations provided by the Palisade Risk software, the O'Hara cost model, net smelter return, and the financial analysis tools offered by Excel, are reported in order to determine which project variables the NPV is most sensitive to.
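    The core of such an NPV risk simulation can be sketched in a few lines of Python. Every distribution and parameter below (grade, price, operating cost, capital cost, mine life) is an invented placeholder, not a value from the O'Hara model or the paper:

```python
import random

def npv(cash_flows, rate):
    """Discount yearly cash flows (years 1..n) to present value."""
    return sum(cf / (1.0 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

def simulate_npv(n_runs=10000, rate=0.10, seed=42):
    """Monte Carlo NPV for a hypothetical 3,000 t/day open-pit gold mine."""
    rng = random.Random(seed)
    results = []
    tonnes = 3000 * 350                               # t of ore per year
    for _ in range(n_runs):
        grade = max(rng.gauss(1.5, 0.3), 0.0)         # g/t, hypothetical
        price = max(rng.gauss(40.0, 5.0), 0.0)        # $/g, hypothetical
        opex = rng.gauss(25.0, 3.0) * tonnes / 1e6    # M$/year, hypothetical
        revenue = grade * tonnes * price * 0.9 / 1e6  # 90% recovery assumed
        cash_flows = [revenue - opex] * 10            # 10-year life assumed
        results.append(npv(cash_flows, rate) - 100.0) # 100 M$ capex assumed
    return results

res = simulate_npv()
mean_npv = sum(res) / len(res)
```

Sensitivity analysis then amounts to repeating such runs while perturbing one input distribution at a time and comparing the spread of the resulting NPV samples.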

  17. ObsPy: A Python Toolbox for Seismology - Recent Developments and Applications

    Science.gov (United States)

    Megies, T.; Krischer, L.; Barsch, R.; Sales de Andrade, E.; Beyreuther, M.

    2014-12-01

    ObsPy (http://www.obspy.org) is a community-driven, open-source project dedicated to building a bridge for seismology into the scientific Python ecosystem. It offers: a) read and write support for essentially all commonly used waveform, station, and event metadata file formats with a unified interface; b) a comprehensive signal processing toolbox tuned to the needs of seismologists; c) integrated access to all large data centers, web services and databases; and d) convenient wrappers to legacy codes like libtau and evalresp. Python, currently the most popular language for teaching introductory computer science courses at top-ranked U.S. departments, is a full-blown programming language with the flexibility of an interactive scripting language. Its extensive standard library and large variety of freely available high quality scientific modules cover most needs in developing scientific processing workflows. Together with packages like NumPy, SciPy, Matplotlib, IPython, Pandas, lxml, and PyQt, ObsPy enables the construction of complete workflows in Python. These vary from reading locally stored data or requesting data from one or more different data centers, through signal analysis and data processing, to visualizations in GUI and web applications, output of modified/derived data, and the creation of publication-quality figures. ObsPy enjoys a large world-wide rate of adoption in the community. Applications successfully using it include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. All functionality is extensively documented, and the ObsPy tutorial and gallery give a good impression of the wide range of possible use cases. We will present the basic features of ObsPy, new developments and applications, and a roadmap for the near future, and discuss the sustainability of our open-source development model.
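    A flavor of the signal-processing step in such workflows can be sketched without ObsPy itself. The NumPy sketch below applies a crude FFT-mask bandpass to a synthetic trace; it is a simplified stand-in for the tapered IIR/FIR filters ObsPy applies through its Stream/Trace filtering interface, and nothing here is ObsPy API:

```python
import numpy as np

fs = 100.0                       # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic "seismogram": a 1 Hz signal plus a 20 Hz disturbance.
trace = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 20.0 * t)

def bandpass(signal, fs, fmin, fmax):
    """Crude FFT-mask bandpass: zero all spectral bins outside [fmin, fmax]."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    spec[(freqs < fmin) | (freqs > fmax)] = 0.0
    return np.fft.irfft(spec, n=signal.size)

# Keep 0.5-5 Hz; the 20 Hz disturbance is removed entirely.
filtered = bandpass(trace, fs, 0.5, 5.0)
```

In an actual ObsPy workflow the same step is one method call on a Stream object, with proper tapering to avoid the ringing that hard spectral masking can introduce on real data.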

  18. PEANO, a toolbox for real-time process signal validation and estimation

    International Nuclear Information System (INIS)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals such as the readings of process monitoring sensors, computes their expected values, and raises an alert if the actual values deviate from the expected ones by more than the allowed limits. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: Artificial Neural Networks and Fuzzy Logic models are combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a ''don't know'' classification that results in fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability are increased compared to the case of a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)
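    The ''don't know'' behavior of possibilistic (as opposed to probabilistic) clustering can be illustrated compactly: memberships are typicalities that need not sum to one, so an unforeseen operating point can score low against every cluster at once. A minimal Python sketch; the membership function and threshold are illustrative, not PEANO's actual algorithm:

```python
def possibilistic_memberships(x, centers, eta=1.0):
    """Possibilistic memberships: each value depends only on the distance
    to its own cluster center, so ALL memberships can be low at once."""
    return [1.0 / (1.0 + (x - c) ** 2 / eta) for c in centers]

def classify(x, centers, threshold=0.5):
    """Return the index of the best cluster, or "don't know" if even the
    best membership falls below the threshold (an unforeseen condition)."""
    mu = possibilistic_memberships(x, centers)
    best = max(range(len(mu)), key=lambda i: mu[i])
    return best if mu[best] >= threshold else "don't know"
```

With probabilistic memberships normalized to sum to one, a point far from every cluster would still be assigned confidently to the nearest one; the possibilistic form is what makes fast outlier detection possible.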

  19. PEANO, a toolbox for real-time process signal validation and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals such as the readings of process monitoring sensors, computes their expected values, and raises an alert if the actual values deviate from the expected ones by more than the allowed limits. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: Artificial Neural Networks and Fuzzy Logic models are combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a ''don't know'' classification that results in fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability are increased compared to the case of a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)

  20. Versatile Cas9-Driven Subpopulation Selection Toolbox for Lactococcus lactis.

    Science.gov (United States)

    van der Els, Simon; James, Jennelle K; Kleerebezem, Michiel; Bron, Peter A

    2018-04-15

    CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector pLABTarget, encoding the Streptococcus pyogenes Cas9 under the control of a constitutive promoter, was constructed, allowing plug-and-play introduction of short guide RNA (sgRNA) sequences to target specific genetic loci. Introduction of a pepN-targeting derivative of pLABTarget into L. lactis strain MG1363 led to a strong reduction in the number of transformants obtained, which did not occur in a pepN deletion derivative of the same strain, demonstrating the specificity and lethality of the Cas9-mediated double-strand breaks in the lactococcal chromosome. Moreover, the same pLABTarget derivative allowed the selection of a pepN deletion subpopulation from its corresponding single-crossover plasmid integrant precursor, accelerating the construction and selection of gene-specific deletion derivatives in L. lactis. Finally, pLABTarget, which contained sgRNAs designed to target mobile genetic elements, allowed the effective curing of plasmids, prophages, and integrative conjugative elements (ICEs). These results establish that pLABTarget enables the effective exploitation of Cas9 targeting in L. lactis, while the broad-host-range vector used suggests that this toolbox could readily be expanded to other Gram-positive bacteria. IMPORTANCE Mobile genetic elements in Lactococcus lactis and other lactic acid bacteria (LAB) play an important role in dairy fermentation, having both positive and detrimental effects during the production of fermented dairy products. The pLABTarget vector offers an efficient cloning platform for Cas9 application in lactic acid bacteria. Targeting Cas9 toward mobile genetic elements enabled their effective curing, which is of particular interest in the

  1. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)]

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues of exascale computing. In the first three years, the PI and his group have achieved significant progress towards this goal, producing a set of novel techniques for improving memory performance and data locality in manycore systems, yielding 18 conference and workshop papers, 4 journal papers, and 6 graduated Ph.D. students. This report summarizes the research results of the project through that period.
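    Loop tiling is a representative example of the data-locality transformations such work targets: a traversal is restructured so that a small block of data is reused while it is still hot in cache. A pure-Python sketch of a blocked matrix multiply on flat row-major lists (illustrative of the loop structure only; the project's actual tools operate on compiled simulation codes, where the cache effect is real):

```python
def tiled_matmul(a, b, n, tile=32):
    """Blocked (tiled) n x n matrix multiply, c = a @ b, on flat
    row-major lists. Each tile of `b` is reused across a whole tile
    of rows of `a` before the traversal moves on -- the data-locality
    pattern that tiling exists to create."""
    c = [0.0] * (n * n)
    for ii in range(0, n, tile):
        for kk in range(0, n, tile):
            for jj in range(0, n, tile):
                for i in range(ii, min(ii + tile, n)):
                    for k in range(kk, min(kk + tile, n)):
                        aik = a[i * n + k]
                        row = k * n
                        for j in range(jj, min(jj + tile, n)):
                            c[i * n + j] += aik * b[row + j]
    return c
```

The arithmetic is identical to the naive triple loop; only the iteration order changes, which is why such transformations can be applied automatically without altering program semantics.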

  2. The GEM Detector projective alignment simulation system

    International Nuclear Information System (INIS)

    Wuest, C.R.; Belser, F.C.; Holdener, F.R.; Roeben, M.D.; Paradiso, J.A.; Mitselmakher, G.; Ostapchuk, A.; Pier-Amory, J.

    1993-01-01

    Precision position knowledge (< 25 microns RMS) of the GEM Detector muon system at the Superconducting Super Collider Laboratory (SSCL) is an important physics requirement necessary to minimize sagitta error in detecting and tracking high energy muons that are deflected by the magnetic field within the GEM Detector. To validate the concept of the sagitta correction function determined by projective alignment of the muon detectors (Cathode Strip Chambers or CSCs), the basis of the proposed GEM alignment scheme, a facility, called the ''Alignment Test Stand'' (ATS), is being constructed. This system simulates the environment that the CSCs and chamber alignment systems are expected to experience in the GEM Detector, albeit without the 0.8 T magnetic field and radiation environment. The ATS experimental program will allow systematic study and characterization of the projective alignment approach, as well as general mechanical engineering of muon chamber mounting concepts, positioning systems and study of the mechanical behavior of the proposed 6 layer CSCs. The ATS will consist of a stable local coordinate system in which mock-ups of muon chambers (i.e., non-working mechanical analogs, representing the three superlayers of a selected barrel and endcap alignment tower) are implemented, together with a sufficient number of alignment monitors to overdetermine the sagitta correction function, providing a self-consistency check. This paper describes the approach to be used for the alignment of the GEM muon system, the design of the ATS, and the experiments to be conducted using the ATS

  3. Cognitive functioning in socially anxious adults: Insights from the NIH Toolbox Cognition Battery

    Directory of Open Access Journals (Sweden)

    Sonya Violet Troller-Renfree

    2015-06-01

    Theory suggests that individuals with social anxiety manifest unique patterns of cognition with less efficient fluid cognition and unperturbed crystallized cognition; however, empirical support for these ideas remains inconclusive. The heterogeneity of past findings may reflect unreliability in cognitive assessments or the influence of confounding variables. The present study examined the relations among social anxiety and performance on the reliable, newly established NIH Toolbox Cognition Battery. Results indicate that high socially anxious adults performed as well as low anxious participants on all measures of fluid cognition. However, highly socially anxious adults demonstrated enhanced crystallized cognitive abilities relative to a low socially anxious comparison group.

  4. Simulated bat populations erode when exposed to climate change projections for western North America.

    Directory of Open Access Journals (Sweden)

    Mark A Hayes

    Recent research has demonstrated that temperature and precipitation conditions correlate with successful reproduction in some insectivorous bat species that live in arid and semiarid regions, and that hot and dry conditions correlate with reduced lactation and reproductive output by females of some species. However, the potential long-term impacts of climate-induced reproductive declines on bat populations in western North America are not well understood. We combined results from long-term field monitoring and experiments in our study area with information on vital rates to develop stochastic age-structured population dynamics models and analyzed how simulated fringed myotis (Myotis thysanodes) populations changed under projected future climate conditions in our study area near Boulder, Colorado (Boulder Models) and throughout western North America (General Models). Each simulation consisted of an initial population of 2,000 females and an approximately stable age distribution at the beginning of the simulation. We allowed each population to be influenced by the mean annual temperature and annual precipitation for our study area, and by a generalized range-wide model, projected through year 2086 for each of four carbon emission scenarios (representative concentration pathways RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Each population simulation was repeated 10,000 times. Of the 8 Boulder Model simulations, 1 increased (+29.10%), 3 stayed approximately stable (+2.45%, +0.05%, -0.03%), and 4 decreased substantially (-44.10%, -44.70%, -44.95%, -78.85%). All General Model simulations for western North America decreased by >90% (-93.75%, -96.70%, -96.70%, -98.75%). These results suggest that a changing climate in western North America has the potential to quickly erode some forest bat populations, including species of conservation concern such as fringed myotis.
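    Stochastic age-structured projections of this kind follow a standard pattern: apply stage-specific survival rates and a randomly perturbed fecundity year by year. A minimal two-stage Python sketch with invented vital rates; these are not the fringed myotis rates or the climate covariates used in the study:

```python
import random

def project_population(n_years=50, n0=2000, seed=1,
                       s_juv=0.55, s_ad=0.80, base_fec=0.40):
    """Stochastic two-stage, female-only population projection.
    Annual fecundity is perturbed with Gaussian noise to mimic
    climate-driven variation in reproductive success."""
    rng = random.Random(seed)
    juv, adults = 0.0, float(n0)
    for _ in range(n_years):
        fec = max(0.0, rng.gauss(base_fec, 0.10))  # noisy fecundity
        births = adults * fec                      # from current adults
        adults = juv * s_juv + adults * s_ad       # survival transitions
        juv = births
    return juv + adults

final = project_population()
```

In the study's setting, the fecundity perturbation would be driven by the projected temperature and precipitation series rather than unconditioned noise, and each scenario is replicated thousands of times to get the distribution of outcomes.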

  5. Use of Parallel Micro-Platform for the Simulation the Space Exploration

    Science.gov (United States)

    Velasco Herrera, Victor Manuel; Velasco Herrera, Graciela; Rosano, Felipe Lara; Rodriguez Lozano, Salvador; Lucero Roldan Serrato, Karen

    The purpose of this work is to create a parallel micro-platform that simulates the movements of a space exploration vehicle in 3D. One of the innovations presented in this design consists of the application of a lever mechanism for the transmission of movement. The development of such a robot is a challenging task, very different from that of industrial manipulators due to a totally different set of target requirements. This work presents the computer-aided study and simulation of the movement of this parallel manipulator. The model has been developed using the computer-aided design platform Unigraphics, in which the geometric modeling of each of the components and of the final assembly was performed (CAD), the files for the computer-aided manufacture (CAM) of each of the pieces were generated, and the kinematics of the system were simulated, evaluating different driving schemes. We used the MATLAB aerospace toolbox and created an adaptive control module to simulate the system.

  6. Design and Simulation to Composite PI Controller on the Stratospheric Airship

    Directory of Open Access Journals (Sweden)

    Kangwen Sun

    2014-05-01

    In view of the requirements of the stratospheric airship application on its energy storage and management system, and based on the topology of the DC/DC converter main circuit, a composite PI controller is designed to realize separate control of the Boost mode and the Buck mode. Furthermore, a limit-stop integration method is proposed to achieve effective switching between boost and buck operation in the composite buck-boost DC/DC converter. Then, with the MATLAB Control System Toolbox, the composite PI controller is designed and simulated. With this simulation model, the structure and parameters of the controller can be easily adjusted. Finally, by using an averaged large-signal switching mathematical model to create a sub-circuit in place of the actual circuit model, the whole circuit model of the DC/DC converter is constructed in MATLAB; the analysis of the simulation results shows that this method shortens the simulation time and obtains better convergence to the target.
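    One common reading of the "limit stop integration" idea is conditional (clamped) integration: the integral term is frozen whenever the controller output is saturated and the error would push it further into the limit, preventing integrator wind-up. A minimal discrete-time Python sketch; the gains, limits and first-order plant below are illustrative, not the airship converter's values:

```python
class PIController:
    """Discrete PI controller with conditional (clamped) integration:
    the integrator is frozen when the output is saturated and the
    error would drive it further into saturation."""

    def __init__(self, kp, ki, dt, out_min, out_max):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def step(self, error):
        u = self.kp * error + self.ki * self.integral
        saturated_hi = u >= self.out_max and error > 0
        saturated_lo = u <= self.out_min and error < 0
        if not (saturated_hi or saturated_lo):
            self.integral += error * self.dt   # integrate only when safe
        return min(max(u, self.out_min), self.out_max)

# Drive a first-order plant x' = u - x toward a setpoint of 1.0.
pi = PIController(kp=1.0, ki=2.0, dt=0.01, out_min=0.0, out_max=2.0)
x = 0.0
for _ in range(2000):
    u = pi.step(1.0 - x)
    x += 0.01 * (u - x)
```

Without the clamp, a long saturation episode (e.g. during a mode switch) would let the integral grow unbounded and cause a large overshoot when the converter comes out of the limit.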

  7. PRANAS: A New Platform for Retinal Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Bruno Cessac

    2017-09-01

    The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe a new free-access user-end software package that allows this coding to be better understood. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator allowing large scale simulations while keeping a strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes into account both spatial and temporal correlations as constraints, allowing analysis of the effects of memory on statistics. PRANAS also integrates a tool for computing and representing in 3D (time-space) receptive fields. All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.

  8. Engineering and training simulators: A combined approach for nuclear plant construction projects

    International Nuclear Information System (INIS)

    Harnois, Olivier; Gain, Pascal; Bartak, Jan; Gathmann, Ralf

    2007-01-01

    Full text: Simulation technologies have always been widely used in nuclear applications, but with a clear division between engineering applications, using highly validated codes run in batch mode, and training purposes, where real-time computation is a mandatory requirement. Thanks to the flexibility of modern simulation technology and the increased performance of computers, it is now possible to develop nuclear power plant simulators that can be used for both engineering and training purposes. In recent years, the revival of the nuclear industry has given rise to a number of new construction or plant completion projects in which the application of this combined approach would result in decisive improvements in plant construction lead times, better project control, and cost optimization. The simulator development is to be executed in a step-wise approach, scheduled in parallel with the plant design and construction phases. During a first step, the simulator will model the plant nuclear island systems plus the corresponding instrumentation and control, specific malfunctions, and local commands. It can then be used for engineering activities defining and validating the plant operating strategies in case of incidents or accidents. The Simulator Executive Station and Operator Station will be in prototype versions, with an interface imagery enabling monitoring and control of the simulator. Availability of such a simulation platform leads to a significant increase in the efficiency of the engineering work, the possibility to validate basic design hypotheses, and the early detection of defects and conflicts. The second phase will consist of the fully detailed simulation of the Main Control Room plant supervision and control MMI, taking into account detailed design improvements of the I and C control loops, while having sufficient fidelity to be suitable for future operator training. Its use will enable the engineering units not only to specify and validate normal, incident and accident detailed plant

  9. Earth simulator project. Seeking a guide line for the symbiosis between the earth (Gaia) and human beings

    International Nuclear Information System (INIS)

    Tani, Keiji; Yokokawa, Mitsuo

    2000-01-01

    In 1997, a project was started by the Science and Technology Agency to promote foresight studies on changes in the global environment, and three studies have begun under the project: a basic process study, observations, and computer simulation. This report outlines the development of the 'earth simulator', an ultra high-speed computer, as a part of the project. Global warming due to human activities has been progressing on the earth. It is assumed that the mean temperature increase and sea level rise would reach 2°C and 50 cm, respectively, before 2100 if present environmental trends continue into the future. There are two inevitable problems in global environment simulation on a global scale: the mesh size of the simulator, and errors in the observation data used as initial values for the simulation. To accurately simulate the behavior of thunderclouds several kilometers in size, which are closely related to weather disasters, it is necessary to raise the resolution by decreasing the mesh size from 20-30 km to about 1 km, resulting in a several-hundred-fold increase in the number of mesh points. Compared with the CRAY C90, the representative computer in the climate and weather field, it is thought necessary to increase the memory capacity and calculation speed by more than one thousand times. Therefore, it is necessary to construct a new computer based on a parallel architecture, because a computer with an effective speed of 5 TFLOPS cannot be built with a single processor. (M.N.)

  10. Reliability and Validity of Composite Scores from the NIH Toolbox Cognition Battery in Adults

    Science.gov (United States)

    Heaton, Robert K.; Akshoomoff, Natacha; Tulsky, David; Mungas, Dan; Weintraub, Sandra; Dikmen, Sureyya; Beaumont, Jennifer; Casaletto, Kaitlin B.; Conway, Kevin; Slotkin, Jerry; Gershon, Richard

    2014-01-01

    This study describes psychometric properties of the NIH Toolbox Cognition Battery (NIHTB-CB) Composite Scores in an adult sample. The NIHTB-CB was designed for use in epidemiologic studies and clinical trials for ages 3 to 85. A total of 268 self-described healthy adults were recruited at four university-based sites, using stratified sampling guidelines to target demographic variability for age (20–85 years), gender, education, and ethnicity. The NIHTB-CB contains seven computer-based instruments assessing five cognitive sub-domains: Language, Executive Function, Episodic Memory, Processing Speed, and Working Memory. Participants completed the NIHTB-CB, corresponding gold standard validation measures selected to tap the same cognitive abilities, and sociodemographic questionnaires. Three Composite Scores were derived for both the NIHTB-CB and gold standard batteries: “Crystallized Cognition Composite,” “Fluid Cognition Composite,” and “Total Cognition Composite” scores. NIHTB Composite Scores showed acceptable internal consistency (Cronbach’s alphas = 0.84 Crystallized, 0.83 Fluid, 0.77 Total), excellent test–retest reliability (r: 0.86–0.92), strong convergent (r: 0.78–0.90) and discriminant (r: 0.19–0.39) validities versus gold standard composites, and expected age effects (r = 0.18 crystallized, r = − 0.68 fluid, r = − 0.26 total). Significant relationships with self-reported prior school difficulties and current health status, employment, and presence of a disability provided evidence of external validity. The NIH Toolbox Cognition Battery Composite Scores have excellent reliability and validity, suggesting they can be used effectively in epidemiologic and clinical studies. PMID:24960398
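    The internal-consistency figures quoted above (Cronbach's alpha) come from a simple formula over item variances and the variance of the total score. A small Python sketch of the computation; the example data are made up, and the study's composites involve additional standardization not shown here:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (items x subjects):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two identical items: perfect internal consistency, alpha = 1.
alpha_identical = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
```

Values of 0.77 to 0.84, as reported for the NIHTB-CB composites, indicate that the items of each composite largely covary, i.e. measure a common construct.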

  11. Simulation technology used for risk assessment in the deep exploration project in China

    Science.gov (United States)

    Jiao, J.; Huang, D.; Liu, J.

    2013-12-01

    Deep exploration has been carried out in China for five years, in which various heavy-duty instruments and equipment are employed for gravity, magnetic, seismic and electromagnetic prospecting, and an ultra-deep drilling rig has been established for obtaining deep samples. Deep exploration is a large and complex systems engineering effort crossing multiple disciplines and requiring great investment. It is necessary to employ advanced technical means for the verification, appraisal, and optimization of prospecting equipment development. To reduce the risks of application and exploration, efficient management concepts and skills have to be enhanced in order to consolidate management measures and workflows to the benefit of this ambitious project. Therefore, evidence, prediction, evaluation and related decision strategies have to be considered simultaneously to meet practical scientific requests, technical limits and extendable attempts. Simulation is then proposed as a tool that can be used to carry out dynamic tests on actual or imagined systems. In practice, it is necessary to combine the simulation technique with the instruments and equipment to accomplish R&D tasks. In this paper, simulation techniques are introduced into the R&D process of heavy-duty equipment and high-end engineering project technology. Based on information recently provided by a drilling group, a digital model is constructed by combining geographical data, 3D visualization, database management, and virtual reality technologies. This supports an R&D strategy in which data processing, instrument application, expected results and uncertainties, and even the effects of the operational environment on the workflow are simulated systematically or simultaneously, in order to obtain an optimal outcome as well as an equipment updating strategy.
    The simulation technology is able to adjust, verify, appraise and optimize the primary plan in response to changes in

  12. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects

    Directory of Open Access Journals (Sweden)

    Zhenyu Shi

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com. Keywords: BioCAD, Genetic engineering software, Molecular cloning software, Synthetic biology, Workflow simulation and management

  13. AgrAbility Project

    Science.gov (United States)


  14. Using an ensemble of climate projections for simulating recent and near-future hydrological change to lake Vaenern in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Olsson, Jonas; Yang, Wei; Graham, L. Phil; Rosberg, Joergen; Andreasson, Johan (Swedish Meteorological and Hydrological Inst., Norrkoeping (Sweden)), e-mail: jonas.olsson@smhi.se

    2011-01-15

    Lake Vaenern and River Goeta aelv in southern Sweden constitute a large and complex hydrological system that is highly vulnerable to climate change. In this study, an ensemble of 12 regional climate projections is used to simulate the inflow to Lake Vaenern by the HBV hydrological model. By using distribution based scaling of the climate model output, all projections can accurately reproduce the annual cycle of mean monthly inflows for the period 1961-1990 as simulated using HBV with observed temperature and precipitation ('HBVobs'). Significant changes towards higher winter inflow and a reduced spring flood were found when comparing the period 1991-2008 to 1961-1990 in the HBVobs simulations and the ability of the regional projections to reproduce these changes varied. The main uncertainties in the projections for 1991-2008 were found to originate from the global climate model used, including its initialization, and in one case, the emissions scenario, whereas the regional climate model used and its resolution showed a smaller influence. The projections that most accurately reproduce the recent change suggest that the current trends in the winter and spring inflows will continue over the period 2009-2030
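    Distribution-based scaling rests on quantile mapping: a model value is replaced by the observed value at the same quantile of its distribution, which corrects the model's bias across the whole distribution rather than only in the mean. A minimal empirical Python sketch; the DBS method referenced in the study fits parametric distributions to temperature and precipitation rather than using this raw empirical form:

```python
import bisect

def quantile_map(value, model_sorted, obs_sorted):
    """Empirical quantile mapping: find the quantile of `value` in the
    sorted model sample, then return the observed value at the same
    quantile. Both inputs must be sorted lists."""
    q = bisect.bisect_left(model_sorted, value) / max(len(model_sorted) - 1, 1)
    idx = min(int(round(q * (len(obs_sorted) - 1))), len(obs_sorted) - 1)
    return obs_sorted[idx]

# A model sample with a constant +2 bias is mapped back onto the
# observed distribution, removing the bias at every quantile.
model = [float(x + 2) for x in range(100)]
obs = [float(x) for x in range(100)]
corrected = quantile_map(52.0, model, obs)
```

Applying such a mapping to the climate model output before feeding it to the HBV model is what allows all twelve projections to reproduce the observed annual inflow cycle for 1961-1990.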

  15. Watershed Modeling Applications with the Open-Access Modular Distributed Watershed Educational Toolbox (MOD-WET) and Introductory Hydrology Textbook

    Science.gov (United States)

    Huning, L. S.; Margulis, S. A.

    2014-12-01

    Traditionally, introductory hydrology courses focus on hydrologic processes as independent or semi-independent concepts that are ultimately integrated into a watershed model near the end of the term. When an "off-the-shelf" watershed model is introduced in the curriculum, this approach can result in a potential disconnect between process-based hydrology and the inherent interconnectivity of processes within the water cycle. In order to curb this and reduce the learning curve associated with applying hydrologic concepts to complex real-world problems, we developed the open-access Modular Distributed Watershed Educational Toolbox (MOD-WET). The user-friendly, MATLAB-based toolbox contains the same physical equations for hydrological processes (i.e. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) that are presented in the companion e-textbook (http://aqua.seas.ucla.edu/margulis_intro_to_hydro_textbook.html) and taught in the classroom. The modular toolbox functions can be used by students to study individual hydrologic processes. These functions are integrated together to form a simple spatially-distributed watershed model, which reinforces a holistic understanding of how hydrologic processes are interconnected and modeled. Therefore when watershed modeling is introduced, students are already familiar with the fundamental building blocks that have been unified in the MOD-WET model. Extensive effort has been placed on the development of a highly modular and well-documented code that can be run on a personal computer within the commonly-used MATLAB environment. 
MOD-WET was designed to: 1) increase the qualitative and quantitative understanding of hydrological processes at the basin-scale and demonstrate how they vary with watershed properties, 2) emphasize applications of hydrologic concepts rather than computer programming, 3) elucidate the underlying physical processes that can often be obscured with a complicated
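    As one illustration of the kind of modular process function such a toolbox exposes, here is a sketch of the classic degree-day snowmelt method (MOD-WET itself is MATLAB-based; the function name `degree_day_melt` and its default parameter values are illustrative, not taken from the toolbox):

```python
def degree_day_melt(temps_c, ddf=3.0, t_base=0.0):
    """Degree-day snowmelt: daily melt (mm) proportional to degrees
    above a base temperature; ddf is the degree-day factor in mm per
    deg C per day (illustrative defaults, not MOD-WET's)."""
    return [max(t - t_base, 0.0) * ddf for t in temps_c]
```

    A student can experiment with one process in isolation like this, then see the same equation reused inside the integrated watershed model.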

  16. Fusion Simulation Project. Workshop sponsored by the U.S. Department of Energy Rockville, MD, May 16-18, 2007

    International Nuclear Information System (INIS)

    2007-01-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel (Journal of Fusion Energy 20, 135 (2001)) recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts (Journal of Fusion Energy 23, 1 (2004)). The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  20. Start To End Simulation for the SPARX Project

    CERN Document Server

    Vaccarezza, Cristina; Boscolo, Manuela; Ferrario, Massimo; Fusco, Valeria; Giannessi, Luca; Migliorati, Mauro; Palumbo, Luigi; Quattromini, Marcello; Ronsivalle, Concetta; Serafini, Luca; Spataro, Bruno; Vescovi, Mario

    2005-01-01

    The first phase of the SPARX project, now funded by Government Agencies, is an R&D activity focused on developing techniques and critical components for future X-ray facilities. The aim is the generation of electron beams with the ultra-high peak brightness required to drive FEL experiments. The FEL source realization will develop along two lines: (a) the use of the SPARC high-brightness photoinjector to test RF compression techniques and the emittance degradation in magnetic compressors due to CSR, and (b) the production of radiation in the range of 3-5 nm, in both SASE and SEEDED FEL configurations, in the so-called SPARXINO test facility, upgrading the existing Frascati 800 MeV LINAC. In this paper we present and discuss the preliminary start-to-end simulation results.

  1. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. 
We anticipate more of them, as the results of global warming

  2. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-prong approach comprised of a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  3. Selecting the Most Economic Project under Uncertainty Using Bootstrap Technique and Fuzzy Simulation

    Directory of Open Access Journals (Sweden)

    Kamran Shahanaghi

    2012-01-01

    Full Text Available This article sets aside the pre-determined membership function of a fuzzy set, a basic assumption in such studies, and proposes a hybrid technique to select the most economic project among alternative projects under fuzzy interest-rate conditions. Net present worth (NPW) serves as the economic indicator. The article challenges the assumption that large sample sizes are available for membership-function determination and shows that alternative techniques may be less accurate. To give a robust solution, bootstrapping and fuzzy simulation are suggested, and a numerical example is given and analyzed.
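    The core idea, resampling the observed interest rates to propagate their uncertainty into the NPW, can be sketched as follows. This is a hypothetical simplification: the article combines bootstrapping with fuzzy simulation, while the sketch shows only the bootstrap step, and the function names are illustrative.

```python
import random
import statistics

def npw(cash_flows, rate):
    """Net present worth of cash flows indexed from year 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def bootstrap_npw(cash_flows, rate_sample, n_boot=200, seed=1):
    """Bootstrap the observed interest rates: each replicate resamples
    the rates with replacement and prices the project at their mean."""
    rng = random.Random(seed)
    dist = []
    for _ in range(n_boot):
        resample = rng.choices(rate_sample, k=len(rate_sample))
        dist.append(npw(cash_flows, statistics.mean(resample)))
    return dist
```

    The resulting empirical NPW distribution can then be compared across alternative projects without assuming a particular membership function up front.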

  4. Development tool for PHP programs

    OpenAIRE

    Karlsen, Håkon Skaarud

    2005-01-01

    This paper discusses how a PHP development toolbox can be implemented. One toolbox has been implemented, and the implementation is described and documented in the text. The toolbox is primarily meant to help students who are taking a System Development course (INF1050) at the University of Oslo with the implementation phase of a software engineering project, but other PHP programmers may also benefit from using the toolbox. It has been emphasized that the programming interface should be i...

  5. Sonification of simulations in computational physics

    International Nuclear Information System (INIS)

    Vogt, K.

    2010-01-01

    Sonification is the translation of information for auditory perception, excluding speech itself. The cognitive performance of pattern recognition is striking for sound, and has too long been disregarded by the scientific mainstream. Examples of 'spontaneous sonification' and systematic research for about 20 years have proven that sonification provides a valuable tool for the exploration of scientific data. The data in this thesis stem from computational physics, where numerical simulations are applied to problems in physics. Prominent examples are spin models and lattice quantum field theories. The corresponding data lend themselves very well to innovative display methods: they are structured on discrete lattices, often stochastic, high-dimensional and abstract, and they provide huge amounts of data. Furthermore, they have no inherently perceptual dimension. When designing the sonification of simulation data, one has to make decisions on three levels, both for the data and the sound model: the level of meaning (phenomenological; metaphoric); of structure (in time and space), and of elements ('display units' vs. 'gestalt units'). The design usually proceeds as a bottom-up or top-down process. This thesis provides a 'toolbox' for helping in these decisions. It describes tools that have proven particularly useful in the context of simulation data. An explicit method of top-down sonification design is the metaphoric sonification method, which is based on expert interviews. Furthermore, qualitative and quantitative evaluation methods are presented, on the basis of which a set of evaluation criteria is proposed. The translation between a scientific and the sound synthesis domain is elucidated by a sonification operator. For this formalization, a collection of notation modules is provided. Showcases are discussed in detail that have been developed in the interdisciplinary research projects SonEnvir and QCD-audio, during the second Science By Ear workshop and during a
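    A minimal example of a bottom-up 'display unit' design is parameter-mapping sonification, in which each data value is mapped onto a perceptual dimension such as pitch. The function `pitch_map` and its frequency range are illustrative, not taken from the thesis:

```python
def pitch_map(values, f_lo=220.0, f_hi=880.0):
    """Parameter-mapping sonification: map each data value linearly
    onto a frequency range (here two octaves starting at 220 Hz)."""
    v_min, v_max = min(values), max(values)
    span = (v_max - v_min) or 1.0  # avoid division by zero for flat data
    return [f_lo + (v - v_min) / span * (f_hi - f_lo) for v in values]
```

    Feeding the resulting frequencies to any synthesizer turns a lattice observable's time series into an audible contour.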

  6. Software development infrastructure for the HYBRID modeling and simulation project

    International Nuclear Information System (INIS)

    Epiney, Aaron S.; Kinoshita, Robert A.; Kim, Jong Suk; Rabiti, Cristian; Greenwood, M. Scott

    2016-01-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure is trying to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety “research and development” software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as the application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source python script called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list. This is a list to which everybody can send emails that will be received by the collective of the developers and managers.
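    The automated regression testing described above boils down to comparing a fresh simulation trajectory against a stored reference within a tolerance. A hypothetical, stdlib-only sketch of that comparison step (BuildingsPy itself drives Dymola and manages the reference result files; the function name is illustrative):

```python
def check_against_reference(result, reference, rtol=1e-3):
    """Return True if every simulated point matches the stored
    reference trajectory within a relative tolerance."""
    if len(result) != len(reference):
        return False
    return all(abs(a - b) <= rtol * max(abs(b), 1.0)
               for a, b in zip(result, reference))
```

    A continuous-integration job would run such a check for every contributed model and fail the build on any mismatch.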

  8. Retrieval process development and enhancements project Fiscal year 1995: Simulant development technology task progress report

    International Nuclear Information System (INIS)

    Golcar, G.R.; Bontha, J.R.; Darab, J.G.

    1997-01-01

    The mission of the Retrieval Process Development and Enhancements (RPD&E) project is to develop an understanding of retrieval processes, including emerging and existing technologies, gather data on these technologies, and relate the data to specific tank problems such that end-users have the requisite technical bases to make retrieval and closure decisions. The development of waste simulants is an integral part of this effort. The work of the RPD&E simulant-development task is described in this document. The key FY95 accomplishments of the RPD&E simulant-development task are summarized below.

  9. Modeling and Simulation Optimization and Feasibility Studies for the Neutron Detection without Helium-3 Project

    Energy Technology Data Exchange (ETDEWEB)

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.; Lintereur, Azaree T.

    2013-01-01

    This report details the results of the modeling and simulation work accomplished for the ‘Neutron Detection without Helium-3’ project during the 2011 and 2012 fiscal years. The primary focus of the project is to investigate commercially available technologies that might be used in safeguards applications in the relatively near term. Other technologies that are being developed may be more applicable in the future, but are outside the scope of this study.

  10. LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.

    Science.gov (United States)

    Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A

    2011-01-01

    Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.
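    The hierarchical idea, first-level regressions fit to single-trial data followed by a second-level test on the resulting parameter estimates, can be sketched as follows. This toy uses a single regressor and an ordinary one-sample t statistic, whereas LIMO EEG fits mass-univariate models over all channels and time points with robust statistics; the function names are illustrative:

```python
import math
import statistics

def slope(x, y):
    """Level 1: ordinary least-squares slope of y on a single regressor x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return num / sum((a - mx) ** 2 for a in x)

def one_sample_t(betas, mu0=0.0):
    """Level 2: t statistic testing the subject-level slopes against mu0."""
    n = len(betas)
    return (statistics.mean(betas) - mu0) / (statistics.stdev(betas) / math.sqrt(n))
```

    In the full toolbox the same two-level logic is repeated independently at every electrode and time point, which is what makes whole-scalp, whole-epoch inference possible.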

  11. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    Science.gov (United States)

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to the hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
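    To illustrate the flavor of these hydrograph-separation methods: the Local Minimum variant of HYSEP builds the base-flow hydrograph from local minima of the streamflow record. The sketch below is a cruder windowed minimum, not the USGS algorithm (which interpolates between minima over an interval derived from drainage area); the function name is illustrative.

```python
def local_minimum_baseflow(q, halfwidth=2):
    """Windowed-minimum hydrograph separation: base flow on each day
    is the lowest streamflow within a centred window, so it can never
    exceed the streamflow itself (a simplification of HYSEP's Local
    Minimum method, which interpolates between the minima)."""
    n = len(q)
    return [min(q[max(0, i - halfwidth):min(n, i + halfwidth + 1)])
            for i in range(n)]
```

    Subtracting the base-flow series from total streamflow then yields the surface-runoff component, the same budget split the Toolbox reports.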

  12. Causal factors of low stakeholder engagement : A survey of expert opinions in the context of healthcare simulation projects

    NARCIS (Netherlands)

    Jahangirian, Mohsen; Borsci, Simone; Shah, Syed Ghulam Sarwar; Taylor, Simon J.E.

    2015-01-01

    While simulation methods have proved to be very effective in identifying efficiency gains, low stakeholder engagement creates a significant limitation on the achievement of simulation modeling projects in practice. This study reports causal factors—at two hierarchical levels (i.e., primary and

  13. Coherent tools for physics-based simulation and characterization of noise in semiconductor devices oriented to nonlinear microwave circuit CAD

    Science.gov (United States)

    Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan

    2004-05-01

    We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low-frequency noise measurement setup with special high-current capabilities thanks to an accurate and original calibration. It also relies on a simulation tool based on the drift-diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.

  14. Quantum simulation from the bottom up: the case of rebits

    Science.gov (United States)

    Enshan Koh, Dax; Yuezhen Niu, Murphy; Yoder, Theodore J.

    2018-05-01

    Typically, quantum mechanics is thought of as a linear theory with unitary evolution governed by the Schrödinger equation. While this is technically true and useful for a physicist, with regards to computation it is an unfortunately narrow point of view. Just as a classical computer can simulate highly nonlinear functions of classical states, so too can the more general quantum computer simulate nonlinear evolutions of quantum states. We detail one particular simulation of nonlinearity on a quantum computer, showing how the entire class of ℝ-unitary evolutions (on n qubits) can be simulated using a unitary, real-amplitude quantum computer (consisting of n  +  1 qubits in total). These operators can be represented as the sum of a linear and antilinear operator, and add an intriguing new set of nonlinear quantum gates to the toolbox of the quantum algorithm designer. Furthermore, a subgroup of these nonlinear evolutions, called the ℝ-Cliffords, can be efficiently classically simulated, by making use of the fact that Clifford operators can simulate non-Clifford (in fact, nonlinear) operators. This perspective of using the physical operators that we have to simulate non-physical ones that we do not is what we call bottom-up simulation, and we give some examples of its broader implications.
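    The standard rebit encoding behind such constructions stores the real and imaginary parts of each complex amplitude in an extra qubit, so an n-qubit state becomes a real-amplitude (n+1)-qubit state. A minimal sketch of the state-vector bookkeeping (the function name is illustrative; the paper works at the level of operators rather than state vectors):

```python
def encode_rebit(amplitudes):
    """Store the complex amplitudes (a_j + i*b_j) of an n-qubit state
    as real amplitudes of an (n+1)-qubit state: the real part goes
    with ancilla |0> and the imaginary part with ancilla |1>.
    The 2-norm of the state is preserved."""
    real_state = []
    for c in amplitudes:
        real_state.extend([c.real, c.imag])
    return real_state
```

    Because the norm is preserved, any unitary acting on the enlarged real state corresponds to a physically realizable real-amplitude evolution.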

  15. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    International Nuclear Information System (INIS)

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is ASPEN PLUS. This report contains details of the ASPEN PLUS program.

  16. FracPaQ: a MATLAB™ toolbox for the quantification of fracture patterns

    Science.gov (United States)

    Healy, David; Rizzo, Roberto; Farrell, Natalie; Watkins, Hannah; Cornwell, David; Gomez-Rivas, Enrique; Timms, Nick

    2017-04-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying crack and fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The methods presented are inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. New features in this release include multi-scale analyses based on a wavelet method to look for scale transitions, support for multi-colour traces in the input file processed as separate fracture sets, and combining fracture traces
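    The basic attribute calculations behind such a toolbox, trace lengths, orientations, and areal intensity (P21, total trace length per unit area), reduce to simple geometry on digitized segment endpoints. A stdlib-only sketch (FracPaQ itself is MATLAB-based; `trace_stats` is a hypothetical name):

```python
import math

def trace_stats(segments, area):
    """Per-trace lengths, orientations (degrees from the x-axis,
    folded to 0-180), and the 2-D fracture intensity P21 = total
    trace length per unit area, for straight traces given as
    ((x1, y1), (x2, y2)) endpoint pairs."""
    lengths, angles = [], []
    for (x1, y1), (x2, y2) in segments:
        lengths.append(math.hypot(x2 - x1, y2 - y1))
        angles.append(math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0)
    return lengths, angles, sum(lengths) / area
```

    Because the inputs are just 2-D coordinates, the same computation applies unchanged from thin-section micrographs to satellite images, which is what makes the approach scale independent.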

  17. ObsPy: A Python toolbox for seismology - Sustainability, New Features, and Applications

    Science.gov (United States)

    Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; MacCarthy, J.

    2016-12-01

    ObsPy (https://www.obspy.org) is a community-driven, open-source project dedicated to offering a bridge for seismology into the scientific Python ecosystem. Amongst other things, it provides: read and write support for essentially every commonly used data format in seismology through a unified interface, covering waveform data as well as station and event meta information; a signal processing toolbox tuned to the specific needs of seismologists; integrated access to the largest data centers, web services, and databases; and wrappers around third-party codes like libmseed and evalresp. Using ObsPy enables users to take advantage of the vast scientific ecosystem that has developed around Python. In contrast to many other programming languages and tools, Python is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code often must be translated to stable and production ready environments, especially in the age of big data. ObsPy has seen constant development for more than six years and enjoys a large rate of adoption in the seismological community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. Additionally it sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. We will present a short overview of the capabilities of ObsPy and point out several representative use cases and more specialized software built around ObsPy. Additionally we will discuss new and upcoming features, as well as the sustainability of open-source scientific software.

  18. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB.

    Science.gov (United States)

    Sinha, Shriprakash

    2016-12-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments and is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader through a step-by-step process of how (a) the collection and transformation of the available biological information from the literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) a hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help readers get hands-on experience of a perturbation project. Description of the Matlab files is made available under the GNU GPL v3 license at the Google Code project https://code.google.com/p/static-bn-for-wnt-signaling-pathway and at https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found at the latter website.
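The d-connectivity/separability test used for inference in such static causal models can be sketched in pure Python via the standard moralized-ancestral-graph criterion; the three-node chain and its node names below are invented for illustration and are not the paper's actual network:

```python
from collections import defaultdict, deque

def d_separated(parents, x, y, given):
    """True iff x and y are d-separated given `given` in the DAG described
    by `parents` (node -> list of parent nodes), using the standard
    ancestral-graph + moralization criterion."""
    # 1. ancestral set of {x, y} union given
    anc, stack = set(), [x, y, *given]
    while stack:
        n = stack.pop()
        if n not in anc:
            anc.add(n)
            stack.extend(parents.get(n, ()))
    # 2. moralize the induced subgraph: child-parent and parent-parent edges
    adj = defaultdict(set)
    for child in anc:
        ps = [p for p in parents.get(child, ()) if p in anc]
        for p in ps:
            adj[child].add(p); adj[p].add(child)
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                adj[p].add(q); adj[q].add(p)
    # 3. delete conditioning nodes and search for any remaining x-y path
    blocked = set(given)
    seen, queue = {x}, deque([x])
    while queue:
        n = queue.popleft()
        if n == y:
            return False  # still connected, so not d-separated
        for m in adj[n]:
            if m not in seen and m not in blocked:
                seen.add(m); queue.append(m)
    return True

# illustrative (not from the paper) signalling-chain fragment: WNT -> BCAT -> TCF
parents = {"TCF": ["BCAT"], "BCAT": ["WNT"]}
print(d_separated(parents, "WNT", "TCF", {"BCAT"}))  # True: chain blocked by its middle node
```

Conditioning on a common effect instead (a collider) would open the path, which the moralization step captures by linking the co-parents.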

  19. Simulation studies for a high resolution time projection chamber at the international linear collider

    Energy Technology Data Exchange (ETDEWEB)

    Muennich, A.

    2007-03-26

    The International Linear Collider (ILC) is planned to be the next large accelerator. The ILC will be able to perform high-precision measurements only possible in the clean environment of electron-positron collisions. In order to reach this high accuracy, the requirements for the detector performance are challenging. Several detector concepts are currently under study. Understanding the detector and its performance will be crucial to extract the desired physics results from the data. To optimise the detector design, simulation studies are needed. Simulation packages like GEANT4 make it possible to model the detector geometry and simulate the energy deposited in the different materials. However, the detector response, taking into account the transport of the produced charge to the readout devices and the effects of the readout electronics, cannot be described in detail. These processes in the detector will change the measured position of the energy deposit relative to its point of origin. The determination of this detector response is the task of detailed simulation studies, which have to be carried out for each subdetector. A high-resolution Time Projection Chamber (TPC) with gas amplification based on micro-pattern gas detectors is one of the options for the main tracking system at the ILC. In the present thesis a detailed simulation tool to study the performance of a TPC was developed. Its goal is to find the optimal settings to reach an excellent momentum and spatial resolution. After an introduction to the present status of particle physics and the ILC project, with special focus on the TPC as central tracker, the simulation framework is presented. The basic simulation methods and implemented processes are introduced. Within this stand-alone simulation framework each electron produced by primary ionisation is transported through the gas volume and amplified using Gas Electron Multipliers (GEMs). 
The output format of the simulation is identical to the raw data from a

  20. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    Science.gov (United States)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the fidelity of the simulated cloud amount and cloud radiative effects (CREs) in the historical runs driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy of the representation of spatial patterns for the climatological mean, and of annual and interannual variations of clouds and CREs. The models show large spread in the simulation of cloud amounts, specifically in the low cloud amount. The observed relationship between cloud amount and the controlling large-scale environment is also reproduced diversely by the various models. Based on the validation metrics, four models (ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES) are selected as best models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies across the simulations between 1 and 2 K, which is largely attributable to differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over marine-boundary-layer regions in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K^-1 and net radiative warming of 0.46 W m^-2 K^-1, suggesting a positive feedback to global warming.

  1. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation

    NARCIS (Netherlands)

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Background: Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust

  2. Survey on Projects at DLR Simulation and Software Technology with Focus on Software Engineering and HPC

    OpenAIRE

    Schreiber, Andreas; Basermann, Achim

    2013-01-01

    We introduce the DLR institute “Simulation and Software Technology” (SC) and present current activities regarding software engineering and high performance computing (HPC) in German or international projects. Software engineering at SC focusses on data and knowledge management as well as tools for studies and experiments. We discuss how we apply software configuration management, validation and verification in our projects. Concrete research topics are traceability of (software devel...

  3. Open Babel: An open chemical toolbox

    Directory of Open Access Journals (Sweden)

    O'Boyle Noel M

    2011-10-01

    Full Text Available Abstract Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org.

  4. Open Babel: An open chemical toolbox

    Science.gov (United States)

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
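Interconverting over 110 formats is tractable because each format reads into, and writes from, a single internal molecule representation, so N formats need roughly 2N codecs instead of N*(N-1) pairwise converters. The toy registry below sketches that hub-and-spoke pattern in pure Python; the two "formats" and the dict-based molecule are invented for illustration and have nothing to do with Open Babel's actual C++ API:

```python
# hub-and-spoke conversion: every format converts to/from one internal model
readers, writers = {}, {}

def register(fmt, reader, writer):
    readers[fmt], writers[fmt] = reader, writer

def convert(data, from_fmt, to_fmt):
    # read into the internal representation, then write out in the target format
    return writers[to_fmt](readers[from_fmt](data))

# toy "formats": a comma-separated atom list and a counted formula-like string
register("atoms",
         lambda s: {"atoms": s.split(",")},
         lambda m: ",".join(m["atoms"]))
register("counts",
         lambda s: {"atoms": [a for a, n in (p.split(":") for p in s.split())
                              for _ in range(int(n))]},
         lambda m: " ".join(f"{a}:{m['atoms'].count(a)}"
                            for a in sorted(set(m["atoms"]))))

print(convert("C,C,O,H", "atoms", "counts"))  # C:2 H:1 O:1
```

Adding a new format to such a design means registering one reader/writer pair, after which conversion to and from every other registered format comes for free.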

  5. FAST: FAST Analysis of Sequences Toolbox

    Directory of Open Access Journals (Sweden)

    Travis J. Lawrence

    2015-05-01

    Full Text Available FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU’s Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics makes FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of the BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought.
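The fasgrep-style selection of records by header content can be sketched in pure Python; the mini FastA parser and filter below are an illustrative stand-in, not FAST's Perl/BioPerl implementation:

```python
import io
import re

def read_fasta(handle):
    """Yield (header, sequence) records from a Multi-FastA stream."""
    header, chunks = None, []
    for line in handle:
        line = line.rstrip()
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:], []
        elif line:
            chunks.append(line)
    if header is not None:
        yield header, "".join(chunks)

def fasgrep_like(records, pattern):
    """Keep records whose header matches a regex (cf. fasgrep on descriptions)."""
    rx = re.compile(pattern)
    return [(h, s) for h, s in records if rx.search(h)]

data = io.StringIO(">seq1 human\nACGT\n>seq2 yeast\nGGCC\n")
hits = fasgrep_like(read_fasta(data), "human")
print(hits)  # [('seq1 human', 'ACGT')]
```

Composing several such small, single-purpose filters in a pipeline is exactly the Textutils-style workflow the toolbox encourages.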

  6. CELES: CUDA-accelerated simulation of electromagnetic scattering by large ensembles of spheres

    Science.gov (United States)

    Egel, Amos; Pattelli, Lorenzo; Mazzamuto, Giacomo; Wiersma, Diederik S.; Lemmer, Uli

    2017-09-01

    CELES is a freely available MATLAB toolbox to simulate light scattering by many spherical particles. Aiming at high computational performance, CELES leverages block-diagonal preconditioning, a lookup-table approach to evaluate costly functions, and massively parallel execution on NVIDIA graphics processing units using the CUDA computing platform. The combination of these techniques allows large electrodynamic problems (>10^4 scatterers) to be addressed efficiently on inexpensive consumer hardware. In this paper, we validate near- and far-field distributions against the well-established multi-sphere T-matrix (MSTM) code and discuss the convergence behavior for ensembles of different sizes, including an exemplary system comprising 10^5 particles.
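The lookup-table trick, tabulating a costly function on a grid once and interpolating at run time, can be sketched in pure Python; the j0-like test function and grid size are arbitrary stand-ins for the special functions CELES tabulates on the GPU:

```python
import bisect
import math

def make_lut(f, lo, hi, n):
    """Tabulate f at n points on [lo, hi]; return a linear interpolator
    that replaces repeated costly evaluations with table lookups."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    def approx(x):
        i = min(max(bisect.bisect_left(xs, x), 1), n - 1)
        x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return approx

# cache a "costly" spherical-Bessel-like function j0(x) = sin(x)/x
j0 = make_lut(lambda x: math.sin(x) / x, 0.1, 10.0, 2001)
err = abs(j0(2.5) - math.sin(2.5) / 2.5)
print(err < 1e-5)  # True: dense enough grid keeps interpolation error tiny
```

The accuracy/memory trade-off is set by the grid density; for linear interpolation the error shrinks quadratically with the grid spacing.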

  7. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    Science.gov (United States)

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables that accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  8. Simulant composition for the Mixed Waste Management Facility (MWMF) groundwater remediation project

    International Nuclear Information System (INIS)

    Siler, J.L.

    1992-01-01

    A project has been initiated at the request of ER to study and remediate the groundwater contamination at the Mixed Waste Management Facility (MWMF). This water contains a wide variety of both inorganics (e.g., sodium) and organics (e.g., benzene, trichloroethylene). Most compounds are present in the ppb range, and certain components (e.g., trichloroethylene, silver) are present at concentrations that exceed the primary drinking water standards (PDWS). These compounds must be reduced to acceptable levels as per RCRA and CERCLA orders. This report gives a listing of the important constituents to be included in a simulant modeling the MWMF aquifer. This simulant will be used to evaluate the feasibility of various state-of-the-art separation/destruction processes for remediating the aquifer.

  9. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    Science.gov (United States)

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.
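The interactive flow chart of dependent cloning steps is, in effect, a directed acyclic graph, and executing a user-defined workflow in a single run amounts to a topological ordering of that graph. A sketch using Python's standard library (`graphlib`, Python 3.9+); the step names are invented for illustration and are not MCDS terminology:

```python
from graphlib import TopologicalSorter

# a made-up multi-step cloning workflow: each step lists its prerequisites
steps = {
    "digest_vector": [],
    "pcr_insert": [],
    "ligate": ["digest_vector", "pcr_insert"],
    "transform": ["ligate"],
    "verify_sequence": ["transform"],
}

# derive an execution order in which every step follows its prerequisites
order = list(TopologicalSorter(steps).static_order())
print(order)
```

Running each step in this order guarantees, for example, that ligation never executes before both the digested vector and the PCR insert are available, which is what lets the whole workflow run in one execution.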

  10. Improving image quality in Electrical Impedance Tomography (EIT) using the Projection Error Propagation-based Regularization (PEPR) technique: A simulation study

    Directory of Open Access Journals (Sweden)

    Tushar Kanti Bera

    2011-03-01

    Full Text Available A Projection Error Propagation-based Regularization (PEPR) method is proposed to improve reconstructed image quality in Electrical Impedance Tomography (EIT). A projection error is produced due to the misfit of the calculated and measured data in the reconstruction process. The variation of the projection error is integrated with the response matrix in each iteration and the reconstruction is carried out in EIDORS. The PEPR method is studied with simulated boundary data for different inhomogeneity geometries. Simulated results demonstrate that the PEPR technique improves image reconstruction precision in EIDORS and hence can be successfully implemented to increase the reconstruction accuracy in EIT. doi:10.5617/jeb.158. J Electr Bioimp, vol. 2, pp. 2-12, 2011
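The core idea, adapting the regularization strength to the current data misfit from iteration to iteration, can be sketched with a toy one-parameter inverse problem; the quadratic forward model and the misfit-scaled damping below are illustrative, not the EIDORS/PEPR implementation:

```python
# toy 1-D "EIT": recover conductivity c from a measurement m = g(c),
# damping each Gauss-Newton step by the current projection (misfit) error
def g(c):
    """Stand-in nonlinear forward model."""
    return c * c

def reconstruct(m, c0, iters=50):
    c = c0
    for _ in range(iters):
        r = m - g(c)                 # projection error (data misfit)
        J = 2.0 * c                  # Jacobian of the forward model
        lam = abs(r)                 # misfit-scaled regularization weight
        c += J * r / (J * J + lam)   # damped (regularized) update
    return c

c_hat = reconstruct(m=4.0, c0=1.0)
print(abs(c_hat - 2.0) < 1e-6)  # True: converges to the true conductivity
```

Because the damping term shrinks as the misfit shrinks, early iterations are strongly regularized while late iterations approach an undamped Newton step, the qualitative behavior a misfit-driven regularization scheme aims for.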

  11. Joint implementation - project simulation and organisation. Operationalization of a new instrument of climate policy

    International Nuclear Information System (INIS)

    Luhmann, H.J.; Ott, H.E.; Beuermann, C.; Fischedick, M.; Hennicke, P.; Bakker, L.

    1997-01-01

    The study served to analyze the practical aspects of 'Joint Implementation' (JI) under the Framework Convention on Climate Change, in which measures to reduce greenhouse gas emissions are carried out jointly by several countries. As a first step, possible climate protection measures were assessed with respect to their suitability for JI and an overview of suitable JI projects was compiled. In a further step, carried out in cooperation with partners from industry, four specific projects (coal-fired power plant, solar-thermal power plant, cement factory, least-cost planning) were used to simulate JI. In this work, solutions were developed for the calculation of the greenhouse gas emissions avoided as well as for project reporting. In addition, proposals were made with respect to the organizational and institutional design of an international JI mechanism. (orig.) [de]

  12. Methodology for neural networks prototyping. Application to traffic control

    Energy Technology Data Exchange (ETDEWEB)

    Belegan, I.C.

    1998-07-01

    The work described in this report was carried out in the context of the European project ASTORIA (Advanced Simulation Toolbox for Real-World Industrial Application in Passenger Management and Adaptive Control), and concerns the development of an advanced toolbox for complex transportation systems. Our work focused on the methodology for prototyping a set of neural networks corresponding to specific strategies for traffic control and congestion management. The tool used for prototyping is SNNS (Stuttgart Neural Network Simulator), developed at the University of Stuttgart, Institute for Parallel and Distributed High Performance Systems, and the real data from the field were provided by ZELT. This report is structured into six parts. The introduction gives some insights into traffic control and its approaches. The second chapter discusses the various existing control strategies. The third chapter is an introduction to the field of neural networks. Data analysis and pre-processing are described in the fourth chapter. In the fifth chapter, the methodology for prototyping the neural networks is presented. Finally, conclusions and further work are presented. (author) 14 refs.

  13. SIROCCO project: 15 advanced instructor desk and 4 simulated control room for 900MW and 1300MW EDF power plant simulators

    International Nuclear Information System (INIS)

    Alphonse, J.; Roth, P.; Sicard, Y.; Rudelli, P.

    2006-01-01

    This presentation describes the fifteen advanced instructor stations and four simulated control rooms delivered to EDF in the frame of the SIROCCO project by the consortium formed by ATOS Origin and CORYS Tess for Electricite de France (EDF). These instructor stations are installed on fifteen replica training simulators located on different sites throughout France for the purpose of improving the job-related training of the EDF PWR nuclear power plant operating teams. This covers all 900 MW and 1300 MW nuclear power plants of EDF. The simulated control rooms are installed on maintenance platforms located at EDF and consortium facilities. The consortium uses them to maintain and upgrade the simulators. EDF uses them to validate upgrades delivered by the consortium before on-site installation and to perform engineering analysis. This presentation sets out successively: - The major advantages of the generic and configurable connected-module concept for flexible and quick adaptation to different simulators; - The innovative functionalities of the advanced Instructor Station (IS), which make the instructor's tasks of preparation, monitoring and post-analysis of a training session easier and more homogeneous; - The use of the Simulated Control Room (SCR) for training purposes but also for those of maintenance and design studies for upgrades of existing control rooms

  14. A general XML schema and SPM toolbox for storage of neuro-imaging results and anatomical labels.

    Science.gov (United States)

    Keator, David Bryant; Gadde, Syam; Grethe, Jeffrey S; Taylor, Derek V; Potkin, Steven G

    2006-01-01

    With the increased frequency of multisite, large-scale collaborative neuro-imaging studies, the need for a general, self-documenting framework for the storage and retrieval of activation maps and anatomical labels becomes evident. To address this need, we have developed an extensible markup language (XML) schema and associated tools for the storage of neuro-imaging activation maps and anatomical labels. This schema, as part of the XML-based Clinical Experiment Data Exchange (XCEDE) schema, provides storage capabilities for analysis annotations, activation threshold parameters, and cluster- and voxel-level statistics. Activation parameters contain information describing the threshold, degrees of freedom, FWHM smoothness, search volumes, voxel sizes, expected voxels per cluster, and expected number of clusters in the statistical map. Cluster and voxel statistics can be stored along with the coordinates, threshold, and anatomical label information. Multiple threshold types can be documented for a given cluster or voxel along with the uncorrected and corrected probability values. Multiple atlases can be used to generate anatomical labels and stored for each significant voxel or cluster. Additionally, a toolbox for the Statistical Parametric Mapping software (http://www.fil.ion.ucl.ac.uk/spm/) was created to capture the results from activation maps using the XML schema; it supports both the SPM99 and SPM2 versions (http://nbirn.net/Resources/Users/Applications/xcede/SPM_XMLTools.htm). Support for anatomical labeling is available via the Talairach Daemon (http://ric.uthscsa.edu/projects/talairachdaemon.html) and Automated Anatomical Labeling (http://www.cyceron.fr/freeware/).
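The kind of self-documenting result storage described here can be illustrated with Python's standard XML tooling; the element and attribute names below are loosely inspired by the description and are NOT the literal XCEDE schema:

```python
import xml.etree.ElementTree as ET

# hypothetical, schema-inspired fragment: threshold parameters plus one
# cluster with its peak voxel and an anatomical label
root = ET.Element("activationMap")
ET.SubElement(root, "thresholdParameters",
              {"threshold": "3.1", "fwhm": "8.0", "df": "20"})
cluster = ET.SubElement(root, "cluster", {"size": "57", "pCorrected": "0.01"})
peak = ET.SubElement(cluster, "voxel", {"x": "-42", "y": "18", "z": "6"})
peak.set("label", "Left Inferior Frontal Gyrus")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Because every statistic is carried as named markup rather than positional binary data, a record like this remains interpretable by other sites and tools without side documentation, which is the point of a self-documenting exchange schema.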

  15. On the influence of simulated SST warming on rainfall projections in the Indo-Pacific domain: an AGCM study

    Science.gov (United States)

    Zhang, Huqiang; Zhao, Y.; Moise, A.; Ye, H.; Colman, R.; Roff, G.; Zhao, M.

    2018-02-01

    Significant uncertainty exists in regional climate change projections, particularly for rainfall and other hydro-climate variables. In this study, we conduct a series of Atmospheric General Circulation Model (AGCM) experiments with different future sea surface temperature (SST) warming simulated by a range of coupled climate models. They allow us to assess the extent to which uncertainty in current coupled climate model rainfall projections can be attributed to their simulated SST warming. Nine CMIP5 model-simulated global SST warming anomalies have been superimposed onto the current SSTs simulated by the Australian climate model ACCESS1.3. The ACCESS1.3 SST-forced experiments closely reproduce the rainfall means and interannual variations of its own fully coupled experiments. Although different global SST warming intensities explain well the inter-model difference in global mean precipitation changes, at regional scales the SST influence varies significantly. SST warming explains about 20-25% of the patterns of precipitation changes over the oceans in the Indo-Pacific domain in four or five of the models, but there are also a couple of models in which different SST warming explains little of their precipitation pattern changes. The influence is weaker again for rainfall changes over land. Roughly similar levels of contribution can be attributed to different atmospheric responses to SST warming in these models. The weak SST influence in our study could be due to the experimental setup applied: superimposing different SST warming anomalies onto the same SSTs simulated for the current climate by ACCESS1.3 rather than directly using model-simulated past and future SSTs. 
Similar modelling and analysis from other modelling groups, with more carefully designed experiments, are needed to tease out uncertainties caused by different SST warming patterns, different SST mean biases and different model physical/dynamical responses to the same underlying

  16. Real-time "x-ray vision" for healthcare simulation: an interactive projective overlay system to enhance intubation training and other procedural training.

    Science.gov (United States)

    Samosky, Joseph T; Baillargeon, Emma; Bregman, Russell; Brown, Andrew; Chaya, Amy; Enders, Leah; Nelson, Douglas A; Robinson, Evan; Sukits, Alison L; Weaver, Robert A

    2011-01-01

    We have developed a prototype of a real-time, interactive projective overlay (IPO) system that creates an augmented-reality display of a medical procedure directly on the surface of a full-body mannequin human simulator. These images approximate the appearance of both anatomic structures and instrument activity occurring within the body. The key innovation of the current work is sensing the position and motion of an actual device (such as an endotracheal tube) inserted into the mannequin and using the sensed position to control projected video images portraying the internal appearance of the same devices and relevant anatomic structures. The images are projected in correct registration onto the surface of the simulated body. As an initial practical prototype to test this technique, we have developed a system permitting real-time visualization of the intra-airway position of an endotracheal tube during simulated intubation training.

  17. Expanding the toolbox for studying the biological responses of individual fish to hydropower infrastructure and operating strategies

    International Nuclear Information System (INIS)

    Hasler, C.T.; Cooke, S.J.; Patterson, D.A.

    2009-01-01

    Hydropower infrastructure and the operational strategies used by power utilities have the potential to change local aquatic environments. However, few studies have evaluated sub-organismal responses such as physiological consequences of individual fish to fluctuating flows or hydropower infrastructure such as fishways or turbines. Rather than review the impacts of hydropower on fish, this paper detailed the behavioural, energetic, genomic, molecular, forensic, isotopic, and physiological tools available for studying sub-organismal responses of fish to hydropower infrastructure and operating procedures with a critical assessment of their benefits and limitations. A brief summary of the current state of knowledge regarding the 12 types of tools was provided along with their usefulness in fisheries science and environmental management. The benefits and limitations of using these techniques for evaluating hydropower impacts on fish and fish habitat were discussed. Two case studies were presented to demonstrate how the inclusion of individual-based information into hydropower research has helped to improve the understanding of complex fish and hydropower issues. Practitioners can use the expanded toolbox to assess fishway performance, migration delays, and fish responses to fluctuating flows through a mechanistic approach. These tools are also relevant for evaluating other anthropogenic impacts such as water withdrawal for irrigation or drinking water, habitat alteration, and fisheries interactions. The expanded toolbox can contribute to a more sustainable hydropower industry by providing regulators with tools for making informed decisions and evaluating compliance issues. 150 refs., 1 tab., 4 figs

  18. Neutron tomography using projection data obtained by Monte Carlo simulation for nondestructive evaluation

    International Nuclear Information System (INIS)

    Silva, A.X. da; Crispim, V.R.

    2002-01-01

    This work presents the application of a computer package for generating projection data for neutron computerized tomography and, in the second part, discusses an application of neutron tomography, using projection data obtained by the Monte Carlo technique, for the detection and localization of light materials such as those containing hydrogen, concealed by heavy materials such as iron and lead. For the tomographic reconstructions of the simulated samples, use was made of only six equally spaced projection angles distributed between 0 deg and 180 deg, with reconstruction making use of an algorithm (ARIEM) based on the principle of maximum entropy. With neutron tomography it was possible to detect and locate polyethylene and water hidden by lead and iron (1 cm thick). Thus, it is demonstrated that thermal neutron tomography is a viable test method which can provide important interior information about test components, and so is extremely useful in routine industrial applications. (author)

  19. Projection of future climate change conditions using IPCC simulations, neural networks and Bayesian statistics. Part 2: Precipitation mean state and seasonal cycle in South America

    Energy Technology Data Exchange (ETDEWEB)

    Boulanger, Jean-Philippe [LODYC, UMR CNRS/IRD/UPMC, Tour 45-55/Etage 4/Case 100, UPMC, Paris Cedex 05 (France); University of Buenos Aires, Departamento de Ciencias de la Atmosfera y los Oceanos, Facultad de Ciencias Exactas y Naturales, Buenos Aires (Argentina); Martinez, Fernando; Segura, Enrique C. [University of Buenos Aires, Departamento de Computacion, Facultad de Ciencias Exactas y Naturales, Buenos Aires (Argentina)

    2007-02-15

    Evaluating the response of climate to greenhouse gas forcing is a major objective of the climate community, and the use of large ensembles of simulations is considered a significant step toward that goal. The present paper thus discusses a new methodology, based on neural networks, for mixing ensembles of climate model simulations. Our analysis uses one simulation from each of seven Atmosphere-Ocean Global Climate Models which participated in the IPCC Project and provided at least one simulation for the twentieth century (20c3m) and one simulation for each of three SRES scenarios: A2, A1B and B1. Our statistical method, based on neural networks and Bayesian statistics, computes a transfer function between models and observations. This transfer function is then used to project future conditions and to derive what we call the optimal ensemble combination for twenty-first century climate change projections. Our approach is therefore based on one statement and one hypothesis. The statement is that an optimal ensemble projection should be built by giving larger weights to models which have more skill in representing present climate conditions. The hypothesis is that our neural-network-based method actually weights the models in that way. While the statement is an open question, whose answer may vary according to the region or climate signal under study, our results demonstrate that the neural network approach does allow weighting models according to their skill. As such, our method is an improvement on existing Bayesian methods developed to mix ensembles of simulations. However, the generally low skill of climate models in simulating precipitation mean climatology implies that the final projection maps (whatever the method used to compute them) may change significantly in the future as models improve. Therefore, the projection results for late twenty-first century conditions are presented as possible projections based on the …
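    The core idea stated in the abstract, giving larger weights to models that better reproduce present-day climate, can be sketched with a simple Gaussian skill weighting over synthetic data. The paper's actual method is a neural-network transfer function combined with Bayesian statistics; the RMSE-based weighting below is only a simplified stand-in for that idea, and all data here are toy values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    obs = rng.normal(size=50)                       # "observed" field (toy data)
    models = [obs + rng.normal(scale=s, size=50)    # 7 models of varying skill
              for s in (0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4)]

    # Skill metric: RMSE against observations over the present-day period
    rmse = np.array([np.sqrt(np.mean((m - obs) ** 2)) for m in models])

    # Gaussian-likelihood weights: smaller error -> exponentially larger weight
    w = np.exp(-0.5 * (rmse / rmse.min()) ** 2)
    w /= w.sum()                                    # normalize to sum to 1

    # Weighted "projection": the optimal ensemble combination of the 7 members
    projection = sum(wi * m for wi, m in zip(w, models))
    print(np.round(w, 3))
    ```

    Because the weight is a decreasing function of RMSE, the most skilful model always receives the largest weight, which is the behaviour the paper's hypothesis attributes to its neural network.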

  20. Universality and Realistic Extensions to the Semi-Analytic Simulation Principle in GNSS Signal Processing

    Directory of Open Access Journals (Sweden)

    O. Jakubov

    2012-06-01

    Full Text Available The semi-analytic simulation principle in GNSS signal processing bypasses bit-true operations at high sampling frequency. Instead, the signals at the output branches of the integrate&dump blocks are modeled statistically, making extensive Monte Carlo simulations feasible. Methods for simulating code and carrier tracking loops with BPSK and BOC signals have been introduced in the literature, and Matlab toolboxes have been designed and published. In this paper, we further extend the applicability of the approach. Firstly, we describe any GNSS signal as a special instance of linear multi-dimensional modulation, thereby stating a universal framework for the classification of differently modulated signals. Using this description, we derive the semi-analytic models in general form. Secondly, we extend the model to realistic scenarios including delay in the feedback, slowly fading multipath effects, finite bandwidth, phase noise, and combinations of these. Finally, a discussion of the connection between this semi-analytic model and the position-velocity-time estimator is delivered, as well as a comparison of theoretical and simulated characteristics produced by a prototype simulator developed at CTU in Prague.
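    The semi-analytic principle the abstract describes, drawing the integrate&dump (correlator) outputs directly from their statistical model instead of simulating every signal sample, can be sketched as follows for a BPSK code tracking loop. The function names, the `d`/`T` parameters, and the unit-variance noise normalization are illustrative assumptions, not the API of the toolbox from the paper.

    ```python
    import numpy as np

    def R(tau_chips):
        """Ideal triangular autocorrelation of a BPSK spreading code."""
        return np.maximum(0.0, 1.0 - np.abs(tau_chips))

    def correlators(tau, phi, cn0_dbhz, T=1e-3, d=0.5, noise_std=1.0, rng=None):
        """Early/Prompt/Late I,Q integrate&dump outputs, modeled directly:
        tau  -- code tracking error in chips
        phi  -- carrier phase error in radians
        d    -- early-late correlator spacing in chips
        The bit-true correlation at MHz rates is replaced by its mean
        (amplitude * R * cos/sin) plus Gaussian integration noise."""
        rng = rng or np.random.default_rng()
        a = np.sqrt(2 * 10 ** (cn0_dbhz / 10) * T)   # coherent amplitude from C/N0
        out = {}
        for name, off in (("E", -d / 2), ("P", 0.0), ("L", d / 2)):
            i = a * R(tau + off) * np.cos(phi) + noise_std * rng.normal()
            q = a * R(tau + off) * np.sin(phi) + noise_std * rng.normal()
            out[name] = (i, q)
        return out

    # Noise-free sanity check: with this sign convention the early-minus-late
    # power shares the sign of the code error, so a DLL can steer tau to zero.
    c = correlators(0.1, 0.0, 45.0, noise_std=0.0)
    eml = (c["E"][0]**2 + c["E"][1]**2) - (c["L"][0]**2 + c["L"][1]**2)
    print(eml > 0)
    ```

    A Monte Carlo run then just calls `correlators` once per integration interval and feeds the outputs to the discriminator and loop filter, which is what makes large ensembles of tracking-loop simulations cheap.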