WorldWideScience

Sample records for accelerator modeling toolkit

  1. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.
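
    As an illustration of what such a phase-space tracker does, here is a minimal Python sketch; the records in this listing contain no code, so the one-turn map, tune, and kick strength below are assumptions standing in for a map built from the beamline/MXYZPTLK libraries.

        import numpy as np

        # Hypothetical simplified one-turn map: a thin octupole-like kick
        # followed by a linear phase-space rotation. The real toolkit builds
        # this map from a 'beamline' description with MXYZPTLK.
        MU = 2 * np.pi * 0.2057      # tune near a 20th-order resonance (assumed)
        K_OCT = 0.8                  # octupole kick strength (assumed)

        def one_turn(x, xp):
            xp = xp + K_OCT * x**3                     # nonlinear kick
            c, s = np.cos(MU), np.sin(MU)
            return c * x + s * xp, -s * x + c * xp     # linear rotation

        def track(x0, xp0, turns=20):
            x, xp = x0, xp0
            for _ in range(turns):
                x, xp = one_turn(x, xp)
            return x, xp

        # Crude periodic-orbit search: scan initial conditions on the x-axis
        # and report the one that most nearly closes after 20 turns.
        candidates = []
        for x0 in np.linspace(0.05, 1.0, 400):
            x, xp = track(x0, 0.0, turns=20)
            candidates.append((abs(x - x0) + abs(xp), x0))
        err, x0 = min(candidates)
        print(f"best 20-turn closure: x0 = {x0:.4f}, return error = {err:.2e}")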

  2. Report of the Los Alamos accelerator automation application toolkit workshop

    International Nuclear Information System (INIS)

    Clout, P.; Daneels, A.

    1990-01-01

    A 5-day workshop was held in November 1988 at Los Alamos National Laboratory to address the viability of providing a toolkit optimized for building accelerator control systems. The workshop arose from work started independently at Los Alamos and CERN. This paper presents the discussion and the results of the meeting. (orig.)

  3. A universal postprocessing toolkit for accelerator simulation and data analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1998-01-01

    The Self-Describing Data Sets (SDDS) toolkit comprises about 70 generally applicable programs sharing a common data protocol. At the Advanced Photon Source (APS), SDDS performs the vast majority of operational data collection and processing, most data display functions, and many control functions. In addition, a number of accelerator simulation codes use SDDS for all post-processing and data display. This has three principal advantages: First, simulation codes need not provide customized post-processing tools, thus simplifying development and maintenance. Second, users can enhance code capabilities without changing the code itself, by adding SDDS-based pre- and post-processing. Third, multiple codes can be used together more easily, by employing SDDS for data transfer and adaptation. Given its broad applicability, the SDDS file protocol is surprisingly simple, making it quite easy for simulations to generate SDDS-compliant data. This paper discusses the philosophy behind SDDS, contrasting it with some recent trends, and outlines the capabilities of the toolkit. The paper also gives examples of using SDDS for accelerator simulation.
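
    To make the "self-describing" idea concrete, here is a hedged Python sketch of a file that carries its own column metadata, so any compliant tool can process it without prior knowledge. The simplified header syntax is invented for illustration and is not the actual SDDS protocol.

        # Minimal stand-in for a self-describing data file: the header names
        # each column with its units and type; the reader needs no outside
        # schema. This is NOT the real SDDS syntax, just the concept.
        def write_selfdescribing(path, columns, rows):
            with open(path, "w") as f:
                for name, unit, dtype in columns:
                    f.write(f"#column name={name} units={unit} type={dtype}\n")
                f.write("#data\n")
                for row in rows:
                    f.write(" ".join(str(v) for v in row) + "\n")

        def read_selfdescribing(path):
            columns, rows = [], []
            with open(path) as f:
                for line in f:
                    if line.startswith("#column"):
                        fields = dict(kv.split("=") for kv in line.split()[1:])
                        columns.append(fields)
                    elif not line.startswith("#"):
                        rows.append(line.split())
            return columns, rows

        write_selfdescribing("bpm.dat",
                             [("s", "m", "double"), ("x", "mm", "double")],
                             [(0.0, 0.12), (1.5, -0.07)])
        print(read_selfdescribing("bpm.dat"))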

  4. Applications toolkit for accelerator control and analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1997-01-01

    The Advanced Photon Source (APS) has taken a unique approach to creating high-level software applications for accelerator operation and analysis. The approach is based on self-describing data, modular program toolkits, and scripts. Self-describing data provide a communication standard that aids the creation of modular program toolkits by allowing compliant programs to be used in essentially arbitrary combinations. These modular programs can be used as part of an arbitrary number of high-level applications. At APS, a group of about 70 data analysis, manipulation, and display tools is used in concert with about 20 control-system-specific tools to implement applications for commissioning and operations. High-level applications are created using scripts, which are relatively simple interpreted programs. The Tcl/Tk script language is used, allowing creation of graphical user interfaces (GUIs) and a library of algorithms that are separate from the interface. This last factor allows greater automation of control by making it easy to take the human out of the loop. Applications of this methodology to operational tasks such as orbit correction, configuration management, and data review will be discussed.
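
    The composition style described above can be sketched as a script that chains small single-purpose programs. The tool names and flags below are hypothetical stand-ins; the real APS applications are Tcl/Tk scripts driving SDDS and control-system tools.

        import subprocess

        # Chain three hypothetical single-purpose command-line tools into one
        # application, the way the APS scripts compose toolkit programs.
        def run(cmd):
            print("+", " ".join(cmd))
            subprocess.run(cmd, check=True)

        run(["measure_orbit", "--output", "orbit.dat"])            # acquire data
        run(["fit_orbit", "orbit.dat", "--output", "fit.dat"])     # process it
        run(["plot_columns", "fit.dat", "--x", "s", "--y", "x"])   # display it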

  5. Energy retrofit analysis toolkits for commercial buildings: A review

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Hong, Tianzhen; Piette, Mary Ann; Taylor-Lange, Sarah C.

    2015-01-01

    Retrofit analysis toolkits can be used to optimize energy or cost savings from retrofit strategies, accelerating the adoption of ECMs (energy conservation measures) in buildings. This paper provides an up-to-date review of the features and capabilities of 18 energy retrofit toolkits, including ECMs and the calculation engines. The fidelity of the calculation techniques, a driving component of retrofit toolkits, was evaluated. An evaluation of the issues that hinder effective retrofit analysis in terms of accessibility, usability, data requirements, and the application of efficiency measures provides valuable insights for advancing the field. From this review, several general findings emerged: (1) toolkits developed primarily in the private sector use empirical, data-driven methods or benchmarking to provide ease of use; (2) almost all of the toolkits that used EnergyPlus or DOE-2 were freely accessible but suffered from complexity, longer data input, and longer simulation run times; (3) in general, there appeared to be a fine line between having too much detail, resulting in a long analysis time, and too little detail, which sacrificed modeling fidelity. These insights provide an opportunity to enhance the design and development of existing and new retrofit toolkits. - Highlights: • Retrofit analysis toolkits can accelerate the adoption of energy efficiency measures. • A comprehensive review of 19 retrofit analysis toolkits was conducted. • Retrofit toolkits have diverse features, data requirements and computing methods. • Empirical data-driven, normative and detailed energy modeling methods are used. • Identified immediate areas for improvement for retrofit analysis toolkits.

  6. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  7. A portable accelerator control toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Watson, W.A. III

    1997-06-01

    In recent years, the expense of creating good control software has led to a number of collaborative efforts among laboratories to share this cost. The EPICS collaboration is a particularly successful example of this trend. More recently, another collaborative effort has addressed the need for sophisticated high-level software, including model-driven accelerator controls. This work builds upon the CDEV (Common DEVice) software framework, which provides a generic abstraction of a control system and maps that abstraction onto a number of site-specific control systems, including EPICS, the SLAC control system, CERN/PS, and others. In principle, it is now possible to create portable accelerator control applications which have no knowledge of the underlying, site-specific control system. Applications based on CDEV now provide a growing suite of tools for accelerator operations, including general-purpose displays, an on-line accelerator model, beamline steering, machine status displays incorporating both hardware and model information (such as beam positions overlaid with beta functions), and more. A survey of CDEV-compatible portable applications will be presented, as well as plans for future development.
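
    A conceptual Python sketch of the device-abstraction idea follows. CDEV itself is a C++ framework, so every class, method, and channel name here is illustrative, and read_pv/write_pv are stubs standing in for a channel-access binding.

        # Conceptual sketch of a generic device abstraction: applications talk
        # to a Device in terms of named attributes, and a site-specific
        # service maps those calls onto the local control system.
        def read_pv(pv): return 0.0            # stub for a channel-access read
        def write_pv(pv, value): pass          # stub for a channel-access write

        class ControlService:                  # generic control-system interface
            def get(self, device, attribute): raise NotImplementedError
            def set(self, device, attribute, value): raise NotImplementedError

        class EpicsService(ControlService):    # one site-specific mapping
            def get(self, device, attribute):
                return read_pv(f"{device}:{attribute}")
            def set(self, device, attribute, value):
                write_pv(f"{device}:{attribute}", value)

        class Device:                          # what portable applications see
            def __init__(self, name, service):
                self.name, self.service = name, service
            def get(self, attribute):
                return self.service.get(self.name, attribute)

        bpm = Device("BPM01", EpicsService())  # swap the service, keep the app
        print(bpm.get("xPosition"))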

  8. A portable accelerator control toolkit

    International Nuclear Information System (INIS)

    Watson, W.A. III.

    1997-01-01

    In recent years, the expense of creating good control software has led to a number of collaborative efforts among laboratories to share this cost. The EPICS collaboration is a particularly successful example of this trend. More recently, another collaborative effort has addressed the need for sophisticated high-level software, including model-driven accelerator controls. This work builds upon the CDEV (Common DEVice) software framework, which provides a generic abstraction of a control system and maps that abstraction onto a number of site-specific control systems, including EPICS, the SLAC control system, CERN/PS, and others. In principle, it is now possible to create portable accelerator control applications which have no knowledge of the underlying, site-specific control system. Applications based on CDEV now provide a growing suite of tools for accelerator operations, including general-purpose displays, an on-line accelerator model, beamline steering, machine status displays incorporating both hardware and model information (such as beam positions overlaid with beta functions), and more. A survey of CDEV-compatible portable applications will be presented, as well as plans for future development.

  9. An Overview of the GEANT4 Toolkit

    International Nuclear Information System (INIS)

    Apostolakis, John; CERN; Wright, Dennis H.

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  10. Geant4 - A Simulation Toolkit

    International Nuclear Information System (INIS)

    2002-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  11. GEANT4 A Simulation toolkit

    CERN Document Server

    Agostinelli, S; Amako, K; Apostolakis, John; Araújo, H M; Arce, P; Asai, M; Axen, D A; Banerjee, S; Barrand, G; Behner, F; Bellagamba, L; Boudreau, J; Broglia, L; Brunengo, A; Chauvie, S; Chuma, J; Chytracek, R; Cooperman, G; Cosmo, G; Degtyarenko, P V; Dell'Acqua, A; De Paola, G O; Dietrich, D D; Enami, R; Feliciello, A; Ferguson, C; Fesefeldt, H S; Folger, G; Foppiano, F; Forti, A C; Garelli, S; Giani, S; Giannitrapani, R; Gibin, D; Gómez-Cadenas, J J; González, I; Gracía-Abríl, G; Greeniaus, L G; Greiner, W; Grichine, V M; Grossheim, A; Gumplinger, P; Hamatsu, R; Hashimoto, K; Hasui, H; Heikkinen, A M; Howard, A; Hutton, A M; Ivanchenko, V N; Johnson, A; Jones, F W; Kallenbach, Jeff; Kanaya, N; Kawabata, M; Kawabata, Y; Kawaguti, M; Kelner, S; Kent, P; Kodama, T; Kokoulin, R P; Kossov, M; Kurashige, H; Lamanna, E; Lampen, T; Lara, V; Lefébure, V; Lei, F; Liendl, M; Lockman, W; Longo, F; Magni, S; Maire, M; Mecking, B A; Medernach, E; Minamimoto, K; Mora de Freitas, P; Morita, Y; Murakami, K; Nagamatu, M; Nartallo, R; Nieminen, P; Nishimura, T; Ohtsubo, K; Okamura, M; O'Neale, S W; O'Ohata, Y; Perl, J; Pfeiffer, A; Pia, M G; Ranjard, F; Rybin, A; Sadilov, S; Di Salvo, E; Santin, G; Sasaki, T; Savvas, N; Sawada, Y; Scherer, S; Sei, S; Sirotenko, V I; Smith, D; Starkov, N; Stöcker, H; Sulkimo, J; Takahata, M; Tanaka, S; Chernyaev, E; Safai-Tehrani, F; Tropeano, M; Truscott, P R; Uno, H; Urbàn, L; Urban, P; Verderi, M; Walkden, A; Wander, W; Weber, H; Wellisch, J P; Wenaus, T; Williams, D C; Wright, D; Yamada, T; Yoshida, H; Zschiesche, D

    2003-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  12. Geant4 - A Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Dennis H

    2002-08-09

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  13. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    Science.gov (United States)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for medical linear accelerators (linacs) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make QA procedures more complex and time consuming, often requiring adequate software accompanied by specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of QA for linacs, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, multileaf collimator (MLC) log file analysis, and verification of the light and radiation field coincidence test.
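
    The quantitative core of one of these tests, the Winston-Lutz analysis, reduces to comparing two centroids per image: the ball-bearing shadow and the radiation field centre. The sketch below is Python rather than QALMA's MATLAB, and the thresholds, pixel size, and synthetic image are assumptions.

        import numpy as np

        # Core arithmetic of a Winston-Lutz test: locate the ball-bearing (BB)
        # shadow and the radiation field centre, then report their separation.
        def field_center(img):
            ys, xs = np.nonzero(img > 0.2 * img.max())   # field pixels (BB included)
            return xs.mean(), ys.mean()

        def bb_center(img):
            # the BB appears as a darker spot inside the bright field
            mask = (img > 0.05 * img.max()) & (img < 0.5 * img.max())
            ys, xs = np.nonzero(mask)
            return xs.mean(), ys.mean()

        def wl_offset_mm(img, pixel_mm=0.336):           # pixel size assumed
            fx, fy = field_center(img)
            bx, by = bb_center(img)
            return np.hypot(fx - bx, fy - by) * pixel_mm

        # synthetic 100x100 image: bright 40-pixel field, small dark BB shadow
        img = np.zeros((100, 100)); img[30:70, 30:70] = 1.0
        img[48:53, 50:55] = 0.3                          # BB slightly off-centre
        print(f"BB-to-field-centre offset: {wl_offset_mm(img):.2f} mm")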

  14. The self-describing data sets file protocol and Toolkit

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.

    1995-01-01

    The Self-Describing Data Sets (SDDS) file protocol continues to be used extensively in commissioning the Advanced Photon Source (APS) accelerator complex. SDDS protocol has proved useful primarily due to the existence of the SDDS Toolkit, a growing set of about 60 generic command-line programs that read and/or write SDDS files. The SDDS Toolkit is also used extensively for simulation postprocessing, giving physicists a single environment for experiment and simulation. With the Toolkit, new SDDS data is displayed and subjected to complex processing without developing new programs. Data from EPICS, lab instruments, simulation, and other sources are easily integrated. Because the SDDS tools are command-line-based, data processing scripts are readily written using the user's preferred shell language. Since users work within a UNIX shell rather than an application-specific shell or GUI, they may add SDDS-compliant programs and scripts to their personal toolkits without restriction or complication. The SDDS Toolkit has been run under UNIX on SunOS 4, HP-UX, and Linux. Application of SDDS to accelerator operation is being pursued using Tcl/Tk to provide a GUI.

  15. Roofline model toolkit: A practical tool for architectural and program analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Yu Jung [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ligocki, Terry J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cordery, Matthew J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wright, Nicholas J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hall, Mary W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-18

    We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented microbenchmarks implemented with the Message Passing Interface (MPI) and OpenMP, the latter used to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
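
    The Roofline bound itself is one line of arithmetic: attainable performance is the minimum of peak compute and arithmetic intensity times peak bandwidth. A sketch with assumed machine numbers, not values measured in the paper:

        # Roofline bound: a kernel is capped either by peak compute or by
        # memory traffic (arithmetic intensity x sustained bandwidth).
        PEAK_GFLOPS = 204.8     # assumed peak compute, GFLOP/s
        PEAK_BW = 29.9          # assumed sustained memory bandwidth, GB/s

        def roofline(ai):
            """Attainable GFLOP/s for arithmetic intensity `ai` (FLOP/byte)."""
            return min(PEAK_GFLOPS, ai * PEAK_BW)

        for ai in (0.25, 1.0, 4.0, 16.0):
            print(f"AI = {ai:5.2f} flop/byte -> bound = {roofline(ai):7.1f} GFLOP/s")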

  16. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, Roelf J.

    1997-01-01

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  17. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g., a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
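
    The workflow being automated can be sketched generically: rerun the same simulation under several values of a model parameter and compare the spread in an observable. Here toy_simulation is a stand-in for a Geant4 job; the parameter and observable are invented for illustration.

        import random, statistics

        # Generic parameter-variation study: same simulation, several values
        # of one model parameter, report the shift in an observable. Using
        # the same seed for every variant isolates the parameter's effect.
        def toy_simulation(scale, n_events=20000, seed=0):
            rng = random.Random(seed)
            # observable: mean "multiplicity" from a model whose parameter we vary
            return statistics.fmean(rng.gauss(5.0 * scale, 1.0)
                                    for _ in range(n_events))

        variants = {scale: toy_simulation(scale) for scale in (0.9, 1.0, 1.1)}
        nominal = variants[1.0]
        for scale, obs in variants.items():
            print(f"scale={scale:4.2f}  observable={obs:6.3f}  "
                  f"shift={obs - nominal:+6.3f}")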

  18. Solar Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    NREL is working on a Solar Integration National Dataset (SIND) Toolkit to enable researchers to perform U.S. regional solar generation integration studies. It will provide modeled, coherent subhourly solar power data.

  19. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunity for hands-on demonstrations. Brief descriptions of each tool follow: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  20. Livermore Big Artificial Neural Network Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-01

    LBANN is a toolkit that is designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key High Performance Computing features to accelerate neural network training. Specifically it is optimized for low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library that is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  1. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  2. SIGKit: Software for Introductory Geophysics Toolkit

    Science.gov (United States)

    Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.

    2017-12-01

    The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
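
    The two-layer refraction exercise rests on standard textbook travel-time formulas; a sketch follows with assumed layer velocities and depth (Python here, though SIGkit itself is MATLAB-based).

        import math

        # Two-layer refraction: direct wave t = x/v1; head wave t = x/v2 + ti,
        # with intercept ti = 2*h*cos(theta_c)/v1 and theta_c = asin(v1/v2).
        # Velocities and depth are assumed example values.
        v1, v2, h = 400.0, 1500.0, 5.0          # m/s, m/s, m
        theta_c = math.asin(v1 / v2)
        ti = 2 * h * math.cos(theta_c) / v1     # intercept time of refracted branch

        for x in (5.0, 10.0, 20.0, 40.0):       # source-receiver offsets, metres
            t_direct = x / v1
            t_refract = x / v2 + ti
            print(f"x={x:5.1f} m  direct={t_direct*1e3:7.2f} ms  "
                  f"refracted={t_refract*1e3:7.2f} ms")

        # offset beyond which the head wave arrives first
        x_cross = 2 * h * math.sqrt((v2 + v1) / (v2 - v1))
        print(f"crossover distance = {x_cross:.1f} m")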

  3. Making Schools the Model for Healthier Environments Toolkit: What It Is

    Science.gov (United States)

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  4. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    Science.gov (United States)

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  5. An Accelerated Testing Approach for Automated Vehicles with Background Traffic Described by Joint Distributions

    OpenAIRE

    Huang, Zhiyuan; Lam, Henry; Zhao, Ding

    2017-01-01

    This paper proposes a new framework based on joint statistical models for evaluating risks of automated vehicles in a naturalistic driving environment. The previous studies on the Accelerated Evaluation for automated vehicles are extended from multi-independent-variate models to joint statistics. The proposed toolkit includes exploration of the rare event (e.g. crash) sets and construction of accelerated distributions for Gaussian Mixture models using Importance Sampling techniques. Furthermo...
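
    The core of such accelerated evaluation is importance sampling: draw from a shifted proposal and reweight by the likelihood ratio. A one-dimensional Gaussian sketch follows; the paper uses joint and mixture models, and the threshold and shift below are illustrative.

        import math, random

        # Estimate the rare-event probability P(X > c) for X ~ N(0,1) by
        # sampling from a shifted proposal N(shift,1) and reweighting each
        # hit by the likelihood ratio.
        def phi(x, mu):                        # unit-variance normal density
            return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

        def rare_event_prob(c=4.0, n=100_000, shift=4.0, seed=1):
            rng = random.Random(seed)
            total = 0.0
            for _ in range(n):
                x = rng.gauss(shift, 1.0)                 # shifted proposal
                if x > c:                                 # rare under N(0,1)
                    total += phi(x, 0.0) / phi(x, shift)  # likelihood-ratio weight
            return total / n

        est = rare_event_prob()
        exact = 0.5 * math.erfc(4.0 / math.sqrt(2))       # true P(X > 4)
        print(f"IS estimate = {est:.3e}, exact = {exact:.3e}")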

  6. Tribal Green Building Toolkit

    Science.gov (United States)

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  7. An Overview of the Geant4 Toolkit

    CERN Document Server

    Apostolakis, John

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. App...

  8. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  9. Agent-based models in economics a toolkit

    CERN Document Server

    Fagiolo, Giorgio; Gallegati, Mauro; Richiardi, Matteo; Russo, Alberto

    2018-01-01

    In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, lab...

  10. A Toolkit to Study Sensitivity of the Geant4 Predictions to the Variations of the Physics Model Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fields, Laura [Fermilab; Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Kelsey, Michael [SLAC; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Wright, Dennis H. [SLAC; Yarba, Julia [Fermilab

    2017-08-21

    Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to the variations of the model parameters, or what uncertainties are associated with a particular tune of a Geant4 physics model, or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g., a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.

  11. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    International Nuclear Information System (INIS)

    Zhou, Y.; Zhu, X.; Wang, Y.; Mitra, S.

    2012-01-01

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnance (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed, using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using the modeling, this article demonstrates the novelty of the tagged-neutron approach for extracting useful signals with high signal-to-background discrimination of an object-of-interest from that of its environment. Simulations indicated that a UXO filled with the RDX explosive, hexogen (C3H6O6N6), can be identified to a depth of 20 cm when buried in soil. (author)

  12. Using the Model Coupling Toolkit to couple earth system models

    Science.gov (United States)

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

    Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. Therefore there is a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system would entail model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library, ROMS, SWAN and COAMPS models, methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. Methods presented in this paper are clearly applicable for coupling of other types of models. © 2008 Elsevier Ltd. All rights reserved.
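
    The coupling pattern described, two models stepping independently and exchanging fields at fixed synchronization intervals, can be reduced to a serial Python sketch. MCT's real work (distributed-memory transfer and regridding between the M- and N-processor decompositions) is omitted, and the toy physics is invented.

        # Two toy models exchange fields only at synchronization points,
        # advancing independently in between - the pattern MCT orchestrates
        # across distributed-memory decompositions.
        class OceanModel:
            def __init__(self): self.sst = 15.0
            def step(self, dt, wind_stress):
                self.sst += dt * (0.01 - 0.001 * wind_stress)   # toy physics

        class AtmosModel:
            def __init__(self): self.wind_stress = 0.1
            def step(self, dt, sst):
                self.wind_stress += dt * 0.002 * (sst - 15.0)   # toy physics

        ocean, atmos = OceanModel(), AtmosModel()
        dt, sync_every = 60.0, 10
        tau, sst = atmos.wind_stress, ocean.sst
        for n in range(100):
            if n % sync_every == 0:
                tau, sst = atmos.wind_stress, ocean.sst   # field exchange
                print(f"t={n*dt:6.0f} s  sst={sst:7.3f}  tau={tau:7.4f}")
            ocean.step(dt, tau)
            atmos.step(dt, sst)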

  13. Provider perceptions of an integrated primary care quality improvement strategy: The PPAQ toolkit.

    Science.gov (United States)

    Beehler, Gregory P; Lilienthal, Kaitlin R

    2017-02-01

    The Primary Care Behavioral Health (PCBH) model of integrated primary care is challenging to implement with high fidelity. The Primary Care Behavioral Health Provider Adherence Questionnaire (PPAQ) was designed to assess provider adherence to essential model components and has recently been adapted into a quality improvement toolkit. The aim of this pilot project was to gather preliminary feedback on providers' perceptions of the acceptability and utility of the PPAQ toolkit for making beneficial practice changes. Twelve mental health providers working in Department of Veterans Affairs integrated primary care clinics participated in semistructured interviews to gather quantitative and qualitative data. Descriptive statistics and qualitative content analysis were used to analyze data. Providers identified several positive features of the PPAQ toolkit organization and structure that resulted in high ratings of acceptability, while also identifying several toolkit components in need of modification to improve usability. Toolkit content was considered highly representative of the PCBH model and therefore could be used as a diagnostic self-assessment of model adherence. The toolkit was considered to be high in applicability to providers regardless of their degree of prior professional preparation or current clinical setting. Additionally, providers identified several system-level contextual factors that could impact the usefulness of the toolkit. These findings suggest that frontline mental health providers working in PCBH settings may be receptive to using an adherence-focused toolkit for ongoing quality improvement.

  14. Perl Template Toolkit

    CERN Document Server

    Chamberlain, Darren; Cross, David; Torkington, Nathan; Diaz, tatiana Apandi

    2004-01-01

    Among the many different approaches to "templating" with Perl--such as Embperl, Mason, HTML::Template, and hundreds of other lesser known systems--the Template Toolkit is widely recognized as one of the most versatile. Like other templating systems, the Template Toolkit allows programmers to embed Perl code and custom macros into HTML documents in order to create customized documents on the fly. But unlike the others, the Template Toolkit is as facile at producing HTML as it is at producing XML, PDF, or any other output format. And because it has its own simple templating language, templates

  15. JAVA Stereo Display Toolkit

    Science.gov (United States)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that accomplishes simply the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.

  16. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We compare hand-crafted custom code to polylithic and monolithic toolkit-based solutions. Polylithic toolkits follow a design philosophy similar to 3D scene graphs supported by toolkits including Java3D and OpenInventor...

  17. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system toolkits (such as EPICS and TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks and visualization of the data are discussed in the paper. The presented research consists of software analysis for realization of interaction between all levels of the Virtual Accelerator and some samples of middleware implementation. A set of servers and clusters at St. Petersburg State University forms the infrastructure of the computing environment for Virtual Accelerator design. Usage of component-oriented technology for realization of Virtual Accelerator levels interaction is proposed. The article concludes with an overview and justification of the choice of technologies that will be used for design and implementation of the Virtual Accelerator. (authors)

  18. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone app has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real-time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry, are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  19. Monte Carlo application based on GEANT4 toolkit to simulate a laser–plasma electron beam line for radiobiological studies

    Energy Technology Data Exchange (ETDEWEB)

    Lamia, D., E-mail: debora.lamia@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Russo, G., E-mail: giorgio.russo@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Casarino, C.; Gagliano, L.; Candiano, G.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Labate, L. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); Gizzi, L.A. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Gilardi, M.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR, Segrate (Italy); University of Milano-Bicocca, Milano (Italy)

    2015-06-21

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, prospectively, for medical applications. - Highlights: • Development of a Monte Carlo application based on the GEANT4 toolkit. • Experimental measurements carried out with a laser-driven acceleration system. • Validation of the Geant4 application by comparing experimental data with simulations. • Dosimetric characterization of the acceleration system.

  20. Outage Risk Assessment and Management (ORAM) thermal-hydraulics toolkit

    International Nuclear Information System (INIS)

    Denny, V.E.; Wassel, A.T.; Issacci, F.; Pal Kalra, S.

    2004-01-01

    A PC-based thermal-hydraulic toolkit for use in support of outage optimization, management and risk assessment has been developed. This mechanistic toolkit incorporates simple models of key thermal-hydraulic processes which occur during an outage, such as recovery from or mitigation of outage upsets; this includes heat-up of water pools following loss of shutdown cooling, inadvertent drain down of the RCS, boiloff of coolant inventory, heatup of the uncovered core, and reflux cooling. This paper provides a list of key toolkit elements, briefly describes the technical basis and presents illustrative results for RCS transient behavior during reflux cooling, peak clad temperatures for an uncovered core and RCS response to loss of shutdown cooling. (author)

  1. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.

  2. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    Science.gov (United States)

    Wagh, Aditi; Wilensky, Uri

    2018-04-01

    Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To make this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit with exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities, including the time students took to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.
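
    A minimal agent-based evolution model of the kind EvoBuild targets looks like the following sketch (Python rather than EvoBuild's NetLogo; the trait, survival rule, and mutation size are invented for illustration).

        import random

        # Minimal agent-based evolution: agents carry one heritable trait,
        # survival probability depends on it, and offspring mutate slightly.
        rng = random.Random(42)
        population = [rng.uniform(0.0, 1.0) for _ in range(200)]   # trait values

        for generation in range(50):
            # selection: agents with higher trait are more likely to survive
            survivors = [t for t in population if rng.random() < 0.5 + 0.4 * t]
            # reproduction with mutation, back up to the carrying capacity
            population = [max(0.0, min(1.0,
                              rng.choice(survivors) + rng.gauss(0, 0.02)))
                          for _ in range(200)]

        mean_trait = sum(population) / len(population)
        print(f"mean trait after 50 generations: {mean_trait:.3f}")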

  3. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit comprises user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, additional tools to aid QC (sequence format converter and trimming tools), and analysis tools (statistics tools). A variety of options have been provided to facilitate QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.
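
    The central filtering step, keeping reads whose mean Phred quality meets a cutoff, can be sketched as follows (Python rather than the toolkit's Perl; Phred+33 encoding and the cutoff are assumptions).

        # Keep FASTQ reads whose mean Phred quality meets a cutoff.
        # Assumes Illumina-style 4-line records with Phred+33 quality encoding.
        def mean_quality(qual_line, offset=33):
            return sum(ord(c) - offset for c in qual_line) / len(qual_line)

        def filter_fastq(in_path, out_path, min_mean_q=20):
            kept = total = 0
            with open(in_path) as fin, open(out_path, "w") as fout:
                while True:
                    record = [fin.readline() for _ in range(4)]  # 4 lines/read
                    if not record[0]:
                        break
                    total += 1
                    if mean_quality(record[3].rstrip("\n")) >= min_mean_q:
                        fout.writelines(record)
                        kept += 1
            print(f"kept {kept}/{total} reads")

        # filter_fastq("reads.fastq", "reads.hq.fastq", min_mean_q=20)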

  4. Development of a Human Physiologically Based Pharmacokinetic (PBPK) Toolkit for Environmental Pollutants

    Directory of Open Access Journals (Sweden)

    Patricia Ruiz

    2011-10-01

    Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that can provide useful information for the protection of the public.
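
    The kind of ODE system that full PBPK models scale up to many tissue compartments can be illustrated with a one-compartment sketch; the parameters below are generic textbook-style values, not ones from the ATSDR toolkit.

        # One-compartment pharmacokinetics with first-order elimination:
        # dC/dt = -k_el * C, C(0) = dose / V_d, integrated with Euler steps.
        def simulate(dose_mg=100.0, v_d=42.0, k_el=0.1, dt=0.1, t_end=24.0):
            c = dose_mg / v_d        # initial plasma concentration, mg/L
            t = 0.0
            series = [(t, c)]
            while t < t_end:
                c += dt * (-k_el * c)
                t += dt
                series.append((t, c))
            return series

        for t, c in simulate()[::60]:        # print every 6 hours
            print(f"t = {t:5.1f} h   C = {c:6.3f} mg/L")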

  5. Multimethod evaluation of the VA's peer-to-peer Toolkit for patient-centered medical home implementation.

    Science.gov (United States)

    Luck, Jeff; Bowman, Candice; York, Laura; Midboe, Amanda; Taylor, Thomas; Gale, Randall; Asch, Steven

    2014-07-01

    Effective implementation of the patient-centered medical home (PCMH) in primary care practices requires training and other resources, such as online toolkits, to share strategies and materials. The Veterans Health Administration (VA) developed an online Toolkit of user-sourced tools to support teams implementing its Patient Aligned Care Team (PACT) medical home model. To present findings from an evaluation of the PACT Toolkit, including use, variation across facilities, effect of social marketing, and factors influencing use. The Toolkit is an online repository of ready-to-use tools created by VA clinic staff that physicians, nurses, and other team members may share, download, and adopt in order to more effectively implement PCMH principles and improve local performance on VA metrics. Multimethod evaluation using: (1) website usage analytics, (2) an online survey of the PACT community of practice's use of the Toolkit, and (3) key informant interviews. Survey respondents were PACT team members and coaches (n = 544) at 136 VA facilities. Interview respondents were Toolkit users and non-users (n = 32). For survey data, multivariable logistic models were used to predict Toolkit awareness and use. Interviews and open-text survey comments were coded using a "common themes" framework. The Consolidated Framework for Implementation Research (CFIR) guided data collection and analyses. The Toolkit was used by 6,745 staff in the first 19 months of availability. Among members of the target audience, 80 % had heard of the Toolkit, and of those, 70 % had visited the website. Tools had been implemented at 65 % of facilities. Qualitative findings revealed a range of user perspectives from enthusiastic support to lack of sufficient time to browse the Toolkit. An online Toolkit to support PCMH implementation was used at VA facilities nationwide. Other complex health care organizations may benefit from adopting similar online peer-to-peer resource libraries.

  6. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit is a toolkit of research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims to communicate best practices in conserving biodiversity and sustaining ecosystem services to potential users and to promote the wise use of aquatic resources, improve livelihoods and enhance policy information.

  7. Practical computational toolkits for dendrimers and dendrons structure design

    Science.gov (United States)

    Martinho, Nuno; Silva, Liana C.; Florindo, Helena F.; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty in using in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and for storing the structures in databases. The second toolkit assembles complex-topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease of use in prototyping dendrimer structures, and the second toolkit proved especially relevant for dendrimers of high complexity and size.
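
    To make the assembly idea concrete, here is a minimal RDKit sketch in the spirit of the first toolkit: monomers given as SMILES and a coupling step given as a SMARTS reaction. The monomers and the amide-coupling reaction are illustrative assumptions, not the published toolkit's actual inputs.

    ```python
    # Sketch only: grow a branched structure by repeatedly applying a
    # SMARTS-encoded coupling reaction to SMILES-defined monomers.
    from rdkit import Chem
    from rdkit.Chem import AllChem

    # Hypothetical monomers: a triamine core and a diacid-chloride branching unit.
    core = Chem.MolFromSmiles("NCC(CN)CN")
    branch = Chem.MolFromSmiles("O=C(Cl)CCC(=O)Cl")

    # SMARTS reaction: primary amine + acid chloride -> amide (one coupling step).
    rxn = AllChem.ReactionFromSmarts(
        "[NX3;H2:1].[CX3:2](=[OX1:3])[Cl]>>[NX3;H1:1][CX3:2]=[OX1:3]"
    )

    product = core
    for _ in range(2):  # each iteration couples one branching unit onto an amine
        outcomes = rxn.RunReactants((product, branch))
        if not outcomes:
            break
        product = outcomes[0][0]
        Chem.SanitizeMol(product)

    print(Chem.MolToSmiles(product))  # 2D structure as SMILES, no 3D coordinates
    ```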

  8. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    Directory of Open Access Journals (Sweden)

    Hutchison Geoffrey R

    2008-12-01

    Background: Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results: We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion: By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit.
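
    The following sketch shows the kind of cross-toolkit call Cinfony enables. Module and method names follow the Cinfony paper (pybel, rdk, cdk; readstring, calcfp, calcdesc); treat them as assumptions if your installed version differs.

    ```python
    # Sketch of Cinfony's common interface across three toolkits.
    from cinfony import pybel, rdk, cdk

    smiles = "CC(=O)Nc1ccc(O)cc1"          # paracetamol (illustrative input)
    obmol = pybel.readstring("smi", smiles)  # parsed by OpenBabel
    rdmol = rdk.readstring("smi", smiles)    # parsed by the RDKit
    cdmol = cdk.readstring("smi", smiles)    # parsed by the CDK

    print(obmol.molwt)         # molecular weight via OpenBabel
    print(rdmol.calcfp())      # fingerprint via the RDKit
    print(cdmol.calcdesc())    # descriptor dictionary via the CDK
    ```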

  9. Supporting LGBT Communities: Police ToolKit

    OpenAIRE

    Vasquez del Aguila, Ernesto; Franey, Paul

    2013-01-01

    This toolkit provides police forces with practical educational tools, which can be used as part of a comprehensive LGBT strategy centred on diversity, equality, and non-discrimination. These materials are based on lessons learned through real life policing experiences with LGBT persons. The Toolkit is divided into seven scenarios where police awareness of LGBT issues has been identified as important. The toolkit employs a practical, scenario-based, problem-solving approach to help police offi...

  10. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, B. M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCaa, J. [3TIER by Vaisala, Seattle, WA (United States)

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills the requirements of next-generation wind integration analyses and wind power planning: it is a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013. The toolkit comprises a wind resource data set, a wind forecast data set, and a wind power production and forecast data set, all derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for more than 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  11. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    We describe Jazz (a polylithic toolkit) and Piccolo (a monolithic toolkit), each of which we built to support interactive 2D structured graphics applications in general, and Zoomable User Interface applications in particular...

  12. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    Science.gov (United States)

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were amended with profile measurements made by a profilometer.

  13. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    2011-09-06

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace. Created: 9/6/2011 by Office of Infectious Diseases, Office of the Director (OD). Date Released: 9/7/2011.

  14. A flexible open-source toolkit for lava flow simulations

    Science.gov (United States)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long-term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, inform their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001), which can be constrained to favour concentrated or dispersed flow fields. Moreover, the toolkit allows including a corrective factor so that the lava can overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function, or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are used here to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match between the simulations and the real lava flows.
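
    To illustrate the probabilistic steepest-slope routing idea (and the corrective factor for small obstacles), here is a self-contained sketch on a synthetic DEM. It is an assumption-laden toy, not the published toolkit: the corrective factor h, step cap and DEM are all made up.

    ```python
    # Toy probabilistic steepest-slope lava path on a 2D DEM (sketch only).
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_path(dem, start, h=2.0, max_steps=500):
        """Route one lava path over a 2D DEM array from `start` (row, col)."""
        path = [start]
        r, c = start
        for _ in range(max_steps):
            drops, cells = [], []
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                        # Raise the current cell by h so the flow can climb
                        # over small topographic obstacles or pits.
                        drop = dem[r, c] + h - dem[rr, cc]
                        if drop > 0:
                            drops.append(drop)
                            cells.append((rr, cc))
            if not cells:            # trapped in a pit deeper than h: flow stops
                break
            p = np.asarray(drops) / np.sum(drops)   # steeper -> more likely
            r, c = cells[rng.choice(len(cells), p=p)]
            path.append((r, c))
        return path

    dem = np.add.outer(np.linspace(50, 0, 60), np.linspace(10, 0, 80))  # synthetic slope
    print(simulate_path(dem, (0, 0))[:5])
    ```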

  15. Local Safety Toolkit: Enabling safe communities of opportunity

    CSIR Research Space (South Africa)

    Holtmann, B

    2010-08-31

    …remain inadequate to achieve safety. The Local Safety Toolkit supports a strategy for a Safe South Africa through the implementation of a model for a Safe Community of Opportunity. The model is the outcome of work undertaken over the course of the past...

  16. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    International Nuclear Information System (INIS)

    Rit, S; Vila Oliva, M; Sarrut, D; Brousmiche, S; Labarbe, R; Sharp, G C

    2014-01-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  17. Wind Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    The Wind Integration National Dataset (WIND) Toolkit is an update and expansion of the Eastern Wind Integration Data Set and Western Wind Integration Data Set. It supports the next generation of wind integration studies.

  18. PS1-29: Resources to Facilitate Multi-site Collaboration: the PRIMER Research Toolkit

    Science.gov (United States)

    Greene, Sarah; Thompson, Ella; Baldwin, Laura-Mae; Neale, Anne Victoria; Dolor, Rowena

    2010-01-01

    Background and Aims: The national research enterprise has typically functioned in a decentralized fashion, resulting in duplicative or undocumented processes, impeding not only the pace of research, but also the diffusion of established best practices. To remedy this, many long-standing networks have begun capturing and documenting proven strategies to streamline and standardize various aspects of the research process. The project, “Partnership-driven Resources to IMprove and Enhance Research” (PRIMER), was funded through the Clinical and Translational Science Awards (CTSA) initiative to leverage the collective expertise of two networks: the HMO Research Network and Practice Based Research Networks (PBRNs). Each network has a shared goal of propagating research resources and best practices. Methods: We created and distributed an online survey to 92 CTSA and PBRN representatives in March 2009 to define critical needs and existing resources that could inform a resource repository. The survey identified barriers and benefits to forming research partnerships, and assessed the perceived utility of various tools that could accelerate the research process. The study team identified, reviewed and organized tools based on the typical research trajectory from design to dissemination. Results: Fifty-five of 92 invitees (59%) completed the survey. Respondents rated the ability to conduct community-relevant research through true academic-community partnerships as the top-rated benefit of multi-site research, followed by the opportunity to accelerate translation of research into practice. The top two perceived barriers to multi-site research were ‘funding opportunities are not adequate’ (e.g., too few, not enough to support true collaborations) and ‘lack of research infrastructure to support [all] partners’ (e.g., no IT support, IRB, dedicated research staff). Respondents’ ratings of the utility of various tools and templates were used to guide development of an online...

  19. A qualitative study of clinic and community member perspectives on intervention toolkits: "Unless the toolkit is used it won't help solve the problem".

    Science.gov (United States)

    Davis, Melinda M; Howk, Sonya; Spurlock, Margaret; McGinnis, Paul B; Cohen, Deborah J; Fagnan, Lyle J

    2017-07-18

    Intervention toolkits are common products of grant-funded research in public health and primary care settings. Toolkits are designed to address the knowledge translation gap by speeding implementation and dissemination of research into practice. However, few studies describe characteristics of effective intervention toolkits and their implementation. Therefore, we conducted this study to explore what clinic and community-based users want in intervention toolkits and to identify the factors that support application in practice. In this qualitative descriptive study we conducted focus groups and interviews with a purposive sample of community health coalition members, public health experts, and primary care professionals between November 2010 and January 2012. The transdisciplinary research team used thematic analysis to identify themes and a cross-case comparative analysis to explore variation by participant role and toolkit experience. Ninety-six participants representing primary care (n = 54, 56%) and community settings (n = 42, 44%) participated in 18 sessions (13 focus groups, five key informant interviews). Participants ranged from those naïve through expert in toolkit development; many reported limited application of toolkits in actual practice. Participants wanted toolkits targeted at the right audience and demonstrated to be effective. Well-organized toolkits, often with a quick-start guide, with tools that were easy to tailor and apply, were desired. Irrespective of perceived quality, participants experienced with practice change emphasized that leadership, staff buy-in, and facilitative support were essential for intervention toolkits to be translated into changes in clinic or public health practice. Given the emphasis on toolkits in supporting implementation and dissemination of research and clinical guidelines, studies are warranted to determine when and how toolkits are used. Funders, policy makers, researchers, and leaders in primary care and...

  20. Integrated System Health Management Development Toolkit

    Science.gov (United States)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  1. Computational Chemistry Toolkit for Energetic Materials Design

    Science.gov (United States)

    2006-11-01

    …industry are aggressively engaged in efforts to develop multiscale modeling and simulation methodologies to model and analyze complex phenomena across … energetic materials design. It is hoped that this toolkit will evolve into a collection of well-integrated multiscale modeling methodologies… [Table residue retained for reference: experimental, theoretical, and present-work values for 1,5-diamino-4-methyltetrazolium nitrate (8.4, 41.7, 47.5) and the corresponding azide (138.1, 161.6).]

  2. The development of an artificial organic networks toolkit for LabVIEW.

    Science.gov (United States)

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent usage of other software and devices, built-in event-driven programming for user interfaces) to make it simple for the end user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies, one for engineering purposes (i.e., sensor characterization) and one for chemistry applications (i.e., a blood-brain barrier partitioning data model), to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  3. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2 km and at a 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
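
    The published validation code is written in R; for readers who want to see the named error metrics concretely, here is a small Python transcription (a sketch, not the released code).

    ```python
    # Wind-speed validation metrics: bias, RMSE, centered RMSE, MAE, percent error.
    import numpy as np

    def error_metrics(model, obs):
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        bias = np.mean(model - obs)
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        # Centered RMSE removes each series' mean before squaring the residuals.
        crmse = np.sqrt(np.mean(((model - model.mean()) - (obs - obs.mean())) ** 2))
        mae = np.mean(np.abs(model - obs))
        pct = 100.0 * np.mean((model - obs) / obs)
        return {"bias": bias, "RMSE": rmse, "cRMSE": crmse, "MAE": mae, "percent": pct}

    print(error_metrics([6.1, 7.4, 5.2], [5.8, 7.9, 5.0]))
    ```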

  4. Network Science Research Laboratory (NSRL) Discrete Event Toolkit

    Science.gov (United States)

    2016-01-01

    ARL-TR-7579, January 2016, US Army Research Laboratory: Network Science Research Laboratory (NSRL) Discrete Event Toolkit, by Theron Trout and Andrew J Toth, Computational and Information Sciences Directorate, ARL.

  5. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  6. Using features of local densities, statistics and HMM toolkit (HTK) for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model (HMM) Toolkit (HTK) without explicit segmentation. The first phase is preprocessing, where the data is introduced into the system after quality enhancements. Then, a set of characteristics (features of local densities and feature statistics) is extracted using the technique of sliding windows. Subsequently, the resulting feature vectors are injected into the Hidden Markov Model Toolkit (HTK). The simple database “Arabic-Numbers” and IFN/ENIT are used to evaluate the performance of this system. Keywords: Hidden Markov Model (HMM) Toolkit (HTK), Sliding windows
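
    As an illustration of the sliding-window step, here is a small sketch; the window geometry and the specific density features are assumptions for demonstration, not the paper's exact parameterisation.

    ```python
    # Sliding-window local-density features over a binarised text-line image.
    import numpy as np

    def sliding_window_features(img, width=8, step=4):
        """img: 2D array of 0/1 ink pixels; returns one feature vector per window."""
        feats = []
        for x in range(0, img.shape[1] - width + 1, step):
            win = img[:, x:x + width]
            density = win.mean()                      # overall ink density
            bands = np.array_split(win, 4, axis=0)    # densities of 4 horizontal bands
            feats.append([density] + [b.mean() for b in bands])
        return np.array(feats)  # feature sequence fed to the HMMs (e.g., via HTK)

    line = (np.random.default_rng(0).random((32, 64)) > 0.8).astype(int)  # fake line
    print(sliding_window_features(line).shape)
    ```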

  7. Measuring employee satisfaction in new offices - the WODI toolkit

    NARCIS (Netherlands)

    Maarleveld, M.; Volker, L.; van der Voordt, Theo

    2009-01-01

    Purpose: This paper presents a toolkit to measure employee satisfaction and perceived labour productivity as affected by different workplace strategies. The toolkit is illustrated by a case study of the Dutch Revenue Service.
    Methodology: The toolkit has been developed by a review of...

  8. The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results

    Science.gov (United States)

    Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee

    2016-01-01

    Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. In this poster, I describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well. This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.
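
    A minimal session of the kind SPLAT supports might look as follows; the function names are taken from the project's documentation at the GitHub link above and should be treated as assumptions, since the toolkit was under active development at the time.

    ```python
    # Sketch: fetch a library spectrum by short name and classify it against
    # spectral standards (names assumed from the SPLAT documentation).
    import splat

    spectra = splat.getSpectrum(shortname='0415-0935')
    if spectra:
        sp = spectra[0]
        spt, spt_unc = splat.classifyByStandard(sp)   # spectral type and uncertainty
        print(sp.name, spt, spt_unc)
    ```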

  9. Transportation librarian's toolkit

    Science.gov (United States)

    2007-12-01

    The Transportation Librarians Toolkit is a product of the Transportation Library Connectivity pooled fund study, TPF- 5(105), a collaborative, grass-roots effort by transportation libraries to enhance information accessibility and professional expert...

  10. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to less known, but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, Tiku. Thanks to the component-based design and the usage of the standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. This Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit and describe the code validation
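
    The Toolkit itself is object-oriented C++; purely to make the flavour of these goodness-of-fit tests concrete, here is the equivalent two-sample comparison done with SciPy as a stand-in (not the Toolkit's own API).

    ```python
    # Two-sample goodness-of-fit testing: Kolmogorov-Smirnov and Anderson-Darling.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    a = rng.normal(0.0, 1.0, 500)   # reference sample
    b = rng.normal(0.1, 1.1, 500)   # test sample

    ks = stats.ks_2samp(a, b)              # Kolmogorov-Smirnov
    ad = stats.anderson_ksamp([a, b])      # Anderson-Darling (k-sample form)
    print(f"KS: D={ks.statistic:.3f} p={ks.pvalue:.3f}")
    print(f"AD: A2={ad.statistic:.3f} p={ad.significance_level:.3f}")
    ```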

  11. A Modular Toolkit for Distributed Interactions

    Directory of Open Access Journals (Sweden)

    Julien Lange

    2011-10-01

    We discuss the design, architecture, and implementation of a toolkit which supports some theories for distributed interactions. The main design principles of our architecture are flexibility and modularity. Our main goal is to provide an easily extensible workbench to encompass current algorithms and incorporate future developments of the theories. With the help of some examples, we illustrate the main features of our toolkit.

  12. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
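
    For readers unfamiliar with the underlying library, a minimal PyRosetta session of the kind the GUI drives might look like this (a sketch; it assumes a working PyRosetta installation, and "ref2015" is the standard score-function name).

    ```python
    # Build a toy pose and score it with a standard Rosetta score function.
    import pyrosetta

    pyrosetta.init()                                    # start the Rosetta runtime
    pose = pyrosetta.pose_from_sequence("ACDEFGHIKL")   # toy 10-residue pose
    scorefxn = pyrosetta.create_score_function("ref2015")
    print("total score:", scorefxn(pose))               # evaluate the pose
    ```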

  13. Integrated Systems Health Management (ISHM) Toolkit

    Science.gov (United States)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  14. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Science.gov (United States)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability issues have affected the take-up of this programming model approach. Significant progress has been made in hardware and software technologies; as a result, the performance of parallel programs with compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  15. Workshop Engages PCs in Accelerator Controls

    International Nuclear Information System (INIS)

    Matthew Bickley

    2006-01-01

    To discuss the rapidly growing and changing use of personal computers (PCs) in accelerator control systems, 80 accelerator controls specialists from 26 institutions in North America, Europe and Asia attended the 6th International Workshop on Personal Computers and Particle Accelerator Controls, PCaPAC2006, held October 24-27 at Jefferson Lab in Newport News, Virginia. PCs have become increasingly applicable to the control of accelerators as their computing capacities have increased exponentially over the last 10 years. Capabilities that once required the power available only from expensive, small-market systems offered by DEC, Sun or IBM can now be obtained with commodity hardware offered by many vendors. The price/performance ratio presented by any standard PC makes a compelling case for using PC hardware in accelerator controls wherever possible. The PCaPAC meeting underscored the importance of collaborative control system development. Several talks focused on additions to three such systems: TINE, TANGO and EPICS. The diverse contributions to these toolkits, both in content and source, demonstrate the power of leveraged software development across a number of facilities. TINE originated in DESY's desire to give users a unified software bus above disparate underlying platforms. TINE discussions at PCaPAC centered on the toolkit's interface layers, including address redirection and integration with other control systems. TANGO has been a collaborative effort from its inception. Based on CORBA, this open-source controls toolkit is a registered project in the SourceForge system. The workshop TANGO presentation discussed contributions from four TANGO institutions, and mentioned a broad range of new tools, from user interface applications to code generators and database integration software. EPICS, which was started at LANL in the 1980s, includes contributions from dozens of institutions around the world. EPICS-related PCaPAC discussions included virtual machines at...

  16. Microsoft BizTalk ESB Toolkit 2.1

    CERN Document Server

    Benito, Andrés Del Río

    2013-01-01

    A practical guide to the architecture and features that make up the services and components of the ESB Toolkit. This book is for experienced BizTalk developers, administrators, and architects, as well as IT managers and BizTalk business analysts. Knowledge of and experience with the Toolkit are not a requirement.

  17. X-CSIT: a toolkit for simulating 2D pixel detectors

    Science.gov (United States)

    Joy, A.; Wing, M.; Hauf, S.; Kuster, M.; Rüter, T.

    2015-04-01

    A new, modular toolkit for creating simulations of 2D X-ray pixel detectors, X-CSIT (X-ray Camera SImulation Toolkit), is being developed. The toolkit uses three sequential simulations of detector processes which model photon interactions, electron charge cloud spreading with a high charge density plasma model and common electronic components used in detector readout. In addition, because of the wide variety in pixel detector design, X-CSIT has been designed as a modular platform so that existing functions can be modified or additional functionality added if the specific design of a detector demands it. X-CSIT will be used to create simulations of the detectors at the European XFEL, including three bespoke 2D detectors: the Adaptive Gain Integrating Pixel Detector (AGIPD), Large Pixel Detector (LPD) and DePFET Sensor with Signal Compression (DSSC). These simulations will be used by the detector group at the European XFEL for detector characterisation and calibration. For this purpose, X-CSIT has been integrated into the European XFEL's software framework, Karabo. This will further make it available to users to aid with the planning of experiments and analysis of data. In addition, X-CSIT will be released as a standalone, open source version for other users, collaborations and groups intending to create simulations of their own detectors.

  18. An Industrial Physics Toolkit

    Science.gov (United States)

    Cummings, Bill

    2004-03-01

    Physicists possess many skills highly valued in industrial companies. However, with the exception of a decreasing number of positions in long range research at large companies, job openings in industry rarely say "Physicist Required." One key to a successful industrial career is to know what subset of your physics skills is most highly valued by a given industry and to continue to build these skills while working. This combination of skills from both academic and industrial experience becomes your "Industrial Physics Toolkit" and is a transferable resource when you change positions or companies. This presentation will describe how one builds and sells your own "Industrial Physics Toolkit" using concrete examples from the speaker's industrial experience.

  19. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing

    DEFF Research Database (Denmark)

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai

    2018-01-01

    …engineering approaches for boosting known and discovering novel natural products. In order to facilitate genome editing for actinomycetes, we developed a CRISPR-Cas9 toolkit with high efficiency for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random-size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector construction process. The application of this toolkit was successfully demonstrated by perturbation of the genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes.
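
    The spacer-identification step can be pictured with a toy scan for SpCas9 'NGG' PAM sites; this is an illustrative sketch only, not the toolkit's actual software, and the target sequence is made up.

    ```python
    # Toy spacer finder: report the 20-nt spacer immediately 5' of each NGG PAM.
    def find_spacers(seq, spacer_len=20):
        seq = seq.upper()
        hits = []
        for i in range(spacer_len, len(seq) - 2):
            if seq[i + 1:i + 3] == "GG":          # N-G-G PAM occupies i..i+2
                hits.append((i - spacer_len, seq[i - spacer_len:i]))
        return hits

    target = "ATGCTGACCGGTTACGGAGCTTAGCAGGTCCGGATTACCGG"  # made-up sequence
    for pos, spacer in find_spacers(target):
        print(pos, spacer)
    ```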

  20. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    Science.gov (United States)

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
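
    A compact session following the pattern in pypet's documentation looks like this; the trajectory and file names are placeholders.

    ```python
    # Explore a 2x2 parameter grid; parameters and results land in one HDF5 file.
    from pypet import Environment, cartesian_product

    def multiply(traj):
        # Each run sees one point of the explored parameter space.
        traj.f_add_result('z', traj.x * traj.y, comment='product of x and y')

    env = Environment(trajectory='example', filename='example.hdf5')
    traj = env.traj
    traj.f_add_parameter('x', 1.0, comment='first factor')
    traj.f_add_parameter('y', 1.0, comment='second factor')
    traj.f_explore(cartesian_product({'x': [1.0, 2.0], 'y': [4.0, 5.0]}))
    env.run(multiply)   # four runs, one per explored parameter combination
    ```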

  1. VIDE: The Void IDentification and Examination toolkit

    Science.gov (United States)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at http://bitbucket.org/cosmicvoids/vide_public and http://www.cosmicvoids.net.

  2. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  3. Commercial Building Energy Saver: An energy retrofit analysis toolkit

    International Nuclear Information System (INIS)

    Hong, Tianzhen; Piette, Mary Ann; Chen, Yixing; Lee, Sang Hoon; Taylor-Lange, Sarah C.; Zhang, Rongpeng; Sun, Kaiyu; Price, Phillip

    2015-01-01

    Highlights: • Commercial Building Energy Saver is a powerful toolkit for energy retrofit analysis. • CBES provides benchmarking, load shape analysis, and model-based retrofit assessment. • CBES covers 7 building types, 6 vintages, 16 climates, and 100 energy measures. • CBES includes a web app, API, and a database of energy efficiency performance. • CBES API can be extended and integrated with third party energy software tools. - Abstract: Small commercial buildings in the United States consume 47% of the total primary energy of the buildings sector. Retrofitting small and medium commercial buildings poses a huge challenge for owners because they usually lack the expertise and resources to identify and evaluate cost-effective energy retrofit strategies. This paper presents the Commercial Building Energy Saver (CBES), an energy retrofit analysis toolkit, which calculates the energy use of a building, identifies and evaluates retrofit measures in terms of energy savings, energy cost savings and payback. The CBES Toolkit includes a web app (APP) for end users and the CBES Application Programming Interface (API) for integrating CBES with other energy software tools. The toolkit provides a rich set of features including: (1) Energy Benchmarking providing an Energy Star score, (2) Load Shape Analysis to identify potential building operation improvements, (3) Preliminary Retrofit Analysis which uses a custom developed pre-simulated database and, (4) Detailed Retrofit Analysis which utilizes real-time EnergyPlus simulations. CBES includes 100 configurable energy conservation measures (ECMs) that encompass IAQ, technical performance and cost data, for assessing 7 different prototype buildings in 16 climate zones in California and 6 vintages. A case study of a small office building demonstrates the use of the toolkit for retrofit analysis. The development of CBES provides a new contribution to the field by providing a straightforward and uncomplicated decision

  4. PODIO: An Event-Data-Model Toolkit for High Energy Physics Experiments

    Science.gov (United States)

    Gaede, F.; Hegner, B.; Mato, P.

    2017-10-01

    PODIO is a C++ library that supports the automatic creation of event data models (EDMs) and efficient I/O code for HEP experiments. It is developed as a new EDM Toolkit for future particle physics experiments in the context of the AIDA2020 EU programme. Experience from LHC and the linear collider community shows that existing solutions partly suffer from overly complex data models with deep object-hierarchies or unfavorable I/O performance. The PODIO project was created in order to address these problems. PODIO is based on the idea of employing plain-old-data (POD) data structures wherever possible, while avoiding deep object-hierarchies and virtual inheritance. At the same time it provides the necessary high-level interface towards the developer physicist, such as the support for inter-object relations and automatic memory-management, as well as a Python interface. To simplify the creation of efficient data models PODIO employs code generation from a simple yaml-based markup language. In addition, it was developed with concurrency in mind in order to support the use of modern CPU features, for example giving basic support for vectorization techniques.

  5. Electromagnetic modeling in accelerator designs

    International Nuclear Information System (INIS)

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator designs. Traditionally, electromagnetic modeling of accelerators has been limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. It is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas will be described.

  6. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    …of such a toolkit facilitates and increases productivity during subsequent tool development: “develop once and use many times”. The concept of an extendible toolkit lends itself naturally to the open-source philosophy, where the toolkit user base develops...

  7. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    Science.gov (United States)

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding

  8. Implementing a user-driven online quality improvement toolkit for cancer care.

    Science.gov (United States)

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  9. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    Science.gov (United States)

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. Access the

  10. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    Directory of Open Access Journals (Sweden)

    Morley Chris

    2008-03-01

    Background: Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results: Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion: Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
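
    In practice the module reads like the following short sketch (note that in newer OpenBabel releases the import moved under the openbabel package).

    ```python
    # Common Pybel tasks: parse SMILES, compute properties, write a file.
    import pybel  # in newer OpenBabel releases: from openbabel import pybel

    mol = pybel.readstring("smi", "c1ccccc1O")     # phenol from SMILES
    print(mol.molwt)                               # molecular weight
    print(mol.calcfp().bits)                       # fingerprint bits that are set
    mol.make3D()                                   # generate 3D coordinates
    mol.write("mol", "phenol.mol", overwrite=True)

    # Iterating over every molecule in a file is a one-liner:
    # for m in pybel.readfile("sdf", "library.sdf"): print(m.title)
    ```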

  11. An open source toolkit for medical imaging de-identification

    International Nuclear Information System (INIS)

    Rodriguez Gonzalez, David; Carpenter, Trevor; Wardlaw, Joanna; Hemert, Jano I. van

    2010-01-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users. (orig.)
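
    The paper describes its own toolkit rather than a public API, so as a stand-in illustration of the basic idea, here is the equivalent minimal de-identification done with pydicom; the tag selection is illustrative, not a complete policy.

    ```python
    # Minimal DICOM de-identification with pydicom (stand-in illustration).
    import pydicom

    ds = pydicom.dcmread("input.dcm")
    ds.remove_private_tags()                      # drop vendor-private elements
    for tag in ("PatientName", "PatientID", "PatientBirthDate"):
        if tag in ds:
            setattr(ds, tag, "")                  # blank direct identifiers
    ds.add_new((0x0012, 0x0062), "CS", "YES")     # Patient Identity Removed flag
    ds.save_as("anonymised.dcm")
    ```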

  12. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Directory of Open Access Journals (Sweden)

    Robert Meyer

    2016-08-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  13. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Science.gov (United States)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  14. A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs

    Directory of Open Access Journals (Sweden)

    Brian R. Sturtevant

    2007-12-01

    To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between scales) and horizontal (i.e., within scales) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups...

  15. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    Science.gov (United States)

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals, and between 35% and 60% of new graduates change workplace during their first year. Critical to organizational success, first-line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first-line nursing managers. The toolkit is built on a framework whose conceptual building blocks are planning tools, manager interventions, retention and recruitment, and professional practice models. Developing the toolkit involved reviewing the literature for best practices in nursing human resource planning, collecting data through a mixed-method approach that included a survey and extensive interviews of managers, and completing a comprehensive scan of human resource practices in the participating organizations. This paper provides an overview of the process used to develop the toolkit, a description of the toolkit contents, and a reflection on the outcomes of the project.

  16. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  17. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT

    Directory of Open Access Journals (Sweden)

    Mair Frances

    2010-10-01

    Full Text Available Abstract Background The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well documented problems with implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT) which aims to summarise and synthesise new and existing research on implementation of e-Health initiatives, and present it to senior managers in a user-friendly format. Results The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to implementation of e-Health initiatives with qualitative data derived from interviews of "implementers", that is, people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised, and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit - a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls Conclusions The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled, and to determine its predictive potential for future implementations.

  18. Application experiences with the Globus toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids" [14]. The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high-throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  19. GENFIT - a generic track-fitting toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Johannes [Technische Universitaet Muenchen (Germany); Schlueter, Tobias [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2014-07-01

    GENFIT is an experiment-independent track-fitting toolkit, which combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation and alignment.
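
    GENFIT itself is a C++ framework; as a language-neutral illustration of the core operation inside any Kalman track fitter, here is a generic linear Kalman measurement update in Python (the two-component track state and the single hit are invented for the example):

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """x: state, P: state covariance, z: measurement, H: projection, R: meas. covariance."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - H @ x)          # state updated toward the measurement
    P_new = (np.eye(len(x)) - K @ H) @ P # covariance shrinks after the update
    return x_new, P_new

x = np.array([0.0, 0.0])                 # e.g. (position, slope) track state
P = np.eye(2)                            # initial uncertainty
z = np.array([0.12])                     # one measured hit position
H = np.array([[1.0, 0.0]])               # the hit measures position only
R = np.array([[0.01 ** 2]])              # detector resolution squared
x, P = kalman_update(x, P, z, H, R)
```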

  20. Top ten accelerating cosmological models

    International Nuclear Information System (INIS)

    Szydlowski, Marek; Kurek, Aleksandra; Krawiec, Adam

    2006-01-01

    Recent astronomical observations indicate that the Universe is presently almost flat and undergoing a period of accelerated expansion. Within Einstein's general relativity, all these observations can be explained by the hypothesis of a dark energy component in addition to cold dark matter (CDM). Because the nature of this dark energy is unknown, alternative scenarios have been proposed to explain the currently accelerating Universe. The key point of these scenarios is to modify the standard FRW equation instead of invoking a mysterious dark energy component. The standard approach to constraining model parameters, based on the likelihood method, gives a best-fit model and confidence ranges for those parameters. We always choose, somewhat arbitrarily, the set of parameters that defines the model we compare with observational data. Because introducing new parameters generically improves the fit to a data set, the problem arises of eliminating model parameters that play an insignificant role. The Bayesian information criterion (BIC) for model selection identifies the set of parameters that should be incorporated in the model. We divide the class of all accelerating cosmological models into two groups according to the two types of explanation for the acceleration of the Universe. The Bayesian framework of model selection is then used to determine the set of parameters that gives the preferred fit to the SNIa data. We find a few flat cosmological models that can be recommended by the Bayes factor. We show that models with dark energy as a new fluid are favoured over models featuring a modified FRW equation.
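
    The BIC referred to here is the standard criterion BIC = -2 ln L_max + k ln N for a model with k free parameters fitted to N data points; the sketch below shows the comparison logic with invented likelihood values (not the paper's actual fits):

```python
import numpy as np

def bic(log_likelihood, k, n):
    """Bayesian information criterion: BIC = -2 ln L_max + k ln n."""
    return -2.0 * log_likelihood + k * np.log(n)

# Hypothetical fits of two cosmological models to n = 157 SNIa (numbers invented):
n = 157
bic_lcdm   = bic(log_likelihood=-89.0, k=2, n=n)  # e.g. (Omega_m, H0)
bic_exotic = bic(log_likelihood=-88.2, k=4, n=n)  # slightly better fit, two extra parameters

# The model with the smaller BIC is preferred; on the usual scale, a
# difference greater than about 2 already counts as positive evidence.
print(bic_lcdm, bic_exotic, bic_exotic - bic_lcdm)
```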

  1. TRSkit: A Simple Digital Library Toolkit

    Science.gov (United States)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from hundreds to hundreds of thousands of information units and does not have extensive resources to devote to the problem.

  2. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit

  3. Communities and Spontaneous Urban Planning: A Toolkit for Urban ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    State-led urban planning is often absent, which creates unsustainable environments and hinders the integration of migrants. Communities' prospects of ... This toolkit is expected to be a viable alternative for planning urban expansion wherever it cannot be carried out through traditional means. The toolkit will be tested in ...

  4. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.

    Science.gov (United States)

    Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen

    2010-12-21

    There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases

  5. Design-based learning in classrooms using playful digital toolkits

    NARCIS (Netherlands)

    Scheltenaar, K.J.; van der Poel, J.E.C.; Bekker, Tilde

    2015-01-01

    The goal of this paper is to explore how to implement Design Based Learning (DBL) with digital toolkits to teach 21st century skills in (Dutch) schools. It describes the outcomes of a literature study and two design case studies in which such a DBL approach with digital toolkits was iteratively

  6. The Lean and Environment Toolkit

    Science.gov (United States)

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  7. Innovations in oral health: A toolkit for interprofessional education.

    Science.gov (United States)

    Dolce, Maria C; Parker, Jessica L; Werrlein, Debra T

    2017-05-01

    The integration of oral health competencies into non-dental health professions curricula can serve as an effective driver for interprofessional education (IPE). The purpose of this report is to describe a replicable oral-health-driven IPE model and corresponding online toolkit, both of which were developed as part of the Innovations in Oral Health (IOH): Technology, Instruction, Practice, and Service programme at Bouvé College of Health Sciences, Northeastern University, USA. Tooth decay is a largely preventable disease that is connected to overall health and wellness, and it affects the majority of adults and a fifth of children in the United States. To prepare all health professionals to address this problem, the IOH model couples programming from the online resource Smiles for Life: A National Oral Health Curriculum with experiential learning opportunities designed for undergraduate and graduate students that include simulation-learning (technology), hands-on workshops and didactic sessions (instruction), and opportunities for both cooperative education (practice) and community-based learning (service). The IOH Toolkit provides the means for others to replicate portions of the IOH model or to establish a large-scale IPE initiative that will support the creation of an interprofessional workforce-one equipped with oral health competencies and ready for collaborative practice.

  8. Acceleration transforms and statistical kinetic models

    International Nuclear Information System (INIS)

    LuValle, M.J.; Welsher, T.L.; Svoboda, K.

    1988-01-01

    For a restricted class of problems a mathematical model of microscopic degradation processes, statistical kinetics, is developed and linked through acceleration transforms to the information which can be obtained from a system in which the only observable sign of degradation is sudden and catastrophic failure. The acceleration transforms were developed in accelerated life testing applications as a tool for extrapolating from the observable results of an accelerated life test to the dynamics of the underlying degradation processes. A particular concern of a physicist attempting to interpret the results of an analysis based on acceleration transforms is determining the physical species involved in the degradation process. These species may be (a) relatively abundant or (b) relatively rare. The main results of this paper are a theorem showing that for an important subclass of statistical kinetic models, acceleration transforms cannot be used to distinguish between cases (a) and (b), and an example showing that in some cases falling outside the restrictions of the theorem, cases (a) and (b) can be distinguished by their acceleration transforms.
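
    As a concrete, standard instance of such a transform (the paper's class of transforms is more general), accelerated life testing commonly assumes a linear time transformation with an Arrhenius acceleration factor:

```latex
% Linear acceleration transform: time under stress maps to use-condition time.
\[
  t_{\mathrm{use}} = A(T)\, t_{\mathrm{stress}}, \qquad
  A(T) = \exp\!\left[ \frac{E_a}{k_B}
         \left( \frac{1}{T_{\mathrm{use}}} - \frac{1}{T_{\mathrm{stress}}} \right) \right],
\]
% where E_a is the activation energy of the degradation process and k_B is
% Boltzmann's constant; identifying the species behind E_a is exactly the
% concern raised in the abstract.
```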

  9. Lean and Information Technology Toolkit

    Science.gov (United States)

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  10. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
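
    A hypothetical sketch (in Python, not Trick's actual C/C++ API; all names are invented) of what automatic job scheduling with a fixed time step amounts to:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Job:
    rate: float                     # seconds between calls
    func: Callable[[float], None]   # user-supplied model job
    next_call: float = 0.0

def run(jobs: List[Job], dt: float, t_end: float) -> None:
    """Fixed-step scheduler: invoke each job whenever its period has elapsed."""
    t = 0.0
    while t < t_end:
        for job in jobs:
            if t >= job.next_call:
                job.func(t)
                job.next_call += job.rate
        t += dt  # a real framework also integrates state equations, checkpoints, etc.

state = {"x": 0.0, "v": 1.0}

def euler_step(t: float) -> None:
    state["x"] += state["v"] * 0.1   # toy Euler-integration "job"

run([Job(rate=0.1, func=euler_step)], dt=0.1, t_end=1.0)
```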

  11. Accelerator Modeling with MATLAB Accelerator Toolbox

    International Nuclear Information System (INIS)

    2002-01-01

    This paper introduces Accelerator Toolbox (AT), a collection of tools to model storage rings and beam transport lines in the MATLAB environment. The objective is to illustrate the flexibility and efficiency of the AT-MATLAB framework. The paper discusses three examples of problems that are analyzed frequently in connection with ring-based synchrotron light sources.
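
    The element-by-element style of such toolboxes can be illustrated with standard linear transfer matrices (plain NumPy, not the AT MATLAB API; focal lengths and drift lengths are arbitrary):

```python
import numpy as np

def drift(L):
    """Transfer matrix of a field-free drift of length L (meters)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Transfer matrix of a thin quadrupole with focal length f (meters)."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# One FODO-like segment; the rightmost matrix acts first on the particle:
# focusing quad, drift, defocusing quad, drift.
M = drift(1.0) @ thin_quad(-2.0) @ drift(1.0) @ thin_quad(2.0)

x0 = np.array([1e-3, 0.0])   # initial phase-space vector (x [m], x' [rad])
x1 = M @ x0                  # coordinates after tracking through the segment
print(M, x1)
```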

  12. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
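
    A hypothetical sketch of the integration the toolkit targets, combining probabilistic event frequencies with a deterministic consequence model (all numbers, scalings, and function names are invented for illustration):

```python
def jet_flame_length(mass_flow_kg_s: float) -> float:
    """Toy deterministic model: flame length scaling with release rate (invented)."""
    return 10.0 * mass_flow_kg_s ** 0.4

scenarios = [
    # (annual leak frequency, release rate kg/s, conditional ignition probability)
    (1e-3, 0.01, 0.05),
    (1e-4, 0.10, 0.10),
    (1e-5, 1.00, 0.30),
]

risk = 0.0
for freq, rate, p_ign in scenarios:
    consequence = jet_flame_length(rate)   # deterministic physics component
    risk += freq * p_ign * consequence     # probabilistic weighting component
print(f"aggregate hazard index: {risk:.3e}")
```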

  13. Accelerator physics and modeling: Proceedings

    International Nuclear Information System (INIS)

    Parsa, Z.

    1991-01-01

    This report contains papers on the following topics: Physics of high brightness beams; radio frequency beam conditioner for fast-wave free-electron generators of coherent radiation; wake-field and space-charge effects on high brightness beams. Calculations and measured results for BNL-ATF; non-linear orbit theory and accelerator design; general problems of modeling for accelerators; development and application of dispersive soft ferrite models for time-domain simulation; and bunch lengthening in the SLC damping rings

  14. chemf: A purely functional chemistry toolkit.

    Science.gov (United States)

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effects free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare it to existing toolkits both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with a strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type-safety achieved by Scala highly increased the reliability of our code as well as the productivity of
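
    chemf is written in Scala; the immutable-data-structure idea it is built on can be sketched in Python with frozen dataclasses (all type and field names are invented):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Atom:
    symbol: str

@dataclass(frozen=True)
class Bond:
    a: int        # index of first atom
    b: int        # index of second atom
    order: int    # bond order

@dataclass(frozen=True)
class Molecule:
    atoms: tuple
    bonds: tuple

    def add_atom(self, atom: Atom) -> "Molecule":
        # Returns a new molecule; the original is never mutated, so the
        # structure can be shared freely across threads.
        return Molecule(self.atoms + (atom,), self.bonds)

water = Molecule((Atom("O"), Atom("H"), Atom("H")),
                 (Bond(0, 1, 1), Bond(0, 2, 1)))
```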

  15. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  16. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  17. Validation of Power Output for the WIND Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  18. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or FTP interfaces. Empirical models are mostly Fortran-based and lack interfaces with more convenient scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
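
    The availability-based fallback described above might look like the following sketch (paths and function names are invented, not DaViTpy's actual API):

```python
from pathlib import Path

def query_nosql_db(radar: str, date: str) -> bytes:
    raise IOError("remote database unreachable in this sketch")

def fetch_via_ftp(radar: str, date: str) -> bytes:
    raise IOError("FTP mirror unreachable in this sketch")

def fetch(radar: str, date: str) -> bytes:
    """Try local files first, then the remote database, then the FTP mirror."""
    local = Path(f"/data/{radar}/{date}.dat")
    if local.exists():                               # 1. probe local directories
        return local.read_bytes()
    for source in (query_nosql_db, fetch_via_ftp):   # 2./3. remote fallbacks
        try:
            return source(radar, date)
        except IOError:
            continue
    raise FileNotFoundError(f"{radar} {date} unavailable from any source")
```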

  19. Tips from the toolkit: 1 - know yourself.

    Science.gov (United States)

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  20. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
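
    The traditional accuracy-based measures mentioned here include metrics such as RMSE, bias, and anomaly correlation; an illustrative NumPy re-implementation (not LVT code, and with invented sample values) is:

```python
import numpy as np

def rmse(model, obs):
    return float(np.sqrt(np.mean((model - obs) ** 2)))

def bias(model, obs):
    return float(np.mean(model - obs))

def anomaly_correlation(model, obs):
    m, o = model - model.mean(), obs - obs.mean()
    return float(np.sum(m * o) / np.sqrt(np.sum(m ** 2) * np.sum(o ** 2)))

obs   = np.array([0.21, 0.23, 0.28, 0.25])   # e.g. observed soil moisture
model = np.array([0.19, 0.24, 0.30, 0.27])   # simulated values
print(rmse(model, obs), bias(model, obs), anomaly_correlation(model, obs))
```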

  1. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  2. ARC Code TI: Crisis Mapping Toolkit

    Data.gov (United States)

    National Aeronautics and Space Administration — The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve...

  3. Sealed radioactive sources toolkit

    International Nuclear Information System (INIS)

    Mac Kenzie, C.

    2005-09-01

    The IAEA has developed a Sealed Radioactive Sources Toolkit to provide information to key groups about the safety and security of sealed radioactive sources. The key groups addressed are officials in government agencies, medical users, industrial users and the scrap metal industry. The general public may also benefit from an understanding of the fundamentals of radiation safety

  4. The ECVET toolkit customization for the nuclear energy sector

    Energy Technology Data Exchange (ETDEWEB)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von [European Commission, Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport

    2015-04-15

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed over the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. Following the road map for ECVET implementation requires customizing the toolkit for the nuclear energy sector. This article describes the outcomes of the toolkit customization, based on the ECVET approach, for the design of nuclear qualifications. The process of toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  5. The ECVET toolkit customization for the nuclear energy sector

    International Nuclear Information System (INIS)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von

    2015-01-01

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed over the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. Following the road map for ECVET implementation requires customizing the toolkit for the nuclear energy sector. This article describes the outcomes of the toolkit customization, based on the ECVET approach, for the design of nuclear qualifications. The process of toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  6. A New GPU-Enabled MODTRAN Thermal Model for the PLUME TRACKER Volcanic Emission Analysis Toolkit

    Science.gov (United States)

    Acharya, P. K.; Berk, A.; Guiang, C.; Kennett, R.; Perkins, T.; Realmuto, V. J.

    2013-12-01

    Real-time quantification of volcanic gaseous and particulate releases is important for (1) recognizing rapid increases in SO2 gaseous emissions which may signal an impending eruption; (2) characterizing ash clouds to enable safe and efficient commercial aviation; and (3) quantifying the impact of volcanic aerosols on climate forcing. The Jet Propulsion Laboratory (JPL) has developed state-of-the-art algorithms, embedded in their analyst-driven Plume Tracker toolkit, for performing SO2, NH3, and CH4 retrievals from remotely sensed multi-spectral Thermal InfraRed spectral imagery. While Plume Tracker provides accurate results, it typically requires extensive analyst time. A major bottleneck in this processing is the relatively slow but accurate FORTRAN-based MODTRAN atmospheric and plume radiance model, developed by Spectral Sciences, Inc. (SSI). To overcome this bottleneck, SSI in collaboration with JPL, is porting these slow thermal radiance algorithms onto massively parallel, relatively inexpensive and commercially-available GPUs. This paper discusses SSI's efforts to accelerate the MODTRAN thermal emission algorithms used by Plume Tracker. Specifically, we are developing a GPU implementation of the Curtis-Godson averaging and the Voigt in-band transmittances from near line center molecular absorption, which comprise the major computational bottleneck. The transmittance calculations were decomposed into separate functions, individually implemented as GPU kernels, and tested for accuracy and performance relative to the original CPU code. Speedup factors of 14 to 30× were realized for individual processing components on an NVIDIA GeForce GTX 295 graphics card with no loss of accuracy. Due to the separate host (CPU) and device (GPU) memory spaces, a redesign of the MODTRAN architecture was required to ensure efficient data transfer between host and device, and to facilitate high parallel throughput. Currently, we are incorporating the separate GPU kernels into a
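
    As an illustration of the host/device split discussed above, here is a minimal GPU port of an embarrassingly parallel per-spectral-bin loop using Numba CUDA (toy Beer-Lambert transmittance, not MODTRAN's Curtis-Godson or Voigt algorithms):

```python
import math
import numpy as np
from numba import cuda

@cuda.jit
def transmittance(optical_depth, out):
    # Each thread evaluates one spectral bin independently.
    i = cuda.grid(1)
    if i < optical_depth.size:
        out[i] = math.exp(-optical_depth[i])   # Beer-Lambert per bin

tau = np.random.rand(1_000_000).astype(np.float32)
d_tau = cuda.to_device(tau)                    # explicit host -> device transfer
d_out = cuda.device_array_like(d_tau)
threads = 256
blocks = (tau.size + threads - 1) // threads
transmittance[blocks, threads](d_tau, d_out)
result = d_out.copy_to_host()                  # explicit device -> host transfer
```

    Keeping data resident on the device between kernel calls, rather than copying per call, is the kind of architectural change the abstract's MODTRAN redesign addresses.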

  7. Antenna toolkit

    CERN Document Server

    Carr, Joseph

    2006-01-01

    Joe Carr has provided radio amateurs and short-wave listeners with the definitive design guide for sending and receiving radio signals with Antenna Toolkit 2nd edition.Together with the powerful suite of CD software, the reader will have a complete solution for constructing or using an antenna - bar the actual hardware! The software provides a simple Windows-based aid to carrying out the design calculations at the heart of successful antenna design. All the user needs to do is select the antenna type and set the frequency - a much more fun and less error prone method than using a con

  8. Terrain-Toolkit

    DEFF Research Database (Denmark)

    Wang, Qi; Kaul, Manohar; Long, Cheng

    2014-01-01

    Terrain data is becoming increasingly popular both in industry and in academia. Many tools have been developed for visualizing terrain data. However, we find that (1) they usually accept very few data formats of terrain data only; (2) they do not support terrain simplification well, which, as will be shown, is used heavily for query processing in spatial databases; and (3) they do not provide the surface distance operator, which is fundamental for many applications based on terrain data. Motivated by this, we developed a tool called Terrain-Toolkit for terrain data which accepts a comprehensive set...

  9. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  10. A new approach to modeling linear accelerator systems

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system-level designs of radiofrequency ion accelerators, with specific applications to machines of interest to Accelerator Driven Transmutation Technologies (ADTT). The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are to be modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Code (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version of ASM is described and examples of the modeling and analysis capabilities are illustrated. The results of an example study, for an accelerator concept typical of ADTT applications, are presented and sample displays from the computer interface are shown.

  11. Vibration acceleration promotes bone formation in rodent models.

    Directory of Open Access Journals (Sweden)

    Ryohei Uchida

    Full Text Available All living tissues and cells on Earth are subject to gravitational acceleration, but no reports have verified whether acceleration mode influences bone formation and healing. Therefore, this study compared the effects of two acceleration modes, vibration and constant (centrifugal) acceleration, on bone formation and healing in the trunk using a BMP-2-induced ectopic bone formation (EBF) mouse model and a rib fracture healing (RFH) rat model. Additionally, we tried to verify whether the mechanism by which acceleration affects bone formation differs between these two models. Three groups (low- and high-magnitude vibration and control-VA) were evaluated in the vibration acceleration study, and two groups (centrifuge acceleration and control-CA) were used in the constant acceleration study. In each model, the intervention was applied for ten minutes per day from three days after surgery for eleven days (EBF model) or nine days (RFH model). All animals were sacrificed the day after the intervention ended. In the EBF model, ectopic bone was evaluated by macroscopic and histological observations, wet weight, radiography and microfocus computed tomography (micro-CT). In the RFH model, whole fracture-repaired ribs were excised with removal of soft tissue, and evaluated radiologically and histologically. Ectopic bones in the low-magnitude group (EBF model) had significantly greater wet weight and were significantly larger (macroscopically and radiographically) than those in the other two groups, whereas the size and wet weight of ectopic bones in the centrifuge acceleration group showed no significant difference compared with those in the control-CA group. All ectopic bones showed calcified trabeculae and maturated bone marrow. Micro-CT showed that bone volume (BV) in the low-magnitude group of the EBF model was significantly higher than those in the other two groups (3.1±1.2 mm³ vs. 1.8±1.2 mm³ in the high-magnitude group and 1.3±0.9 mm³ in the control-VA group), but

  12. NOAA Weather and Climate Toolkit (WCT)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Weather and Climate Toolkit is an application that provides simple visualization and data export of weather and climatological data archived at NCDC. The...

  13. A User Interface Toolkit for a Small Screen Device.

    OpenAIRE

    UOTILA, ALEKSI

    2000-01-01

    The appearance of different kinds of networked mobile devices and network appliances creates special requirements for user interfaces that are not met by existing widget based user interface creation toolkits. This thesis studies the problem domain of user interface creation toolkits for portable network connected devices. The portable nature of these devices places great restrictions on the user interface capabilities. One main characteristic of the devices is that they have small screens co...

  14. Implementation of the Good School Toolkit in Uganda: a quantitative process evaluation of a successful violence prevention program.

    Science.gov (United States)

    Knight, Louise; Allen, Elizabeth; Mirembe, Angel; Nakuti, Janet; Namy, Sophie; Child, Jennifer C; Sturgess, Joanna; Kyegombe, Nambusi; Walakira, Eddy J; Elbourne, Diana; Naker, Dipak; Devries, Karen M

    2018-05-09

    The Good School Toolkit, a complex behavioural intervention designed by Raising Voices, a Ugandan NGO, reduced past-week physical violence from school staff to primary students by an average of 42% in a recent randomised controlled trial. This process evaluation quantitatively examines what was implemented across the twenty-one intervention schools, variations in school prevalence of violence after the intervention, factors that influence exposure to the intervention, and factors associated with students' experience of physical violence from staff at study endline. Implementation measures were captured prospectively in the twenty-one intervention schools over four school terms from 2012 to 2014, and Toolkit exposure was captured in the student (n = 1921) and staff (n = 286) endline cross-sectional surveys in 2014. Implementation measures and the prevalence of violence are summarised across schools and are assessed for correlation using Spearman's rank correlation coefficient. Regression models are used to explore individual factors associated with Toolkit exposure and with physical violence at endline. School prevalence of past-week physical violence from staff against students ranged from 7% to 65% across schools at endline. Schools with higher mean levels of teacher Toolkit exposure had larger decreases in violence during the study. Students in schools categorised as implementing a 'low' number of program school-led activities reported less exposure to the Toolkit. Higher student Toolkit exposure was associated with decreased odds of experiencing physical violence from staff (OR: 0.76, 95% CI: 0.67-0.86). Effectiveness of the Toolkit may be increased by further targeting and supporting teachers' engagement with girls and students with mental health difficulties. The trial is registered at clinicaltrials.gov, NCT01678846, August 24th 2012.

  15. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    Science.gov (United States)

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two-committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants.

  16. Field tests of a participatory ergonomics toolkit for Total Worker Health

    Science.gov (United States)

    Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2018-01-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two-committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. PMID:28166897

  17. Newnes electronics toolkit

    CERN Document Server

    Phillips, Geoff

    2013-01-01

    Newnes Electronics Toolkit brings together fundamental facts, concepts, and applications of electronic components and circuits, and presents them in a clear, concise, and unambiguous format, to provide a reference book for engineers. The book contains 10 chapters that discuss the following concepts: resistors, capacitors, inductors, semiconductors, circuit concepts, electromagnetic compatibility, sound, light, heat, and connections. The engineer's job does not end when the circuit diagram is completed; the design for the manufacturing process is just as important if volume production is to be

  18. TIME-DEPENDENT STOCHASTIC ACCELERATION MODEL FOR FERMI BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Kento; Asano, Katsuaki; Terasawa, Toshio, E-mail: kentos@icrr.u-tokyo.ac.jp, E-mail: asanok@icrr.u-tokyo.ac.jp, E-mail: terasawa@icrr.u-tokyo.ac.jp [Institute for Cosmic Ray Research, The University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba 277-8582 (Japan)

    2015-12-01

    We study stochastic acceleration models for the Fermi bubbles. Turbulence is excited just behind the shock front via Kelvin–Helmholtz, Rayleigh–Taylor, or Richtmyer–Meshkov instabilities, and plasma particles are continuously accelerated by the interaction with the turbulence. The turbulence gradually decays as it goes away from the shock fronts. Adopting a phenomenological model for the stochastic acceleration, we explicitly solve the temporal evolution of the particle energy distribution in the turbulence. Our results show that the spatial distribution of high-energy particles is different from those for a steady solution. We also show that the contribution of electrons that escaped from the acceleration regions significantly softens the photon spectrum. The photon spectrum and surface brightness profile are reproduced by our models. If the escape efficiency is very high, the radio flux from the escaped low-energy electrons can be comparable to that of the WMAP haze. We also demonstrate hadronic models with the stochastic acceleration, but they are unlikely in the viewpoint of the energy budget.
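
    Phenomenological stochastic-acceleration models of this kind typically solve a momentum-diffusion (Fokker-Planck) equation; a generic form with an escape term, schematic rather than the authors' exact coefficients, is:

```latex
\[
  \frac{\partial f}{\partial t}
    = \frac{1}{p^{2}} \frac{\partial}{\partial p}
      \left[ p^{2} D_{pp}(p,t)\, \frac{\partial f}{\partial p} \right]
      - \frac{f}{t_{\mathrm{esc}}},
  \qquad
  D_{pp} \sim \frac{p^{2}}{t_{\mathrm{acc}}(t)},
\]
% f(p,t): particle distribution; D_pp: momentum-diffusion coefficient set by
% the decaying turbulence; t_esc: escape time; t_acc: acceleration timescale.
```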

  19. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    Science.gov (United States)

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. To describe characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use and assess whether such resources were beneficial for users. Deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  20. Application of the SHARP Toolkit to Sodium-Cooled Fast Reactor Challenge Problems

    Energy Technology Data Exchange (ETDEWEB)

    Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Yu, Y. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Kim, T. K. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division

    2017-09-30

    The Simulation-based High-efficiency Advanced Reactor Prototyping (SHARP) toolkit is under development by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign of the U.S. Department of Energy, Office of Nuclear Energy. To better understand and exploit the benefits of advanced modeling and simulation, the NEAMS Campaign initiated the “Sodium-Cooled Fast Reactor (SFR) Challenge Problems” task, which includes the assessment of hot channel factors (HCFs) and the demonstration of zooming capability using the SHARP toolkit. If both challenge problems are resolved through advanced modeling and simulation using the SHARP toolkit, the economic competitiveness of an SFR can be significantly improved. The efforts in the first year of this project focused on the development of computational models, meshes, and coupling procedures for multi-physics calculations using the neutronics (PROTEUS) and thermal-hydraulic (Nek5000) components of the SHARP toolkit, as well as demonstration of the HCF calculation capability for the 100 MWe Advanced Fast Reactor (AFR-100) design. Testing the feasibility of the SHARP zooming capability is planned in FY 2018. The HCFs developed for the earlier SFRs (FFTF, CRBR, and EBR-II) were reviewed, and a subset of these were identified as potential candidates for reduction or elimination through high-fidelity simulations. A one-way offline coupling method was used to evaluate the HCFs, in which the neutronics solver PROTEUS computes the power profile based on an assumed temperature, and the computational fluid dynamics solver Nek5000 evaluates the peak temperatures using the neutronics power profile. If the initial temperature profile used in the neutronics calculation is reasonably accurate, the one-way offline method is valid because the neutronics power profile has a weak dependence on small temperature variations. In order to get more precise results, the proper temperature profile for initial neutronics calculations was obtained from the

  1. Integrated on-line accelerator modeling at CEBAF

    International Nuclear Information System (INIS)

    Bowling, B.A.; Shoaee, H.; Van Zeijts, J.; Witherspoon, S.; Watson, W.

    1995-01-01

    An on-line accelerator modeling facility is currently under development at CEBAF. The model server, which is integrated with the EPICS control system, provides coupled and 2nd-order matrices for the entire accelerator, and forms the foundation for automated model-based control and diagnostic applications. Four types of machine model are provided: design, golden (certified), live, and scratch (simulated). Provisions are also made for the use of multiple lattice modeling programs such as DIMAD, PARMELA, and TLIE. Design and implementation details are discussed. 2 refs., 4 figs
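
    To illustrate the kind of quantity a model server returns, the sketch below composes first-order transfer matrices for a short drift–quadrupole section; the element parameters are hypothetical and only the horizontal 2×2 optics is shown (a real server would return coupled and second-order maps for the full machine).

```python
import numpy as np

# Thin-lens horizontal transfer matrices; values are illustrative, not CEBAF lattice data.
def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A model server typically returns the matrix between two named elements by
# multiplying element matrices in beamline order (rightmost acts first).
elements = [drift(1.5), thin_quad(2.0), drift(1.5), thin_quad(-2.0)]
M = np.eye(2)
for m in elements:
    M = m @ M

x0 = np.array([1e-3, 0.0])   # 1 mm offset, zero slope at the entrance
print(M @ x0)                # predicted transverse coordinates downstream
```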

  2. The Populist Toolkit

    OpenAIRE

    Ylä-Anttila, Tuukka Salu Santeri

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  3. An interactive toolkit to extract phenological time series data from digital repeat photography

    Science.gov (United States)

    Seyednasrollah, B.; Milliman, T. E.; Hufkens, K.; Kosmala, M.; Richardson, A. D.

    2017-12-01

    Near-surface remote sensing and in situ photography are powerful tools to study how climate change and climate variability influence vegetation phenology and the associated seasonal rhythms of green-up and senescence. The rapidly growing PhenoCam network has been using in situ digital repeat photography to study phenology in almost 500 locations around the world, with an emphasis on North America. However, extracting time series data from multiple years of half-hourly imagery - while each set of images may contain several regions of interest (ROIs), corresponding to different species or vegetation types - is not always straightforward. Large volumes of data require substantial processing time, and changes (either intentional or accidental) in camera field of view require adjustment of ROI masks. Here, we introduce and present "DrawROI" as an interactive web-based application for imagery from PhenoCam. DrawROI can also be used offline, as a fully independent toolkit that significantly facilitates extraction of phenological data from any stack of digital repeat photography images. DrawROI provides a responsive environment for phenological scientists to interactively (a) delineate ROIs, (b) handle field of view (FOV) shifts, and (c) extract and export time series data characterizing image color (i.e., red, green, and blue channel digital numbers for the defined ROI). The application utilizes artificial intelligence and advanced machine learning techniques and gives the user the opportunity to redraw new ROIs every time an FOV shift occurs. DrawROI also offers a quality control flag to indicate noisy data and images of low quality due to fog or snow. The web-based application significantly accelerates the process of creating new ROIs and modifying pre-existing ROIs in the PhenoCam database. The offline toolkit is presented as an open-source R package that can be used with similar time-lapse photography datasets to obtain more data for
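
    The exported quantities are per-channel means over an ROI, from which indices such as the green chromatic coordinate (GCC) are commonly formed in PhenoCam work. A minimal sketch of that extraction step follows (in Python rather than the package's R implementation; the file name, mask, and Pillow dependency are assumptions for illustration):

```python
import numpy as np
from PIL import Image   # Pillow, assumed available

def roi_color_stats(image_path, mask):
    """Mean R, G, B digital numbers inside a boolean ROI mask, plus GCC."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    r, g, b = (img[..., k][mask].mean() for k in range(3))
    gcc = g / (r + g + b)   # green chromatic coordinate
    return r, g, b, gcc

# Hypothetical usage: one midday image and a hand-drawn ROI mask.
# mask = np.load("deciduous_roi.npy")   # boolean array, same HxW as the images
# print(roi_color_stats("site_2017_06_21_1200.jpg", mask))
```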

  4. Accelerator Operators and Software Development

    International Nuclear Information System (INIS)

    April Miller; Michele Joyce

    2001-01-01

    At Thomas Jefferson National Accelerator Facility, accelerator operators perform tasks in their areas of specialization in addition to their machine operations duties. One crucial area in which operators contribute is software development. Operators with programming skills are uniquely qualified to develop certain controls applications because of their expertise in the day-to-day operation of the accelerator. Jefferson Lab is one of the few laboratories that utilizes the skills and knowledge of operators to create software that enhances machine operations. Through the programs written by operators, Jefferson Lab has improved machine efficiency and beam availability. Because many of these applications involve automation of procedures and need graphical user interfaces, the scripting language Tcl and the Tk toolkit have been adopted. In addition to automation, some operator-developed applications are used for information distribution. For this purpose, several standard web development tools such as perl, VBScript, and ASP are used. Examples of applications written by operators include injector steering, spin angle changes, system status reports, magnet cycling routines, and quantum efficiency measurements. This paper summarizes how the unique knowledge of accelerator operators has contributed to the success of the Jefferson Lab control system. *This work was supported by the U.S. DOE contract No. DE-AC05-84-ER40150

  5. Advancements in Wind Integration Study Data Modeling: The Wind Integration National Dataset (WIND) Toolkit; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C.; Hodge, B. M.; Orwig, K.; Jones, W.; Searight, K.; Getman, D.; Harrold, S.; McCaa, J.; Cline, J.; Clark, C.

    2013-10-01

    Regional wind integration studies in the United States require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind data sets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as be time synchronized with available load profiles. The Wind Integration National Dataset (WIND) Toolkit described in this paper fulfills these requirements. A wind resource dataset, wind power production time series, and simulated forecasts from a numerical weather prediction model run on a nationwide 2-km grid at 5-min resolution will be made publicly available for more than 110,000 onshore and offshore wind power production sites.

  6. A toolkit for promoting healthy ageing

    NARCIS (Netherlands)

    Jeroen Knevel; Aly Gruppen

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience,

  7. Fragment Impact Toolkit (FIT)

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel Wolf [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Key, Brian P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garcia, Daniel B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.
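
    As a hedged illustration of the workflow (not FIT's actual modules or API), the sketch below Monte Carlo samples fragment launch conditions and estimates the probability that a fragment lands in a given exposed area; all distributions and geometry are made-up stand-ins.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # number of sampled fragments

# Illustrative launch-condition distributions (not FIT's models).
theta = rng.uniform(0.0, np.pi / 2, n)            # launch elevation angle
phi = rng.uniform(0.0, 2 * np.pi, n)              # azimuth
v = rng.lognormal(mean=4.0, sigma=0.6, size=n)    # launch speed, m/s (median ~55)

g = 9.81
r = (v**2 / g) * np.sin(2 * theta)                # drag-free ballistic range
x, y = r * np.cos(phi), r * np.sin(phi)           # landing point on flat ground

# Consequence question: probability that a fragment lands within a 5 m-radius
# exposed area centred 100 m east of the source.
hit = (x - 100.0) ** 2 + y**2 < 5.0**2
print(f"hit probability ~ {hit.mean():.2e}")
```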

  8. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A

    2011-07-01

    In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where the sequencing/timing of events is changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how the power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed particle hydrodynamics code: NEUTRINO.
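
    The outer probabilistic loop can be pictured as follows: recovery times are sampled from a distribution, and a mechanistic model maps each sample to a peak temperature that is compared against a damage limit. In this sketch the heat-up law, threshold, and distribution are toy stand-ins for RELAP-7/RAVEN, chosen only to show how a power uprate shifts the failure fraction.

```python
import numpy as np

rng = np.random.default_rng(7)
n_runs = 10_000

def peak_clad_temp(t_recovery_h, uprate):
    # Hypothetical heat-up law standing in for a full RELAP-7 run (kelvin).
    return 600.0 + uprate * 250.0 * np.sqrt(t_recovery_h)

T_DAMAGE = 1478.0                                        # illustrative limit, K
t_rec = rng.lognormal(mean=1.5, sigma=0.8, size=n_runs)  # hours to restore AC

for uprate in (1.0, 1.2):
    p_cd = np.mean(peak_clad_temp(t_rec, uprate) > T_DAMAGE)
    print(f"uprate x{uprate:.1f}: P(core damage) ~ {p_cd:.3f}")
```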

  9. Web-based Toolkit for Dynamic Generation of Data Processors

    Science.gov (United States)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, can select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data
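
    A minimal sketch of the mapping-driven generation idea follows; the field names, conversions, and mapping schema are hypothetical illustrations, not the proposed toolkit's actual interface.

```python
import csv

# A user-defined mapping: output field -> function of the input record.
mapping = {
    "site_id":  lambda rec: rec["station"].strip().upper(),
    "date":     lambda rec: rec["obs_date"],                    # pass-through
    "flow_cms": lambda rec: float(rec["flow_cfs"]) * 0.0283168, # unit conversion
}

def generate_processor(mapping):
    """Return a function that applies the mapping to every row of a CSV file."""
    def process(in_path, out_path):
        with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
            reader = csv.DictReader(fin)
            writer = csv.DictWriter(fout, fieldnames=list(mapping))
            writer.writeheader()
            for rec in reader:
                writer.writerow({k: f(rec) for k, f in mapping.items()})
    return process

# process = generate_processor(mapping)
# process("raw_gauge_data.csv", "converted.csv")
```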

  10. User's manual for the two-dimensional transputer graphics toolkit

    Science.gov (United States)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer-based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for simulation visualizations. It supports multiple windows, double-buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  11. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  12. Can the Stephani model be an alternative to FRW accelerating models?

    International Nuclear Information System (INIS)

    Godlowski, Wlodzimierz; Stelmach, Jerzy; Szydlowski, Marek

    2004-01-01

    A class of Stephani cosmological models as a prototype of a non-homogeneous universe is considered. The non-homogeneity can lead to accelerated evolution, which is now observed from the SNe Ia data. Three samples of type Ia supernovae obtained by Perlmutter et al, Tonry et al and Knop et al are taken into account. Different statistical methods (best fits as well as the maximum likelihood method) are used to estimate the model parameters. The Stephani model is considered as an alternative to the ΛCDM model in the explanation of the present acceleration of the universe. The model explains the acceleration of the universe at the same level of accuracy as the ΛCDM model (the χ² statistics are comparable). From the best-fit analysis it follows that the Stephani model is characterized by a higher value of the density parameter Ω_m0 than the ΛCDM model. It is also shown that the model is consistent with the location of the CMB peaks
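
    The best-fit analysis reduces to minimizing a χ² over supernova distance moduli. The sketch below does this for a flat ΛCDM luminosity distance; a Stephani-model analysis would substitute its own d_L(z). The four data points are illustrative placeholders, not the Perlmutter/Tonry/Knop samples.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299_792.458

def d_L_lcdm(z, om, h0=70.0):
    """Luminosity distance (Mpc) for flat LambdaCDM; swap in the Stephani d_L(z) here."""
    E = lambda zz: np.sqrt(om * (1 + zz) ** 3 + (1 - om))
    dc, _ = quad(lambda zz: 1.0 / E(zz), 0.0, z)
    return (1 + z) * (C_KM_S / h0) * dc

def chi2(om, z, mu_obs, sigma_mu):
    mu_model = np.array([5 * np.log10(d_L_lcdm(zi, om)) + 25 for zi in z])
    return np.sum(((mu_obs - mu_model) / sigma_mu) ** 2)

# Hypothetical mini-sample (z, mu, sigma); real analyses use the full catalogs.
z = np.array([0.1, 0.3, 0.5, 0.8])
mu_obs = np.array([38.3, 41.0, 42.3, 43.6])
sigma = np.array([0.2, 0.2, 0.25, 0.3])

best = min((chi2(om, z, mu_obs, sigma), om) for om in np.linspace(0.05, 0.6, 56))
print(f"chi2_min = {best[0]:.2f} at Omega_m0 = {best[1]:.2f}")
```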

  13. RAMI analysis and modeling for the LANSCE accelerator systems

    Energy Technology Data Exchange (ETDEWEB)

    Macek, R.J.; Wilkinson, C.A. [Los Alamos National Laboratory, NM (United States)

    1995-10-01

    Reliability, availability, maintainability, and inspectability (RAMI) have become important issues for the high-power machines being planned for applications such as accelerator transmutation of nuclear waste (ATW), accelerator production of tritium (APT) and the next generation spallation neutron source. Beam reliability and beam availability are vitally important specifications to the present users of accelerator-driven spallation neutron sources, synchrotron light sources and medical accelerators. At Los Alamos, improved beam availability is a key goal in the planned LANSCE improvement program. Clearly, the capability to adequately model and predict the reliability and availability of complex accelerator systems will be of great value in assessing and optimizing RAMI measures in accelerator design and improvement programs. To date, no major accelerator project has developed comprehensive reliability models, although the Advanced Photon Source at ANL has started work on reliability analysis for selected subsystems. In this paper the authors discuss their experience in developing RAMI analysis and modeling for the LANSCE Accelerator Systems. Progress has been made in developing suitable measures and functions to characterize user risk, in logging of needed data on failure rates and repair/down times, and in developing a first-pass RAMI model for selected subsystems. Plans have been made for a more complete RAMI model. In addition, the authors discuss their experience in the use of probabilistic risk assessment (PRA) methodology for estimation of the reliability of active, instrumentation-based, radiation safety systems at LANSCE.

  14. Modeling of Particle Acceleration at Multiple Shocks Via Diffusive Shock Acceleration: Preliminary Results

    Science.gov (United States)

    Parker, L. N.; Zank, G. P.

    2013-12-01

    Successful forecasting of energetic particle events in space weather models requires algorithms for correctly predicting the spectrum of ions accelerated from a background population of charged particles. We present preliminary results from a model that diffusively accelerates particles at multiple shocks. Our basic approach is related to box models (Protheroe and Stanev, 1998; Moraal and Axford, 1983; Ball and Kirk, 1992; Drury et al., 1999) in which a distribution of particles is diffusively accelerated inside the box while simultaneously experiencing decompression through adiabatic expansion and losses from the convection and diffusion of particles outside the box (Melrose and Pope, 1993; Zank et al., 2000). We adiabatically decompress the accelerated particle distribution between each shock by either the method explored in Melrose and Pope (1993) and Pope and Melrose (1994) or by the approach set forth in Zank et al. (2000), where we solve the transport equation by a method analogous to operator splitting. The second method incorporates the additional loss terms of convection and diffusion and allows for the use of a variable time between shocks. We use a maximum injection energy (Emax) appropriate for quasi-parallel and quasi-perpendicular shocks (Zank et al., 2000, 2006; Dosch and Shalchi, 2010) and provide a preliminary application of the diffusive acceleration of particles by multiple shocks with frequencies appropriate for solar maximum (i.e., a non-Markovian process).
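
    The box-model iteration can be sketched as an operator applied once per shock: standard diffusive shock acceleration maps the incident distribution onto a power law with index q = 3r/(r-1), and the result is adiabatically decompressed before the next shock. The grid, compression ratio, and shock count below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

p = np.logspace(0, 4, 400)     # momentum grid (arbitrary units)
lnp = np.log(p)
f = np.exp(-p)                 # quasi-thermal seed distribution

r = 4.0                        # shock compression ratio
q = 3 * r / (r - 1)            # DSA spectral index (q = 4 for r = 4)

def accelerate(f, p, q):
    # f_out(p) = q p^-q * integral_0^p f_in(p') p'^(q-1) dp'
    integrand = f * p**q                       # p'^(q-1) dp' = p'^q dlnp'
    cumulative = np.cumsum(integrand * np.gradient(lnp))
    return q * p**(-q) * cumulative

def decompress(f, p, R):
    # Adiabatic expansion shifts momenta down: p -> p * R**(-1/3),
    # so (by Liouville) f_new(p) = f_old(p * R**(1/3)).
    return np.interp(p * R**(1.0 / 3.0), p, f, right=0.0)

for _ in range(5):             # five successive shocks
    f = accelerate(f, p, q)
    f = decompress(f, p, R=4.0)

n_of_p = 4 * np.pi * p**2 * f  # resulting particle spectrum
```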

  15. Microgrid Design Toolkit (MDT) User Guide Software v1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid-connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM).

  16. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  17. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  18. Texas Team: Academic Progression and IOM Toolkit.

    Science.gov (United States)

    Reid, Helen; Tart, Kathryn; Tietze, Mari; Joseph, Nitha Mathew; Easley, Carson

    The Institute of Medicine (IOM) Future of Nursing report identified eight recommendations for nursing to improve health care for all Americans. The Texas Team for Advancing Health Through Nursing embraced the challenge of implementing the recommendations through two diverse projects. One group conducted a broad, online survey of leadership, practice, and academia, focusing on the IOM recommendations. The other focused specifically on academic progression through the use of CABNET (Consortium for Advancing Baccalaureate Nursing Education in Texas) articulation agreements. The survey revealed a lack of knowledge and understanding of the IOM recommendations, prompting development of an online IOM toolkit. The articulation agreements provide a clear pathway for students to the RN-to-BSN degree. The toolkit and articulation agreements provide rich resources for implementation of the IOM recommendations.

  19. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing software packages require specifically formatted input files and generate output files in various formats, which is inconvenient in practice. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference software packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT makes it convenient to run multiple local ancestry inference packages. In addition, we evaluated the performance of the supported software packages, mainly focusing on inference accuracy and computational resources used. We provide a toolkit to facilitate the use of local ancestry inference software, especially for users with limited bioinformatics background.

  20. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Directory of Open Access Journals (Sweden)

    Andrea Bellucci

    2017-02-01

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  1. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  2. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  3. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  4. Development of a Multimedia Toolkit for Engineering Graphics Education

    Directory of Open Access Journals (Sweden)

    Moudar Zgoul

    2009-09-01

    This paper focuses upon the development of a multimedia toolkit to support the teaching of an Engineering Graphics course. The project used different elements for the toolkit (animations, videos, and presentations), which were then integrated in a dedicated internet website. The purpose of using these elements is to assist students in building and practicing the needed engineering skills at their own pace as part of an e-Learning solution. Furthermore, this kit allows students to repeat and view the processes and techniques of graphical construction and visualization as much as needed, allowing them to follow and practice on their own.

  5. A toolkit for computerized operating procedure of complex industrial systems with IVI-COM technology

    International Nuclear Information System (INIS)

    Zhou Yangping; Dong Yujie; Huang Xiaojing; Ye Jingliang; Yoshikawa, Hidekazu

    2013-01-01

    A human interface toolkit is proposed to help the user develop computerized operating procedures for complex industrial systems such as Nuclear Power Plants (NPPs). Coupled with a friendly graphical interface, this integrated tool includes a database, a procedure editor, and a procedure executor. A three-layer hierarchy is adopted to express the complexity of an operating procedure: mission, process, and node. There are 10 kinds of nodes: entrance, exit, hint, manual input, detector, actuator, data treatment, branch, judgment, and plug-in. The computerized operating procedure senses and actuates the actual industrial system through an interface based on IVI-COM (Interchangeable Virtual Instrumentation-Component Object Model) technology. A prototype of this human interface toolkit has been applied to develop a simple computerized operating procedure for a simulated NPP. (author)

  6. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  7. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  8. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  9. How to create an interface between UrQMD and Geant4 toolkit

    CERN Document Server

    Abdel-Waged, Khaled; Uzhinskii, V.V.

    2012-01-01

    An interface between the UrQMD-1.3cr model (version 1.3 for cosmic air showers) and the Geant4 transport toolkit has been developed. Compared to the current Geant4 (hybrid) hadronic models, this provides the ability to simulate at the microscopic level hadron, nucleus, and anti-nucleus interactions with matter from 0 to 1 TeV with a single transport code. This document provides installation requirements and instructions, as well as class and member function descriptions of the software.

  10. An adaptive toolkit for image quality evaluation in system performance test of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, Guozhi; Petrov, Dimitar; Marshall, Nicholas; Bosmans, Hilde

    2017-03-01

    Digital breast tomosynthesis (DBT) is a relatively new diagnostic imaging modality for women. Currently, various models of DBT systems are available on the market and the number of installations is rapidly increasing. EUREF, the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services, has proposed a preliminary guideline protocol for the quality control of the physical and technical aspects of digital breast tomosynthesis systems, with the ultimate aim of providing limiting values that guarantee proper performance for different applications of DBT. In this work, we introduce an adaptive toolkit developed in accordance with this guideline to facilitate the process of image quality evaluation in DBT performance testing. This toolkit implements robust algorithms to quantify various technical parameters of DBT images and provides a convenient user interface in practice. Each test is built into a separate module with configurations set according to the European guideline, which can be easily adapted to different settings and extended with additional tests. This toolkit greatly improves the efficiency of image quality evaluation for DBT. It is also going to evolve with the development of protocols in quality control of DBT systems.

  11. Model-based accelerator controls: What, why and how

    International Nuclear Information System (INIS)

    Sidhu, S.S.

    1987-01-01

    Model-based control is defined as a gamut of techniques whose aim is to improve the reliability of an accelerator and enhance the capabilities of the operator, and therefore of the whole control system. The aim of model-based control is seen as gradually moving the function of model-reference from the operator to the computer. The role of the operator in accelerator control and the need for and application of model-based control are briefly summarized

  12. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    Science.gov (United States)

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  13. Methodology for the development of a taxonomy and toolkit to evaluate health-related habits and lifestyle (eVITAL

    Directory of Open Access Journals (Sweden)

    Walsh Carolyn O

    2010-03-01

    Background: Chronic diseases cause an ever-increasing percentage of morbidity and mortality, but many have modifiable risk factors. Many behaviors that predispose an individual to, or protect against, chronic disease are interrelated, and therefore are best approached using an integrated model of health and the longevity paradigm, using years lived without disability as the endpoint. Findings: This study used a 4-phase mixed qualitative design to create a taxonomy and related online toolkit for the evaluation of health-related habits. Core members of a working group conducted a literature review and created a framing document that defined relevant constructs. This document was revised, first by the working group and then by a series of multidisciplinary expert groups. The working group and expert panels also designed a systematic evaluation of health behaviors and risks, which was computerized and evaluated for feasibility. A demonstration study of the toolkit was performed in 11 healthy volunteers. Discussion: In this protocol, we used forms of the community intelligence approach, including frame analysis, feasibility, and demonstration, to develop a clinical taxonomy and an online toolkit with standardized procedures for screening and evaluation of multiple domains of health, with a focus on longevity and the goal of integrating the toolkit into routine clinical practice. Trial Registration: IMSERSO registry 200700012672

  14. BAT - The Bayesian Analysis Toolkit

    CERN Document Server

    Caldwell, Allen C; Kröninger, Kevin

    2009-01-01

    We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner. A goodness-of-fit criterion is presented which is intuitive and of great practical use.

  15. Toward a VPH/Physiome ToolKit.

    Science.gov (United States)

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative. Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive. In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow. Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  16. phylo-node: A molecular phylogenetic toolkit using Node.js.

    Science.gov (United States)

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, which was built using Node.js and provides a stable and scalable toolkit that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  17. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  18. A new tool for accelerator system modeling and analysis

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system-level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated
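
    The flavor of the scaling models such a code automates can be conveyed by a back-of-envelope rf linac power balance; every number below is an illustrative assumption, not an ASM output.

```python
# System-level linac power balance of the kind a scaling model automates.
E_gain_MeV = 800.0      # total energy gain (assumed)
I_beam_mA = 100.0       # average beam current (assumed)
grad_MV_m = 1.5         # accelerating gradient (assumed)
shunt_MOhm_m = 40.0     # effective shunt impedance per metre (assumed)
eta_rf = 0.55           # dc-to-rf conversion efficiency (assumed)

P_beam_MW = E_gain_MeV * I_beam_mA / 1000.0          # power delivered to beam
length_m = E_gain_MeV / grad_MV_m                    # active structure length
P_copper_MW = grad_MV_m**2 * length_m / shunt_MOhm_m # structure wall losses
P_ac_MW = (P_beam_MW + P_copper_MW) / eta_rf         # wall-plug rf power

print(f"length {length_m:.0f} m, rf {P_beam_MW + P_copper_MW:.1f} MW, "
      f"wall plug {P_ac_MW:.1f} MW")
```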

  19. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  20. Heart Failure: Self-care to Success: Development and evaluation of a program toolkit.

    Science.gov (United States)

    Bryant, Rebecca

    2017-08-17

    The Heart Failure: Self-care to Success toolkit was developed to assist NPs in empowering patients with heart failure (HF) to improve individual self-care behaviors. This article details the evolution of this toolkit for NPs, its effectiveness with patients with HF, and recommendations for future research and dissemination strategies.

  1. Marine Debris and Plastic Source Reduction Toolkit

    Science.gov (United States)

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  2. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  3. Innovations and Challenges of Implementing a Glucose Gel Toolkit for Neonatal Hypoglycemia.

    Science.gov (United States)

    Hammer, Denise; Pohl, Carla; Jacobs, Peggy J; Kaufman, Susan; Drury, Brenda

    2018-05-24

    Transient neonatal hypoglycemia occurs most commonly in newborns who are small for gestational age, large for gestational age, infants of diabetic mothers, and late preterm infants. An exact blood glucose value has not been determined for neonatal hypoglycemia, and it is important to note that poor neurologic outcomes can occur if hypoglycemia is left untreated. Interventions that separate mothers and newborns, as well as use of formula to treat hypoglycemia, have the potential to disrupt exclusive breastfeeding. The aim was to determine whether implementing a toolkit that supports staff in adopting the practice change for management of newborns at risk for hypoglycemia, including use of 40% glucose gel, in an obstetric unit with a level 2 nursery would decrease admissions to the Intermediate Care Nursery and increase exclusive breastfeeding. This descriptive study used a retrospective chart review for pre/post implementation of the Management of Newborns at Risk for Hypoglycemia Toolkit (Toolkit), using a convenience sample of at-risk newborns in the first 2 days of life to evaluate the proposed outcomes. Following implementation of the Toolkit, at-risk newborns had a clinically but not statistically significant 6.5% increase in exclusive breastfeeding and a clinically but not statistically significant 5% decrease in admissions to the Intermediate Care Nursery. The Toolkit was designed for ease of staff use and to improve outcomes for the at-risk newborn. Future research includes replication at other level 2 and level 1 obstetric centers and investigation into the number of 40% glucose gel doses that can safely be administered.

  4. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pshenichnov, Igor, E-mail: pshenich@fias.uni-frankfurt.d [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Institute for Nuclear Research, Russian Academy of Science, 117312 Moscow (Russian Federation); Botvina, Alexander [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Institute for Nuclear Research, Russian Academy of Science, 117312 Moscow (Russian Federation); Mishustin, Igor [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Kurchatov Institute, Russian Research Center, 123182 Moscow (Russian Federation); Greiner, Walter [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany)

    2010-03-15

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  5. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    International Nuclear Information System (INIS)

    Pshenichnov, Igor; Botvina, Alexander; Mishustin, Igor; Greiner, Walter

    2010-01-01

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  6. Mini-grid Policy Tool-kit. Policy and business frameworks for successful mini-grid roll-outs

    International Nuclear Information System (INIS)

    Franz, Michael; Hayek, Niklas; Peterschmidt, Nico; Rohrer, Michael; Kondev, Bozhil; Adib, Rana; Cader, Catherina; Carter, Andrew; George, Peter; Gichungi, Henry; Hankins, Mark; Kappiah, Mahama; Mangwengwende, Simbarashe E.

    2014-01-01

    The Mini-grid Policy Tool-kit is for policy makers to navigate the mini-grid policy design process. It contains information on mini-grid operator models, the economics of mini-grids, and necessary policy and regulation that must be considered for successful implementation. The publication specifically focuses on Africa. Progress on extending the electricity grid in many countries has remained slow because of the high costs of grid extension and limited utility/state budgets for electrification. Mini-grids provide an affordable and cost-effective option to extend needed electricity services. Putting in place the right policy for mini-grid deployment requires considerable effort but can yield significant improvement in electricity access rates, as examples from Kenya, Senegal and Tanzania illustrate. The tool-kit is available in English, French and Portuguese

  7. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including the 35-ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta- and event-data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding; track finding and fitting; and electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  8. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    Science.gov (United States)

    Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon

    2017-02-01

    A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
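
    ASPASIA itself is a Java toolkit; the snippet below is only a minimal Python illustration of the core operation it automates, rewriting SBML parameter values to encode an intervention, using the python-libsbml bindings. The file names and the parameter ID are hypothetical.

        # Minimal sketch of an SBML parameter intervention in the spirit of ASPASIA.
        # Requires python-libsbml; "model.xml" and the ID "k_RORgt" are made up.
        import libsbml

        doc = libsbml.readSBML("model.xml")
        if doc.getNumErrors() > 0:
            raise RuntimeError("SBML file failed to parse")
        model = doc.getModel()

        # Perturb one rate constant around its baseline, mimicking a local
        # sensitivity sweep applied after the model has reached steady state.
        baseline = model.getParameter("k_RORgt").getValue()
        for i, scale in enumerate([0.5, 1.0, 1.5]):
            model.getParameter("k_RORgt").setValue(baseline * scale)
            libsbml.writeSBMLToFile(doc, "model_perturbed_%d.xml" % i)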

  9. Parametric study of emerging high power accelerator applications using Accelerator Systems Model (ASM)

    International Nuclear Information System (INIS)

    Berwald, D.H.; Mendelsohn, S.S.; Myers, T.J.; Paulson, C.C.; Peacock, M.A.; Piaszczyk, C.M.; Rathke, J.W.; Piechowiak, E.M.

    1996-01-01

    Emerging applications for high power rf linacs include fusion materials testing, generation of intense spallation neutrons for neutron physics and materials studies, production of nuclear materials and destruction of nuclear waste. Each requires the selection of an optimal configuration and operating parameters for its accelerator, rf power system and other supporting subsystems. Because of the high cost associated with these facilities, economic considerations become paramount, dictating a full evaluation of the electrical and rf performance, system reliability/availability, and capital, operating, and life cycle costs. The Accelerator Systems Model (ASM), expanded and modified by Northrop Grumman during 1993-96, provides a unique capability for detailed layout and evaluation of a wide variety of normal and superconducting accelerator and rf power configurations. This paper will discuss the current capabilities of ASM, including the available models and data base, and types of trade studies that can be performed for the above applications. (author)

  10. Transient accelerating scalar models with exponential potentials

    International Nuclear Information System (INIS)

    Cui Wen-Ping; Zhang Yang; Fu Zheng-Wen

    2013-01-01

    We study a known class of scalar dark energy models in which the potential has an exponential term and the current accelerating era is transient. We find that, although a decelerating era will return in the future, when extrapolating the model back to earlier stages (z ≳ 4), scalar dark energy becomes dominant over matter. So these models do not have the desired tracking behavior, and the predicted transient period of acceleration cannot be adopted into the standard scenario of the Big Bang cosmology. When couplings between the scalar field and matter are introduced, the models still have the same problem; only the time when deceleration returns will be varied. To achieve re-deceleration, one has to turn to alternative models that are consistent with the standard Big Bang scenario.
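
    For orientation, the prototypical single-exponential quintessence potential can be written down explicitly; this is the generic textbook form, not necessarily the exact potential of the paper, whose model adds further structure to make the acceleration transient:

        \[ V(\phi) = V_0 \, e^{-\lambda \phi / M_{\rm P}} \]

    For this potential the scalar-field-dominated attractor has equation of state w_phi = -1 + lambda^2/3, so accelerated expansion requires lambda^2 < 2, while for lambda^2 > 3(1 + w_m) the field instead scales with the background fluid of equation of state w_m. The tension the authors describe is between these two regimes: parameter choices that give late-time acceleration spoil the desired early-time tracking.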

  11. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    Science.gov (United States)

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth…

  12. Monitoring the grid with the Globus Toolkit MDS4

    International Nuclear Information System (INIS)

    Schopf, Jennifer M; Pearlman, Laura; Miller, Neill; Kesselman, Carl; Foster, Ian; D'Arcy, Mike; Chervenak, Ann

    2006-01-01

    The Globus Toolkit Monitoring and Discovery System (MDS4) defines and implements mechanisms for service and resource discovery and monitoring in distributed environments. MDS4 is distinguished from previous similar systems by its extensive use of interfaces and behaviors defined in the WS-Resource Framework and WS-Notification specifications, and by its deep integration into essentially every component of the Globus Toolkit. We describe the MDS4 architecture and the Web service interfaces and behaviors that allow users to discover resources and services, monitor resource and service states, receive updates on current status, and visualize monitoring results. We present two current deployments to provide insights into the functionality that can be achieved via the use of these mechanisms

  13. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for VO developers. VO architecture greatly depends on Grid and Web services; consequently, the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and the possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.

  14. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-08-14

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  15. Midwives in medical student and resident education and the development of the medical education caucus toolkit.

    Science.gov (United States)

    Radoff, Kari; Nacht, Amy; McConaughey, Edie; Salstrom, Jan; Schelling, Karen; Seger, Suzanne

    2015-01-01

    Midwives have been involved formally and informally in the training of medical students and residents for many years. Recent reductions in resident work hours, emphasis on collaborative practice, and a focus on midwives as key members of the maternity care model have increased the involvement of midwives in medical education. Midwives work in academic settings as educators to teach the midwifery model of care, collaboration, teamwork, and professionalism to medical students and residents. In 2009, members of the American College of Nurse-Midwives formed the Medical Education Caucus (MECA) to discuss the needs of midwives teaching medical students and residents; the group has held a workshop annually over the last 4 years. In 2014, MECA workshop facilitators developed a toolkit to support and formalize the role of midwives involved in medical student and resident education. The MECA toolkit provides a roadmap for midwives beginning involvement and continuing or expanding the role of midwives in medical education. This article describes the history of midwives in medical education, the development and growth of MECA, and the resulting toolkit created to support and formalize the role of midwives as educators in medical student and resident education, as well as common challenges for the midwife in academic medicine. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health. © 2015 by the American College of Nurse-Midwives.

  16. XPIWIT--an XML pipeline wrapper for the Insight Toolkit.

    Science.gov (United States)

    Bartschat, Andreas; Hübner, Eduard; Reischl, Markus; Mikut, Ralf; Stegmaier, Johannes

    2016-01-15

    The Insight Toolkit offers plenty of features for multidimensional image analysis. Current implementations, however, often suffer either from a lack of flexibility due to hard-coded C++ pipelines for a certain task or from slow execution times, e.g. caused by inefficient implementations or multiple read/write operations for separate filter execution. We present an XML-based wrapper application for the Insight Toolkit that combines the performance of a pure C++ implementation with an easy-to-use graphical setup of dynamic image analysis pipelines. Created XML pipelines can be interpreted and executed by XPIWIT in console mode, either locally or on large clusters. We successfully applied the software tool for the automated analysis of terabyte-scale, time-resolved 3D image data of zebrafish embryos. XPIWIT is implemented in C++ using the Insight Toolkit and the Qt SDK. It has been successfully compiled and tested under Windows and Unix-based systems. Software and documentation are distributed under the Apache 2.0 license and are publicly available for download at https://bitbucket.org/jstegmaier/xpiwit/downloads/.
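
    XPIWIT pipelines are declared in XML rather than written as code; purely as a rough Python analogue of the kind of chained ITK filter graph being wrapped, a SimpleITK sketch might look like the following (the filter choice and file names are illustrative, not taken from XPIWIT):

        # Illustrative ITK-style filter chain via SimpleITK, mimicking the sort
        # of pipeline XPIWIT describes in XML. File names are hypothetical.
        import SimpleITK as sitk

        image = sitk.ReadImage("embryo_t001.tif")
        smoothed = sitk.DiscreteGaussian(image, 2.0)   # Gaussian denoising
        binary = sitk.OtsuThreshold(smoothed, 0, 1)    # foreground segmentation
        sitk.WriteImage(binary, "embryo_t001_seg.tif")

    The point of the XML layer is that exactly this chain can be re-ordered or re-parameterized without recompiling any C++.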

  17. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    Science.gov (United States)

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.
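
    PAGANI computes its metrics on a hybrid CPU-GPU back end; to fix ideas about what the nodal metrics are, here is a toy single-threaded equivalent in Python using networkx, with a small random graph standing in for a voxel-wise brain network (real networks in the study have ~200,000 nodes, which is precisely why the GPU version exists):

        # Toy illustration of the graph metrics PAGANI parallelizes; a sparse
        # random graph stands in for a voxel-based connectome.
        import networkx as nx

        g = nx.erdos_renyi_graph(n=500, p=0.02, seed=42)

        clustering = nx.average_clustering(g)        # small-world ingredient
        betweenness = nx.betweenness_centrality(g)   # nodal hub metric
        hubs = sorted(betweenness, key=betweenness.get, reverse=True)[:10]

        print("mean clustering = %.3f, top hubs = %s" % (clustering, hubs))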

  18. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes a tasklist as input. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with calls to the hardware model's time_compute() function, passing tasklists that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly include the L1, L2, L3 hit-rates as inputs to the tasklists. Explicit hit-rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
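
    The tasklist/time_compute() interface described above can be pictured with a deliberately simplified sketch in Python (the language of Simian); the task encoding and every number below are invented for illustration and are not PPT's actual data model:

        # Simplified picture of a PPT-style core model: a tasklist of operation
        # counts is translated into a predicted runtime. All numbers invented.
        CORE = {
            "clock_hz": 2.3e9,
            "cycles_per_alu_op": 1.0,
            "cycles_per_mem_access": 20.0,  # stand-in for the cache-hierarchy model
        }

        def time_compute(tasklist, core=CORE):
            """Predicted seconds for an unordered list of (kind, count) tasks."""
            cycles = 0.0
            for kind, count in tasklist:
                if kind == "alu":
                    cycles += count * core["cycles_per_alu_op"]
                elif kind == "mem":
                    cycles += count * core["cycles_per_mem_access"]
            return cycles / core["clock_hz"]

        # The application model keeps the loop structure and replaces each
        # kernel with a tasklist handed to time_compute().
        kernel = [("alu", 5_000_000), ("mem", 1_200_000)]
        print("predicted kernel time: %.6f s" % time_compute(kernel))

    The real difficulty, as the abstract notes, sits inside the "mem" line: turning virtual memory accesses into per-level cache hits and misses rather than a single average cost.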

  19. Online modeling of the Fermilab accelerators

    International Nuclear Information System (INIS)

    McCrory, Elliott S.; Michelotti, Leo; Ostiguy, Jean-Francois

    2001-01-01

    We have implemented access to beam physics models of the Fermilab accelerators and beamlines through the Fermilab control system. The models run on Unix workstations, communicating with legacy controls software through a front end redirection mechanism (the open access server), a relational database and a simple text-based protocol over TCP/IP. The clients and the server are implemented in object-oriented C++. We discuss limitations of our approach and the difficulties that arise from it. Some of the obstacles may be overcome by introducing a new layer of abstraction. To maintain compatibility with the next generation of accelerator control software currently under development at the laboratory, this layer would be implemented in Java. We discuss the implications of that choice

  20. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control...

  1. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    Energy Technology Data Exchange (ETDEWEB)

    Arguello, Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
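
    The abstract does not reproduce MSC's actual formulation; as a toy of the same flavor (pick generator and PV capacities to cover a peak load at minimum capital cost), a sketch using the pulp MILP library is shown below. Every cost, limit, and derating factor is invented.

        # Toy microgrid sizing MILP in the spirit of MDT's MSC; all numbers invented.
        from pulp import LpProblem, LpMinimize, LpVariable, LpBinary

        prob = LpProblem("toy_microgrid_sizing", LpMinimize)

        gen_kw = LpVariable("gen_kw", lowBound=0)             # genset capacity
        pv_kw = LpVariable("pv_kw", lowBound=0, upBound=200)  # PV capacity
        build_pv = LpVariable("build_pv", cat=LpBinary)       # fixed-cost trigger

        peak_load_kw = 150
        prob += 800 * gen_kw + 1200 * pv_kw + 15000 * build_pv  # capital cost
        prob += gen_kw + 0.3 * pv_kw >= peak_load_kw  # meet peak (PV derated)
        prob += pv_kw <= 200 * build_pv               # PV sized only if built

        prob.solve()
        print("gen = %.1f kW, pv = %.1f kW" % (gen_kw.value(), pv_kw.value()))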

  2. Modeling of thermalization phenomena in coaxial plasma accelerators

    Science.gov (United States)

    Subramaniam, Vivek; Panneerchelvam, Premkumar; Raja, Laxminarayan L.

    2018-05-01

    Coaxial plasma accelerators are electromagnetic acceleration devices that employ a self-induced Lorentz force to produce collimated plasma jets with velocities of ~50 km/s. The accelerator operation is characterized by the formation of an ionization/thermalization zone near the gas inlet of the device that continually processes the incoming neutral gas into a highly ionized thermal plasma. In this paper, we present a 1D non-equilibrium plasma model to resolve the plasma formation and the electron-heavy-species thermalization phenomena that take place in the thermalization zone. The non-equilibrium model is based on a self-consistent multi-species continuum description of the plasma with finite-rate chemistry. The thermalization zone is modelled by tracking a 1D gas-bit as it convects down the device with an initial gas pressure of 1 atm. The thermalization process occurs in two stages. The first is a plasma production stage, associated with a rapid increase in the charged-species number densities facilitated by cathode surface electron emission and volumetric production processes. The production stage results in the formation of a two-temperature plasma with electron energies of ~2.5 eV in a low-temperature background gas of ~300 K. The second, a temperature equilibration stage, is characterized by the energy transfer between the electrons and heavy species. The characteristic length scale for thermalization is found to be comparable to the axial length of the accelerator, thus putting into question the equilibrium magnetohydrodynamics assumption used in modeling coaxial accelerators.

  3. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an easy-to-use ethnographic toolkit, Contextual Design, by a computer firm in the initial stages of the development of a health care system.

  4. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this

  5. eVITAL: A Preliminary Taxonomy and Electronic Toolkit of Health-Related Habits and Lifestyle

    Directory of Open Access Journals (Sweden)

    Luis Salvador-Carulla

    2012-01-01

    Objectives. To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. Methods. From 2003-2009, a working group (n=6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n=29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. Results. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. Conclusions. We present the first taxonomy of HrH, which may aid the development of new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model.

  6. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high-fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on Github.

  7. UniSchooLabs Toolkit: Tools and Methodologies to Support the Adoption of Universities’ Remote and Virtual Labs in Schools

    Directory of Open Access Journals (Sweden)

    Augusto Chioccariello

    2012-11-01

    The UniSchooLabs project aims at creating an infrastructure supporting web access to remote/virtual labs and associated educational resources to engage learners with hands-on and minds-on activities in science, technology and math in schools. The UniSchooLabs toolkit supports the teacher in selecting a remote or virtual lab and developing a lab activity based on an inquiry model template. While working with the toolkit the teacher has access to three main features: (a) a catalogue of available online laboratories; (b) an archive of activities created by other users; (c) a tool for creating new activities or reusing existing ones.

  8. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    Science.gov (United States)

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help you increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  9. A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities

    Energy Technology Data Exchange (ETDEWEB)

    Todd, A.M.M.; Paulson, C.C.; Peacock, M.A. [Grumman Research and Development Center, Princeton, NJ (United States)]; and others

    1995-10-01

    A beamline systems code, that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies, is described. The overall program is a joint Grumman, G.H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities.

  10. A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities

    International Nuclear Information System (INIS)

    Todd, Alan M. M.; Paulson, C. C.; Peacock, M. A.; Reusch, M. F.

    1995-01-01

    A beamline systems code, that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies, is described. The overall program is a joint Grumman, G. H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities

  11. Peer support for families of children with complex needs: Development and dissemination of a best practice toolkit.

    Science.gov (United States)

    Schippke, J; Provvidenza, C; Kingsnorth, S

    2017-11-01

    Benefits of peer support interventions for families of children with disabilities and complex medical needs have been described in the literature. An opportunity to create an evidence-informed resource to synthesize best practices in peer support for program providers was identified. The objective of this paper is to describe the key activities used to develop and disseminate the Peer Support Best Practice Toolkit. This project was led by a team of knowledge translation experts at a large pediatric rehabilitation hospital using a knowledge exchange framework. An integrated knowledge translation approach was used to engage stakeholders in the development process through focus groups and a working group. To capture best practices in peer support, a rapid evidence review and review of related resources were completed. Case studies were also included to showcase practice-based evidence. The toolkit is freely available online for download and is structured into four sections: (a) background and models of peer support, (b) case studies of programs, (c) resources, and (d) rapid evidence review. A communications plan was developed to disseminate the resource and generate awareness through presentations, social media, and champion engagement. Eight months postlaunch, the peer support website received more than 2,400 webpage hits. Early indicators suggest high relevance of this resource among stakeholders. The toolkit format was valuable to synthesize and share best practices in peer support. Strengths of the work include the integrated approach used to develop the toolkit and the inclusion of both the published research literature and experiential evidence. © 2017 John Wiley & Sons Ltd.

  12. Guest editors' introduction to the 4th issue of Experimental Software and Toolkits (EST-4)

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Kienle, H.M.; Mens, K.

    2014-01-01

    Experimental software and toolkits play a crucial role in computer science. Elsevier’s Science of Computer Programming special issues on Experimental Software and Toolkits (EST) provide a means for academic tool builders to get more visibility and credit for their work, by publishing a paper along

  13. Numerical relativity in spherical coordinates with the Einstein Toolkit

    Science.gov (United States)

    Mewes, Vassilios; Zlochower, Yosef; Campanelli, Manuela; Ruchlin, Ian; Etienne, Zachariah B.; Baumgarte, Thomas W.

    2018-04-01

    Numerical relativity codes that do not make assumptions on spatial symmetries most commonly adopt Cartesian coordinates. While these coordinates have many attractive features, spherical coordinates are much better suited to take advantage of approximate symmetries in a number of astrophysical objects, including single stars, black holes, and accretion disks. While the appearance of coordinate singularities often spoils numerical relativity simulations in spherical coordinates, especially in the absence of any symmetry assumptions, it has recently been demonstrated that these problems can be avoided if the coordinate singularities are handled analytically. This is possible with the help of a reference-metric version of the Baumgarte-Shapiro-Shibata-Nakamura formulation together with a proper rescaling of tensorial quantities. In this paper we report on an implementation of this formalism in the Einstein Toolkit. We adapt the Einstein Toolkit infrastructure, originally designed for Cartesian coordinates, to handle spherical coordinates, by providing appropriate boundary conditions at both inner and outer boundaries. We perform numerical simulations for a disturbed Kerr black hole, extract the gravitational wave signal, and demonstrate that the noise in these signals is orders of magnitude smaller when computed on spherical grids rather than Cartesian grids. With the public release of our new Einstein Toolkit thorns, our methods for numerical relativity in spherical coordinates will become available to the entire numerical relativity community.
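
    Schematically, the singularity handling works as follows (the notation below follows the common reference-metric BSSN literature rather than the Toolkit's own variable names, so treat it as an illustration): in spherical coordinates the flat reference metric is

        \[ \hat{\gamma}_{ij} = \mathrm{diag}\left( 1, \; r^2, \; r^2 \sin^2\theta \right), \]

    and tensorial evolution variables are rescaled by the corresponding scale factors s_i = (1, r, r sin(theta)), for example

        \[ \bar{\gamma}_{ij} = \hat{\gamma}_{ij} + s_i \, s_j \, \tilde{\varepsilon}_{ij} \qquad \text{(no summation)}, \]

    so that the evolved components \tilde{\varepsilon}_{ij} remain regular at r = 0 and on the polar axis, where the explicit factors of r and sin(theta) would otherwise produce coordinate singularities.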

  14. Accelerating transition dynamics in city regions: A qualitative modeling perspective

    NARCIS (Netherlands)

    P.J. Valkering (Pieter); Yücel, G. (Gönenç); Gebetsroither-Geringer, E. (Ernst); Markvica, K. (Karin); Meynaerts, E. (Erika); N. Frantzeskaki (Niki)

    2017-01-01

    In this article, we take stock of the findings from conceptual and empirical work on the role of transition initiatives for accelerating transitions as input for modeling acceleration dynamics. We applied the qualitative modeling approach of causal loop diagrams to capture the dynamics

  15. A toolkit for promoting healthy ageing

    OpenAIRE

    Knevel, Jeroen; Gruppen, Aly

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience, preventing loneliness and social participation. Besides some concise background information, we offer you a great diversity of exercises per theme which can help you discuss, assess, change or strengt...

  16. Gpufit: An open-source toolkit for GPU-accelerated curve fitting.

    Science.gov (United States)

    Przybylski, Adrian; Thiel, Björn; Keller-Findeisen, Jan; Stock, Bernd; Bates, Mark

    2017-11-16

    We present a general purpose, open-source software library for estimation of non-linear parameters by the Levenberg-Marquardt algorithm. The software, Gpufit, runs on a Graphics Processing Unit (GPU) and executes computations in parallel, resulting in a significant gain in performance. We measured a speed increase of up to 42 times when comparing Gpufit with an identical CPU-based algorithm, with no loss of precision or accuracy. Gpufit is designed such that it is easily incorporated into existing applications or adapted for new ones. Multiple software interfaces, including to C, Python, and Matlab, ensure that Gpufit is accessible from most programming environments. The full source code is published as an open source software repository, making its function transparent to the user and facilitating future improvements and extensions. As a demonstration, we used Gpufit to accelerate an existing scientific image analysis package, yielding significantly improved processing times for super-resolution fluorescence microscopy datasets.
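
    The abstract lists Python among Gpufit's bindings; a plausible usage sketch follows, but the module path, enum names, and return signature here are assumptions rather than quotes from the Gpufit documentation:

        # Hedged sketch of batch 1D-Gaussian fitting via Gpufit's Python binding.
        # Module path, enums, and return tuple are assumed, not documented here.
        import numpy as np
        import pygpufit.gpufit as gf

        n_fits, n_points = 10_000, 64
        x = np.arange(n_points, dtype=np.float32)
        truth = 80.0 * np.exp(-(x - 32.0) ** 2 / (2 * 5.0 ** 2)) + 10.0
        data = np.random.poisson(truth, size=(n_fits, n_points)).astype(np.float32)
        initial = np.tile(np.array([70, 30, 4, 8], dtype=np.float32), (n_fits, 1))

        params, states, chi2, n_iter, runtime = gf.fit(
            data, None, gf.ModelID.GAUSS_1D, initial,
            estimator_id=gf.EstimatorID.LSE)
        print("fitted %d curves in %.3f s on the GPU" % (n_fits, runtime))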

  17. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study.

    Science.gov (United States)

    Lee, Lisa; Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-10-01

    To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Questionnaire-based survey of attendees at a national ePrescribing symposium. 2013 National ePrescribing Symposium in London, UK. Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals' experiences (n = 45; 64.3%) were considered the most useful types of content. There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning.

  18. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    ... relevant to reducing air pollution from oil and natural gas production and processing. The Department of... environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to.... technologies. The Toolkit will support the President's National Export Initiative by fostering export...

  19. Upgrading the safety toolkit: Initiatives of the accident analysis subgroup

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Chung, D.Y.

    1999-01-01

    Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting the development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven subgroups chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental in the formal evaluation of computer models, in improving the pedigree of high-use computer models, and in the development of the user-friendly Accident Analysis Guidebook (AAG). All of these efforts have strengthened the analytical toolkit for complying with the DOE orders and standards that shape safety analysis reports (SARs) and related documentation. Major support for these objectives has been through DOE/DP-45

  20. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  1. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  2. Physical models implemented in the Geant4-DNA extension of the Geant-4 toolkit for calculating initial radiation damage at the molecular level

    International Nuclear Information System (INIS)

    Villagrasa, C.; Francis, Z.; Incerti, S.

    2011-01-01

    The ROSIRIS project aims to study the radiobiology of integrated systems for medical treatment optimisation using ionising radiations and evaluate the associated risk. In the framework of this project, one research focus is the interpretation of the initial radio-induced damage in DNA created by ionising radiation (and detected by γH2AX foci analysis) from the track structure of the incident particles. In order to calculate the track structure of ionising particles at a nano-metric level, the Geant4 Monte Carlo toolkit was used. Geant4 (Object Oriented Programming Architecture in C++) offers a common platform, available free to all users and relatively easy to use. Nevertheless, the current low-energy threshold for electromagnetic processes in GEANT4 is set to 1 keV (250 eV using the Livermore processes), which is an unsuitable value for nano-metric applications. To lower this energy threshold, the necessary interaction processes and models were identified, and the corresponding available cross sections collected from the literature. They are mostly based on the plane-wave Born approximation (first Born approximation, or FBA) for inelastic interactions and on semi-empirical models for energies where the FBA fails (at low energies). In this paper, the extensions that have been introduced into the 9.3 release of the Geant4 toolkit are described, the so-called Geant4-DNA extension, including a set of processes and models adapted in this study and permitting the simulation of electron (8 eV-1 MeV), proton (100 eV-100 MeV) and alpha particle (1 keV-10 MeV) interactions in liquid water. (authors)

  3. The Best Ever Alarm System Toolkit

    International Nuclear Information System (INIS)

    Kasemir, Kay; Chen, Xihui; Danilova, Ekaterina N.

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH) as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUI), and tools to annunciate alarms or log alarm-related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives end users various ways to view alarms in tree and table form, but also makes it easy to access guidance information, related operator displays and other CSS tools. It also allows the configuration to be modified online from the GUI. Coupled with a good 'alarm philosophy' on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.

  4. A Genetic Toolkit for Dissecting Dopamine Circuit Function in Drosophila

    Directory of Open Access Journals (Sweden)

    Tingting Xie

    2018-04-01

    Summary: The neuromodulator dopamine (DA) plays a key role in motor control, motivated behaviors, and higher-order cognitive processes. Dissecting how these DA neural networks tune the activity of local neural circuits to regulate behavior requires tools for manipulating small groups of DA neurons. To address this need, we assembled a genetic toolkit that allows for an exquisite level of control over the DA neural network in Drosophila. To further refine the targeting of specific DA neurons, we also created reagents that allow for the conversion of any existing GAL4 line into Split GAL4 or GAL80 lines. We demonstrated how this toolkit can be used with recently developed computational methods to rapidly generate additional reagents for manipulating small subsets or individual DA neurons. Finally, we used the toolkit to reveal a dynamic interaction between a small subset of DA neurons and rearing conditions in a social space behavioral assay. The rapid analysis of how dopaminergic circuits regulate behavior is limited by the genetic tools available to target and manipulate small numbers of these neurons. Xie et al. present genetic tools in Drosophila that allow rational targeting of sparse dopaminergic neuronal subsets and selective knockdown of dopamine signaling. Keywords: dopamine, genetics, behavior, neural circuits, neuromodulation, Drosophila

  5. Modeling high-power RF accelerator cavities with SPICE

    International Nuclear Information System (INIS)

    Humphries, S. Jr.

    1992-01-01

    The dynamical interactions between RF accelerator cavities and high-power beams can be treated on personal computers using a lumped circuit element model and the SPICE circuit analysis code. Applications include studies of wake potentials, two-beam accelerators, microwave sources, and transverse mode damping. This report describes the construction of analogs for TM mn0 modes and the creation of SPICE input for cylindrical cavities. The models were used to study continuous generation of kA electron beam pulses from a vacuum cavity driven by a high-power RF source
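
    The cavity-to-circuit mapping behind such models is standard: a single TM mode with resonant frequency f0, shunt impedance ratio R/Q, and quality factor Q behaves like a parallel RLC section driven by the beam current. A short Python sketch of that bookkeeping (example numbers invented, not taken from the report) is:

        # Lumped-element analog of one cavity mode: a parallel RLC with
        # C = 1/(w0*(R/Q)), L = (R/Q)/w0, R = Q*(R/Q). Numbers invented.
        import math

        f0 = 1.3e9        # resonant frequency [Hz]
        r_over_q = 100.0  # shunt impedance ratio R/Q [ohm]
        q_factor = 2.5e4  # quality factor

        w0 = 2 * math.pi * f0
        C = 1.0 / (w0 * r_over_q)   # farad
        L = r_over_q / w0           # henry
        R = q_factor * r_over_q     # ohm

        print("C = %.3e F, L = %.3e H, R = %.3e ohm" % (C, L, R))
        # These values drop directly into a SPICE deck as a parallel R-L-C
        # branch; one branch per TMmn0 mode reproduces a multi-mode cavity.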

  6. BioPig: a Hadoop-based analytic toolkit for large-scale sequence data.

    Science.gov (United States)

    Nordberg, Henrik; Bhatia, Karan; Wang, Kai; Wang, Zhong

    2013-12-01

    The recent revolution in sequencing technologies has led to an exponential growth of sequence data. As a result, most of the current bioinformatics tools become obsolete as they fail to scale with data. To tackle this 'data deluge', here we introduce the BioPig sequence analysis toolkit as one of the solutions that scale to data and computation. We built BioPig on the Apache's Hadoop MapReduce system and the Pig data flow language. Compared with traditional serial and MPI-based algorithms, BioPig has three major advantages: first, BioPig's programmability greatly reduces development time for parallel bioinformatics applications; second, testing BioPig with up to 500 Gb sequences demonstrates that it scales automatically with size of data; and finally, BioPig can be ported without modification on many Hadoop infrastructures, as tested with Magellan system at National Energy Research Scientific Computing Center and the Amazon Elastic Compute Cloud. In summary, BioPig represents a novel program framework with the potential to greatly accelerate data-intensive bioinformatics analysis.
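
    BioPig's scripts are written in Pig Latin over Hadoop; purely to fix ideas about the kind of embarrassingly parallel kernel it distributes, here is a toy sequential k-mer counter in Python. BioPig's contribution is that Hadoop shards exactly this loop across a cluster, not the kernel itself.

        # Toy sequential k-mer counter; the Hadoop/Pig layer of BioPig is what
        # lets this kind of kernel scale to hundreds of gigabases of reads.
        from collections import Counter

        def count_kmers(reads, k=21):
            counts = Counter()
            for read in reads:
                for i in range(len(read) - k + 1):
                    counts[read[i:i + k]] += 1
            return counts

        reads = ["ACGTACGTACGTACGTACGTACGT", "TTGCAACGTACGTACGTACGTACG"]
        print(count_kmers(reads, k=21).most_common(3))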

  7. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  8. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  9. A cosmology forecast toolkit — CosmoLib

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhiqi, E-mail: zqhuang@cita.utoronto.ca [CEA, Institut de Physique Théorique, Orme des Merisiers, Saint-Aubin, 91191 Gif-sur-Yvette Cédex (France)

    2012-06-01

    The package CosmoLib is a combination of a cosmological Boltzmann code and a simulation toolkit to forecast the constraints on cosmological parameters from future observations. In this paper we describe the released linear-order part of the package. We discuss the stability and performance of the Boltzmann code, which is written in the Newtonian gauge and includes dark energy perturbations. In CosmoLib, the integrator that computes the CMB angular power spectrum is optimized for an l-by-l brute-force integration, which is useful for studying inflationary models predicting sharp features in the primordial power spectrum of metric fluctuations. As an application, CosmoLib is used to study the axion monodromy inflation model, which predicts cosine oscillations in the primordial power spectrum. In contrast to the previous studies by Aich et al. and Meerburg et al., we found no detection or hint of the oscillations. We point out that the CAMB code modified by Aich et al. does not have sufficient numerical accuracy. CosmoLib and its documentation are available at http://www.cita.utoronto.ca/~zqhuang/CosmoLib.
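
    The "cosine oscillations" being fitted are conventionally parametrized as a log-periodic modulation of the power-law primordial spectrum; the notation below is the common one from the axion monodromy literature and may differ in detail from the paper's:

        \[ P_{\mathcal{R}}(k) = A_s \left( \frac{k}{k_*} \right)^{n_s - 1} \left[ 1 + \delta n_s \cos\!\left( \omega \ln \frac{k}{k_*} + \varphi \right) \right] \]

    The extra parameters (the amplitude delta n_s, frequency omega, and phase phi) are what the l-by-l brute-force integration is designed to constrain, since rapid oscillations in k are washed out by coarser sampling of the transfer functions.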

  10. The Data Warehouse Lifecycle Toolkit

    CERN Document Server

    Kimball, Ralph; Thornthwaite, Warren; Mundy, Joy; Becker, Bob

    2011-01-01

    A thorough update to the industry standard for designing, developing, and deploying data warehouse and business intelligence systemsThe world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. In that time, the data warehouse industry has reached full maturity and acceptance, hardware and software have made staggering advances, and the techniques promoted in the premiere edition of this book have been adopted by nearly all data warehouse vendors and practitioners. In addition, the term "business intelligence" emerge

  11. National eHealth strategy toolkit

    CERN Document Server

    2012-01-01

    Worldwide the application of information and communication technologies to support national health-care services is rapidly expanding and increasingly important. This is especially so at a time when all health systems face stringent economic challenges and greater demands to provide more and better care especially to those most in need. The National eHealth Strategy Toolkit is an expert practical guide that provides governments their ministries and stakeholders with a solid foundation and method for the development and implementation of a national eHealth vision action plan and monitoring fram

  12. Effects of a Short Video-Based Resident-as-Teacher Training Toolkit on Resident Teaching.

    Science.gov (United States)

    Ricciotti, Hope A; Freret, Taylor S; Aluko, Ashley; McKeon, Bri Anne; Haviland, Miriam J; Newman, Lori R

    2017-10-01

    To pilot a short video-based resident-as-teacher training toolkit and assess its effect on resident teaching skills in clinical settings. A video-based resident-as-teacher training toolkit was previously developed by educational experts at Beth Israel Deaconess Medical Center, Harvard Medical School. Residents were recruited from two academic hospitals, watched two videos from the toolkit ("Clinical Teaching Skills" and "Effective Clinical Supervision"), and completed an accompanying self-study guide. A novel assessment instrument for evaluating the effect of the toolkit on teaching was created through a modified Delphi process. Before and after the intervention, residents were observed leading a clinical teaching encounter and scored using the 15-item assessment instrument. The primary outcome of interest was the change in number of skills exhibited, which was assessed using the Wilcoxon signed-rank test. Twenty-eight residents from two academic hospitals were enrolled, and 20 (71%) completed all phases of the study. More than one third of residents who volunteered to participate reported no prior formal teacher training. After completing two training modules, residents demonstrated a significant increase in the median number of teaching skills exhibited in a clinical teaching encounter, from 7.5 (interquartile range 6.5-9.5) to 10.0 (interquartile range 9.0-11.5; P<.001). Of the 15 teaching skills assessed, there were significant improvements in asking for the learner's perspective (P=.01), providing feedback (P=.005), and encouraging questions (P=.046). Using a resident-as-teacher video-based toolkit was associated with improvements in teaching skills in residents from multiple specialties.

  13. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit

    Directory of Open Access Journals (Sweden)

    Jon Smart

    2018-02-01

    Introduction: Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. Methods: As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Results: Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Conclusion: Residents from across the world collaborated and convened to reach a consensus on high-yield—and potentially high-impact—lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  14. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit.

    Science.gov (United States)

    Chung, Arlene S; Smart, Jon; Zdradzinski, Michael; Roth, Sarah; Gende, Alecia; Conroy, Kylie; Battaglioli, Nicole

    2018-03-01

    Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Residents from across the world collaborated and convened to reach a consensus on high-yield—and potentially high-impact—lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  15. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)]; Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)]

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, a design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the Structural Simulation Toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
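
    The event-queue core that such a simulator is built on fits in a few lines. The sketch below is a generic discrete-event loop, not SST's actual API, and the message-latency example is invented for illustration:

    ```python
    import heapq

    class Simulator:
        """Toy discrete-event core: a time-ordered heap of (time, seq, action)."""

        def __init__(self):
            self.now = 0.0
            self._queue = []
            self._seq = 0  # tie-breaker so simultaneous events stay FIFO

        def schedule(self, delay, action):
            heapq.heappush(self._queue, (self.now + delay, self._seq, action))
            self._seq += 1

        def run(self):
            while self._queue:
                self.now, _, action = heapq.heappop(self._queue)
                action()

    # Model one rank sending a message that arrives after 10 us of latency.
    sim = Simulator()

    def send():
        print(f"t={sim.now:.1e} s: send")
        sim.schedule(10e-6, lambda: print(f"t={sim.now:.1e} s: recv"))

    sim.schedule(0.0, send)
    sim.run()
    ```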

  16. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Directory of Open Access Journals (Sweden)

    Juan Mateu

    2015-08-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment in which various tangible elements link a virtual world with the real world. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specifically designed to enable teachers themselves to create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  17. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    Science.gov (United States)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.
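
    The coevolving coarse-grained variables mentioned above can be made concrete with the most common CG map, a mass-weighted centroid per atom group. This is a generic NumPy sketch under that assumption, not ProtoMD's actual interface:

    ```python
    import numpy as np

    def coarse_grain(positions, masses, groups):
        """Map atomistic positions to CG beads, one mass-weighted centroid per group.

        positions : (N, 3) array of atom coordinates
        masses    : (N,) array of atom masses
        groups    : list of index arrays, one per CG bead
        """
        beads = np.empty((len(groups), 3))
        for i, idx in enumerate(groups):
            w = masses[idx] / masses[idx].sum()
            beads[i] = (w[:, None] * positions[idx]).sum(axis=0)
        return beads

    # Two water-like triatomic groups built from six random atoms.
    rng = np.random.default_rng(0)
    pos = rng.random((6, 3))
    m = np.array([16.0, 1.0, 1.0, 16.0, 1.0, 1.0])
    print(coarse_grain(pos, m, [np.arange(3), np.arange(3, 6)]))
    ```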

  18. SlideToolkit: an assistive toolset for the histological quantification of whole slide images.

    Directory of Open Access Journals (Sweden)

    Bastiaan G L Nelissen

    The demand for accurate and reproducible phenotyping of a disease trait increases with the rising number of biobanks and genome wide association studies. Detailed analysis of histology is a powerful way of phenotyping human tissues. Nonetheless, purely visual assessment of histological slides is time-consuming and liable to sampling variation, optical illusions, and thereby observer variation, and external validation may be cumbersome. Therefore, within our own biobank, computerized quantification of digitized histological slides is often preferred as a more precise and reproducible, and sometimes more sensitive approach. Relatively few free toolkits are, however, available for fully digitized microscopic slides, usually known as whole slide images. In order to comply with this need, we developed the slideToolkit as a fast method to handle large quantities of low contrast whole slide images using advanced cell detecting algorithms. The slideToolkit has been developed for modern personal computers and high-performance clusters (HPCs) and is available as an open-source project on github.com. We here illustrate the power of slideToolkit by a repeated measurement of 303 digital slides containing CD3 stained (DAB) abdominal aortic aneurysm tissue from a tissue biobank. Our workflow consists of four consecutive steps. In the first step (acquisition), whole slide images are collected and converted to TIFF files. In the second step (preparation), files are organized. The third step (tiles) creates multiple manageable tiles to count. In the fourth step (analysis), tissue is analyzed and results are stored in a data set. Using this method, two consecutive measurements of 303 slides showed an intraclass correlation of 0.99. In conclusion, slideToolkit provides a free, powerful and versatile collection of tools for automated feature analysis of whole slide images to create reproducible and meaningful phenotypic data sets.
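
    The "tiles" step, which makes a huge whole slide image manageable, can be sketched with a plain array standing in for the TIFF data; this is illustrative only, not slideToolkit's code:

    ```python
    import numpy as np

    def make_tiles(image, tile=512):
        """Yield (row, col, tile_array) pieces of a large 2-D image.

        Edge tiles come out smaller; callers can pad them if a detector needs it.
        """
        h, w = image.shape[:2]
        for r in range(0, h, tile):
            for c in range(0, w, tile):
                yield r, c, image[r:r + tile, c:c + tile]

    # A fake 2000x3000 "slide": flag tiles whose mean stain intensity is high.
    slide = np.random.default_rng(1).random((2000, 3000))
    flagged = sum(1 for _, _, t in make_tiles(slide) if t.mean() > 0.5)
    print(f"{flagged} tiles flagged for cell counting")
    ```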

  19. Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation

    Science.gov (United States)

    Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong

    2017-05-01

    Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges of implementing model-based acceleration control are explained, a novel Hammerstein-Wiener representation of engine models is introduced, and, based on the Hammerstein-Wiener model, a nonlinear generalized minimum variance type of optimal control law is derived. A feature of the proposed approach is that it does not require the inversion operation that usually upsets nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
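
    A Hammerstein-Wiener model wraps a linear dynamic block between two static nonlinearities. The toy simulation below illustrates that structure; the saturation and sensor maps are invented, not the paper's identified engine model:

    ```python
    import numpy as np

    def hammerstein_wiener(u, f, g, a=0.9, b=0.1):
        """Simulate y[k] = g(x[k]), with x[k] = a*x[k-1] + b*f(u[k-1]).

        f : static input nonlinearity (e.g. actuator saturation)
        g : static output nonlinearity (e.g. a nonlinear sensor map)
        The recursion in the middle is a first-order linear block.
        """
        x, y = 0.0, []
        for uk in u:
            y.append(g(x))
            x = a * x + b * f(uk)
        return np.array(y)

    u = np.ones(50)                       # step fuel-flow demand
    f = lambda v: np.clip(v, -0.5, 0.5)   # saturating actuator
    g = lambda x: x + 0.2 * x**2          # mildly nonlinear speed sensor
    print(hammerstein_wiener(u, f, g)[-5:])  # approach to steady state
    ```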

  20. The Nuclear Energy Advanced Modeling and Simulation Safeguards and Separations Reprocessing Plant Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    McCaskey, Alex [ORNL]; Billings, Jay Jay [ORNL]; de Almeida, Valmor F [ORNL]

    2011-08-01

    This report details the progress made in the development of the Reprocessing Plant Toolkit (RPTk) for the DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. RPTk is an ongoing development effort intended to provide users with an extensible, integrated, and scalable software framework for the modeling and simulation of spent nuclear fuel reprocessing plants by enabling the insertion and coupling of user-developed physicochemical modules of variable fidelity. The NEAMS Safeguards and Separations IPSC (SafeSeps) and the Enabling Computational Technologies (ECT) supporting program element have partnered to release an initial version of the RPTk with a focus on software usability and utility. RPTk implements a data flow architecture that is the source of the system's extensibility and scalability. Data flows through physicochemical modules sequentially, with each module importing data, evolving it, and exporting the updated data to the next downstream module. This is accomplished through various architectural abstractions designed to give RPTk true plug-and-play capabilities. A simple application of this architecture, as well as RPTk data flow and evolution, is demonstrated in Section 6 with an application consisting of two coupled physicochemical modules. The remaining sections describe this ongoing work in full, from system vision and design inception to full implementation. Section 3 describes the relevant software development processes used by the RPTk development team. These processes allow the team to manage system complexity and ensure stakeholder satisfaction. This section also details the work done on the RPTk "black box" and "white box" models, with a special focus on the separation of concerns between the RPTk user interface and application runtime. Sections 4 and 5 discuss the application runtime component in more detail, and describe the dependencies, behavior, and rigorous testing of its constituent components.
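
    The import-evolve-export data flow that gives RPTk its plug-and-play character can be sketched as a sequential pipeline of interchangeable modules; the module names and efficiencies below are hypothetical:

    ```python
    class Module:
        """A physicochemical stage: take the plant state dict, return an evolved copy."""
        def evolve(self, state):
            raise NotImplementedError

    class Dissolver(Module):
        def evolve(self, state):
            out = dict(state)
            out["dissolved_kg"] = 0.95 * state["fuel_kg"]       # toy 95% dissolution
            return out

    class SolventExtractor(Module):
        def evolve(self, state):
            out = dict(state)
            out["u_product_kg"] = 0.99 * state["dissolved_kg"]  # toy uranium recovery
            return out

    def run_plant(modules, state):
        # Data flows sequentially: each module imports the upstream state,
        # evolves it, and exports the result to the next module downstream.
        for m in modules:
            state = m.evolve(state)
        return state

    print(run_plant([Dissolver(), SolventExtractor()], {"fuel_kg": 100.0}))
    ```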

  1. Modeling accelerator structures and RF components

    International Nuclear Information System (INIS)

    Ko, K.; Ng, C.K.; Herrmannsfeldt, W.B.

    1993-03-01

    Computer modeling has become an integral part of the design and analysis of accelerator structures and RF components. Sophisticated 3D codes, powerful workstations and timely theory support have all contributed to this development. We will describe our modeling experience with these resources and discuss their impact on ongoing work at SLAC. Specific examples from R&D on a future linear collider and a proposed e+e− storage ring will be included.

  2. Graph algorithms in the titan toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  3. ‘Survival’: a simulation toolkit introducing a modular approach for radiobiological evaluations in ion beam therapy

    Science.gov (United States)

    Manganaro, L.; Russo, G.; Bourhaleb, F.; Fausti, F.; Giordanengo, S.; Monaco, V.; Sacchi, R.; Vignati, A.; Cirio, R.; Attili, A.

    2018-04-01

    One major rationale for the application of heavy ion beams in tumour therapy is their increased relative biological effectiveness (RBE). The complex dependencies of the RBE on dose, biological endpoint, position in the field, etc. require the use of biophysical models in treatment planning and clinical analysis. This study aims to introduce new software, named ‘Survival’, to facilitate the radiobiological computations needed in ion therapy. The simulation toolkit was written in C++ and it was developed with a modular architecture in order to easily incorporate different radiobiological models. The following models were successfully implemented: the local effect model (LEM, versions I, II and III) and variants of the microdosimetric-kinetic model (MKM). Different numerical evaluation approaches were also implemented: Monte Carlo (MC) numerical methods and a set of faster analytical approximations. Among the possible applications, the toolkit was used to reproduce the RBE versus LET for different ions (proton, He, C, O, Ne) and different cell lines (CHO, HSG). Intercomparisons between different models (LEM and MKM) and computational approaches (MC and fast approximations) were performed. The developed software could represent an important tool for the evaluation of the biological effectiveness of charged particles in ion beam therapy, in particular when coupled with treatment simulations. Its modular architecture facilitates benchmarking and inter-comparison between different models and evaluation approaches. The code is open source (GPL2 license) and available at https://github.com/batuff/Survival.
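
    Both LEM and MKM ultimately yield linear-quadratic (LQ) survival parameters, so the downstream computation can be sketched as the LQ curve plus an RBE at fixed survival. The coefficients below are invented for illustration, not outputs of 'Survival':

    ```python
    import numpy as np

    def survival(dose, alpha, beta):
        """Linear-quadratic cell survival: S = exp(-alpha*D - beta*D^2)."""
        return np.exp(-alpha * dose - beta * dose**2)

    def dose_at_survival(s_target, alpha, beta):
        """Invert the LQ curve: positive root of beta*D^2 + alpha*D + ln(S) = 0."""
        c = -np.log(s_target)
        return (-alpha + np.sqrt(alpha**2 + 4.0 * beta * c)) / (2.0 * beta)

    # RBE at 10% survival: photon reference vs. a carbon-ion LQ fit (toy values).
    d_photon = dose_at_survival(0.1, alpha=0.15, beta=0.05)
    d_ion = dose_at_survival(0.1, alpha=0.60, beta=0.05)
    print(f"RBE_10 = {d_photon / d_ion:.2f}")
    ```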

  4. Neural Networks for Modeling and Control of Particle Accelerators

    Science.gov (United States)

    Edelen, A. L.; Biedron, S. G.; Chase, B. E.; Edstrom, D.; Milton, S. V.; Stabile, P.

    2016-04-01

    Particle accelerators are host to myriad nonlinear and complex physical phenomena. They often involve a multitude of interacting systems, are subject to tight performance demands, and should be able to run for extended periods of time with minimal interruptions. Oftentimes, traditional control techniques cannot fully meet these requirements. One promising avenue is to introduce machine learning and sophisticated control techniques inspired by artificial intelligence, particularly in light of recent theoretical and practical advances in these fields. Within machine learning and artificial intelligence, neural networks are particularly well-suited to modeling, control, and diagnostic analysis of complex, nonlinear, and time-varying systems, as well as systems with large parameter spaces. Consequently, the use of neural network-based modeling and control techniques could be of significant benefit to particle accelerators. For the same reasons, particle accelerators are also ideal test-beds for these techniques. Many early attempts to apply neural networks to particle accelerators yielded mixed results due to the relative immaturity of the technology for such tasks. The purpose of this paper is to re-introduce neural networks to the particle accelerator community and report on some work in neural network control that is being conducted as part of a dedicated collaboration between Fermilab and Colorado State University (CSU). We describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss some promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system that is being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility, including initial experimental results from a benchmark controller.
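
    As a minimal sketch of the kind of neural-network surrogate such work builds on, the snippet below fits a one-hidden-layer network to a toy nonlinear machine response with plain NumPy; it is unrelated to the actual FAST resonance controller:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy "machine response": nonlinear, noisy mapping from one setting to one reading.
    x = np.linspace(-1, 1, 200)[:, None]
    y = np.sin(2.5 * x) + 0.05 * rng.standard_normal(x.shape)

    # One hidden layer of 16 tanh units, trained by full-batch gradient descent.
    W1, b1 = 0.5 * rng.standard_normal((1, 16)), np.zeros(16)
    W2, b2 = 0.5 * rng.standard_normal((16, 1)), np.zeros(1)
    lr = 0.05
    for _ in range(3000):
        h = np.tanh(x @ W1 + b1)           # forward pass
        pred = h @ W2 + b2
        err = pred - y                     # gradient of 0.5*MSE w.r.t. pred
        gW2, gb2 = h.T @ err / len(x), err.mean(0)
        dh = (err @ W2.T) * (1 - h**2)     # backprop through tanh
        gW1, gb1 = x.T @ dh / len(x), dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    print("final training MSE:", float((err**2).mean()))
    ```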

  5. EasyInterface: A toolkit for rapid development of GUIs for research prototype tools

    OpenAIRE

    Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf

    2017-01-01

    In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and to integrate them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from the command line and its output goes to the standard output, then in a few minutes one can m...
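
    The premise, wrapping any command-line tool that writes to standard output, can be sketched as follows; the wrapped command is a placeholder, and this is not EasyInterface's server code:

    ```python
    import subprocess

    def run_tool(cmd, source_text):
        """Run a command-line research tool on user input and capture its stdout."""
        result = subprocess.run(
            cmd, input=source_text, capture_output=True, text=True, timeout=60
        )
        # A GUI layer would render stdout/stderr in an output console widget.
        return result.returncode, result.stdout, result.stderr

    code, out, err = run_tool(["wc", "-w"], "three little words")
    print(code, out.strip())
    ```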

  6. Accelerator system model (ASM): A unique tool in exploring accelerator driven transmutation technologies (ADTT) system trade space

    Energy Technology Data Exchange (ETDEWEB)

    Myers, T.J.; Favale, A.J.; Berwald, D.H.; Burger, E.C.; Paulson, C.C.; Peacock, M.A.; Piaszczyk, C.M.; Piechowiak, E.M.; Rathke, J.W. [Northrop Grumman Corp., Bethpage, NY (United States). Advanced Technology and Development Center

    1997-09-01

    To aid in the development and optimization of emerging Accelerator Driven Transmutation Technology (ADTT) concepts, the Northrop Grumman Corporation, working together with G.H. Gillespie Associates and Los Alamos National Laboratory, has developed a computational tool which combines accelerator physics layout/analysis capabilities with engineering analysis capabilities to create a standardized platform to compare and contrast accelerator system configurations. In this context, the accelerator system configuration includes not only the accelerating structures, but also the major support systems such as the vacuum, thermal control, RF power, and cryogenic subsystem (if superconducting accelerator operation is investigated) as well as estimates of the costs for enclosures (accelerating tunnel and RF halls). This paper presents an overview of the Accelerator System Model (ASM) code flow, as well as a discussion of the data and analysis upon which it is based. Also presented is material which addresses the development of the evaluation criteria employed by this code, including a presentation of the economic analysis methods, and a discussion of the cost database employed. The paper concludes with examples depicting completed and planned trade studies for both normal and superconducting accelerator applications. 8 figs.

  7. Accelerator simulation and theoretical modelling of radiation effects (SMoRE)

    CERN Document Server

    2018-01-01

    This publication summarizes the findings and conclusions of the IAEA coordinated research project (CRP) on accelerator simulation and theoretical modelling of radiation effects, aimed at supporting Member States in the development of advanced radiation-resistant structural materials for implementation in innovative nuclear systems. This aim can be achieved through enhancement of both experimental neutron-emulation capabilities of ion accelerators and improvement of the predictive efficiency of theoretical models and computer codes. This dual approach is challenging but necessary, because outputs of accelerator simulation experiments need adequate theoretical interpretation, and theoretical models and codes need high dose experimental data for their verification. Both ion irradiation investigations and computer modelling have been the specific subjects of the CRP, and the results of these studies are presented in this publication, which also includes state-of-the-art reviews of four major aspects of the project...

  8. Toolkit for healthcare facility design evaluation - some case studies

    CSIR Research Space (South Africa)

    De Jager, Peta

    2007-10-01

    …themes in approach. Further study is indicated, but preliminary research shows that, whilst these toolkits can be applied to the South African context, there are compelling reasons for them to be adapted. This paper briefly outlines these three case...

  9. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    Science.gov (United States)

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.

  11. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    Science.gov (United States)

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  12. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accurate…

  13. Google Web Toolkit for Ajax

    CERN Document Server

    Perry, Bruce

    2007-01-01

    The Google Web Toolkit (GWT) is a nifty framework that Java programmers can use to create Ajax applications. The GWT allows you to create an Ajax application in your favorite IDE, such as IntelliJ IDEA or Eclipse, using paradigms and mechanisms similar to programming a Java Swing application. After you code the application in Java, the GWT's tools generate the JavaScript code the application needs. You can also use typical Java project tools such as JUnit and Ant when creating GWT applications. The GWT is a free download, and you can freely distribute the client- and server-side code you c…

  14. The Knowledge Translation Toolkit: Bridging the Know–Do Gap: A ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-06-06

    It presents the theories, tools, and strategies required to encourage and enable …

  15. Ice-sheet modelling accelerated by graphics cards

    Science.gov (United States)

    Brædstrup, Christian Fredborg; Damsgaard, Anders; Egholm, David Lundbek

    2014-11-01

    Studies of glaciers and ice sheets have increased the demand for high performance numerical ice flow models over the past decades. When exploring the highly non-linear dynamics of fast flowing glaciers and ice streams, or when coupling multiple flow processes for ice, water, and sediment, researchers are often forced to use super-computing clusters. As an alternative to conventional high-performance computing hardware, the Graphical Processing Unit (GPU) is capable of massively parallel computing while retaining a compact design and low cost. In this study, we present a strategy for accelerating a higher-order ice flow model using a GPU. By applying the newest GPU hardware, we achieve up to 180× speedup compared to a similar but serial CPU implementation. Our results suggest that GPU acceleration is a competitive option for ice-flow modelling when compared to CPU-optimised algorithms parallelised by the OpenMP or Message Passing Interface (MPI) protocols.
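
    Ice-flow models map well to GPUs because every grid cell updates independently. A vectorized NumPy stand-in for such a kernel, here a toy explicit diffusion step for ice thickness (coefficients are illustrative, not the paper's higher-order model):

    ```python
    import numpy as np

    def step_ice(H, D=1.0, dx=1.0, dt=0.1):
        """One explicit step of dH/dt = D * laplacian(H) on a periodic grid.

        Each cell depends only on its neighbors from the previous step, which is
        exactly the data parallelism a GPU kernel (or vectorized NumPy) exploits.
        """
        lap = (np.roll(H, 1, 0) + np.roll(H, -1, 0) +
               np.roll(H, 1, 1) + np.roll(H, -1, 1) - 4.0 * H) / dx**2
        return H + dt * D * lap

    H = np.zeros((256, 256))
    H[96:160, 96:160] = 1000.0   # a square ice dome, in meters
    for _ in range(100):
        H = step_ice(H)
    print(f"max thickness after 100 steps: {H.max():.0f} m")
    ```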

  16. Developing Climate Resilience Toolkit Decision Support Training Section

    Science.gov (United States)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. With the development of this CRT…

  17. The DLESE Evaluation Toolkit Project

    Science.gov (United States)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review…

  18. Evaluating secular acceleration in geomagnetic field model GRIMM-3

    Science.gov (United States)

    Lesur, V.; Wardinski, I.

    2012-12-01

    Secular acceleration of the magnetic field is the rate of change of its secular variation. One of the main results of studying magnetic data collected by the German survey satellite CHAMP was the mapping of field acceleration and its evolution in time. Questions remain about the accuracy of the modeled acceleration and the effect of the applied regularization processes. We have evaluated to what extent the regularization affects the temporal variability of the Gauss coefficients. We also obtained results of temporal variability of the Gauss coefficients where alternative approaches to the usual smoothing norms have been applied for regularization. Except for the dipole term, the secular acceleration of the Gauss coefficients is fairly well described up to spherical harmonic degree 5 or 6. There is no clear evidence from observatory data that the spectrum of this acceleration is underestimated at the Earth surface. Assuming a resistive mantle, the observed acceleration supports a characteristic time scale for the secular variation of the order of 11 years.
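
    Secular acceleration is the second time derivative of the field, and hence of the Gauss coefficients. A sketch with a synthetic coefficient series (not GRIMM-3 data) shows the finite-difference estimate:

    ```python
    import numpy as np

    # Synthetic annual series for one Gauss coefficient (nT): a linear trend
    # (secular variation) plus a quadratic term (constant acceleration of 1.2).
    years = np.arange(2000.0, 2011.0)
    g = -29600.0 + 10.0 * (years - 2000.0) + 0.6 * (years - 2005.0) ** 2

    sv = np.gradient(g, years)   # secular variation, nT/yr
    sa = np.gradient(sv, years)  # secular acceleration, nT/yr^2

    # Interior points, where the central-difference stencil is exact for a quadratic:
    print(f"estimated SA = {sa[2:-2].mean():.2f} nT/yr^2 (constructed value: 1.20)")
    ```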

  19. A survey exploring National Health Service ePrescribing Toolkit use and perceived usefulness amongst English hospitals

    Directory of Open Access Journals (Sweden)

    Kathrin Cresswell

    2017-06-01

    Conclusions: Interactive elements, and learning lessons from early adopter sites that had accumulated experience of implementing systems, were viewed as the most helpful aspects of the ePrescribing Toolkit. The Toolkit now needs to be further developed to facilitate the continuing implementation/optimisation of ePrescribing and other health information technology across the NHS.

  1. On the Radio-emitting Particles of the Crab Nebula: Stochastic Acceleration Model

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Shuta J. [Department of Physics, Faculty of Science and Engineering, Konan University, 8-9-1 Okamoto, Kobe, Hyogo 658-8501 (Japan); Asano, Katsuaki, E-mail: sjtanaka@center.konan-u.ac.jp [Institute for Cosmic Ray Research, The University of Tokyo, 5-1-5 Kashiwa-no-ha, Kashiwa City, Chiba, 277-8582 (Japan)

    2017-06-01

    The broadband emission of pulsar wind nebulae (PWNe) is well described by non-thermal emission from accelerated electrons and positrons. However, the standard shock acceleration model of PWNe does not account for the hard spectrum in radio wavelengths. The origin of the radio-emitting particles is also important to determine the pair production efficiency in the pulsar magnetosphere. Here, we propose a possible resolution for the particle energy distribution in PWNe; the radio-emitting particles are not accelerated at the pulsar wind termination shock but are stochastically accelerated by turbulence inside PWNe. We upgrade our past one-zone spectral evolution model to include the energy diffusion, i.e., the stochastic acceleration, and apply the model to the Crab Nebula. A fairly simple form of the energy diffusion coefficient is assumed for this demonstrative study. For particle injection into the stochastic acceleration process, we consider continuous injection from the supernova ejecta or impulsive injection associated with the supernova explosion. The observed broadband spectrum and the decay of the radio flux are reproduced by tuning the number of particles injected into the stochastic acceleration process. The acceleration timescale and the duration of the acceleration are required to be a few decades and a few hundred years, respectively. Our results imply that some unveiled mechanisms, such as back reaction to the turbulence, are required to make the energies of stochastically and shock-accelerated particles comparable.
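
    Stochastic (second-order Fermi) acceleration amounts to diffusion in energy space. The toy Monte Carlo below random-walks particles in log energy with a constant diffusion coefficient; it illustrates only the mechanism, not the paper's tuned coefficient or injection history:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Random walk in x = ln(E/E0) with diffusion coefficient Dxx (arbitrary units).
    n, steps, dt, Dxx = 100_000, 1000, 1.0, 1e-3
    x = np.zeros(n)  # all particles injected at E0
    for _ in range(steps):
        x += np.sqrt(2.0 * Dxx * dt) * rng.standard_normal(n)

    E = np.exp(x)
    print(f"spread in ln E: {x.std():.2f} (theory {np.sqrt(2 * Dxx * dt * steps):.2f})")
    print(f"mean energy: {E.mean():.2f} x E0")  # <e^x> = e^(sigma^2/2) > 1: net gain
    ```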

  2. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    International Nuclear Information System (INIS)

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  3. Optics Elements for Modeling Electrostatic Lenses and Accelerator Components: III. Electrostatic Deflectors

    International Nuclear Information System (INIS)

    Brown, T.A.; Gillespie, G.H.

    1999-01-01

    Ion-beam optics models for simulating electrostatic prisms (deflectors) of different geometries have been developed for the computer code TRACE 3-D. TRACE 3-D is an envelope (matrix) code, which includes a linear space charge model, that was originally developed to model bunched beams in magnetic transport systems and radiofrequency (RF) accelerators. Several new optical models for a number of electrostatic lenses and accelerator columns have been developed recently that allow the code to be used for modeling beamlines and accelerators with electrostatic components. The new models include a number of options for: (1) Einzel lenses, (2) accelerator columns, (3) electrostatic prisms, and (4) electrostatic quadrupoles. A prescription for setting up the initial beam appropriate to modeling 2-D (continuous) beams has also been developed. The models for electrostatic prisms are described in this paper. The electrostatic prism model options allow the modeling of cylindrical, spherical, and toroidal electrostatic deflectors. The application of these models in the development of ion-beam transport systems is illustrated through the modeling of a spherical electrostatic analyzer as a component of the new low energy beamline at CAMS
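
    In an envelope code such as TRACE 3-D, every element is a first-order transfer matrix. As a textbook-level sketch (not the paper's implementation), the radial-plane matrix of an ideal spherical deflector, whose effective focusing wavenumber is 1/rho:

    ```python
    import numpy as np

    def spherical_deflector(rho, theta):
        """2x2 radial transfer matrix of an ideal spherical electrostatic deflector.

        With wavenumber k = 1/rho, a 180-degree bend gives the familiar
        point-to-point focus of the hemispherical analyzer.
        """
        return np.array([[np.cos(theta), rho * np.sin(theta)],
                         [-np.sin(theta) / rho, np.cos(theta)]])

    def drift(L):
        return np.array([[1.0, L], [0.0, 1.0]])

    # Propagate a (x, x') ray through drift -> 90 degree analyzer -> drift.
    M = drift(0.2) @ spherical_deflector(rho=0.5, theta=np.pi / 2) @ drift(0.2)
    x0 = np.array([1e-3, 0.0])  # 1 mm offset, initially parallel ray
    print("final (x, x'):", M @ x0)
    ```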

  4. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    Science.gov (United States)

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t56 = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current

  5. Evaluation of a server-client architecture for accelerator modeling and simulation

    International Nuclear Information System (INIS)

    Bowling, B.A.; Akers, W.; Shoaee, H.; Watson, W.; Zeijts, J. van; Witherspoon, S.

    1997-01-01

    Traditional approaches to computational modeling and simulation often utilize a batch method for code execution using file-formatted input/output. This method of code implementation was generally chosen for several reasons, including CPU throughput and availability, complexity of the required modeling problem, and presentation of computation results. With the advent of faster computer hardware and the advances in networking and software techniques, other program architectures for accelerator modeling have recently been employed. Jefferson Laboratory has implemented a client/server solution for accelerator beam transport modeling utilizing query-based I/O. The goal of this code is to provide modeling information for control system applications and to serve as a computation engine for general modeling tasks, such as machine studies. This paper compares the batch-execution and client/server architectures, focusing on design and implementation issues, performance, and general utility towards accelerator modeling demands.
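
    The query-based alternative to batch file I/O can be sketched as a small TCP service that keeps the model resident and answers short requests; the protocol and quantity names below are hypothetical, not Jefferson Lab's actual interface:

    ```python
    import socketserver

    MODEL = {"tune_x": 7.62, "tune_y": 7.58, "energy_mev": 45.0}  # toy lattice state

    class QueryHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # One request per line, e.g. "GET tune_x"; reply with the model value.
            for line in self.rfile:
                _, _, key = line.decode().strip().partition(" ")
                self.wfile.write(f"{MODEL.get(key, 'ERR unknown')}\n".encode())

    if __name__ == "__main__":
        with socketserver.TCPServer(("127.0.0.1", 9090), QueryHandler) as srv:
            srv.serve_forever()  # try: echo "GET tune_x" | nc 127.0.0.1 9090
    ```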

  6. Numerical modeling of the plasma ring acceleration experiment

    International Nuclear Information System (INIS)

    Eddleman, J.L.; Hammer, J.H.; Hartman, C.W.

    1987-01-01

    Modeling of the LLNL RACE experiment and its many applications has necessitated the development and use of a wide array of computational tools. The two-dimensional MHD code, HAM, has been used to model the formation of a compact torus plasma ring in a magnetized coaxial gun and its subsequent acceleration by an additional applied toroidal field. Features included in the 2-D calculations are self-consistent models for (1) the time-dependent poloidal field produced by a capacitor bank discharge through a solenoid field coil (located either inside the gun inner electrode or outside the outer gun electrode) and the associated diffusion of magnetic flux through neighboring conductors, (2) gas flow into the gun annular region from a simulated puffed gas valve plenum, (3) formation and motion of a current sheet produced by J x B forces resulting from discharge of the gun capacitor bank through the plasma load between the coaxial gun electrodes, (4) the subsequent stretching and reconnection of the poloidal field lines to form a compact torus plasma ring, and (5) finally the discharge of the accelerator capacitor bank producing an additional toroidal field for acceleration of the plasma ring. The code has been extended to include various models for gas breakdown, plasma anomalous resistivity, and mass entrainment from ablation of electrode material

  7. Merging AI and numerical modeling for accelerator control

    International Nuclear Information System (INIS)

    Schultz, D.E.; Silbar, R.R.

    1987-01-01

    The authors report the beginnings of an experiment to evaluate the power and limitations of artificial intelligence techniques combined with beam-line modeling for solving problems in accelerator control. Using the Knowledge Engineering Environment (KEE) system, they have built a knowledge base that describes the characteristics and the relationships of about 30 devices in a typical accelerator beam line. Each device in the line is categorized and pertinent attributes for each category are defined. Specific values for each device are assigned in the knowledge base to represent static characteristics. Device-specific slots are also provided for dynamic attributes. The definition of these slots reflects the data type and any limitations or restrictions on the range of the data. The authors model relationships between the various beam-line devices using the techniques of rules, active values, and object-oriented models. The knowledge base provides a framework for analyzing faults and offering suggestions to assist in tuning, based on information provided by the accelerator physicists (domain experts) responsible for designing and tuning this beam line. Our knowledge base has a powerful graphical interface. It allows the operator to mouse on the icon for a particular device in the schematic of the beam line and obtain device-specific information and control over that device. The beam optics code Transport is used to model the beam line numerically. 11 refs., 7 figs
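
    The slots, active values, and rules described above translate naturally into object-oriented code. A Python sketch of the pattern (device names invented; KEE itself was a Lisp-based frame system):

    ```python
    class Quadrupole:
        """A beam-line device frame: static slots plus one guarded dynamic slot."""

        def __init__(self, name, max_current):
            self.name = name
            self.max_current = max_current  # static attribute
            self._current = 0.0             # dynamic slot

        @property
        def current(self):
            return self._current

        @current.setter
        def current(self, amps):
            # "Active value": writing the slot triggers attached checking code.
            if not 0.0 <= amps <= self.max_current:
                raise ValueError(f"{self.name}: {amps} A outside 0-{self.max_current} A")
            self._current = amps

    # A tuning rule that fires when a device nears its limit.
    RULES = [
        lambda d: f"{d.name}: near saturation" if d.current > 0.9 * d.max_current else None,
    ]

    q = Quadrupole("QM01", max_current=100.0)
    q.current = 95.0
    print([msg for msg in (rule(q) for rule in RULES) if msg])
    ```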

  8. Accurate modeling of the hose instability in plasma wakefield accelerators

    Science.gov (United States)

    Mehrling, T. J.; Benedetti, C.; Schroeder, C. B.; Martinez de la Ossa, A.; Osterhoff, J.; Esarey, E.; Leemans, W. P.

    2018-05-01

    Hosing is a major challenge for the applicability of plasma wakefield accelerators, and its modeling is therefore of fundamental importance to facilitate future stable and compact plasma-based particle accelerators. In this contribution, we present a new model for the evolution of the plasma centroid, which enables the accurate investigation of the hose instability in the nonlinear blowout regime. It paves the way for more precise and comprehensive studies of hosing, e.g., with drive and witness beams, which were not possible with previous models.

  9. Risk assessment of chemicals in foundries: The International Chemical Toolkit pilot-project

    International Nuclear Information System (INIS)

    Ribeiro, Marcela G.; Filho, Walter R.P.

    2006-01-01

    In Brazil, problems regarding protection from hazardous substances in small-sized enterprises are similar to those observed in many other countries. Looking for a simple tool to assess and control such exposures, FUNDACENTRO started a pilot project in 2005 to implement the International Chemical Control Toolkit. During the series of visits to foundries, it was observed that although many changes have occurred in foundry technology, occupational exposures to silica dust and metal fumes continue to occur, due to a lack of perception of occupational exposure in the work environment. After introducing the Chemical Toolkit concept to the foundry work group, it was possible to show that the activities undertaken to improve the management of chemicals, according to its concept, will support companies in fulfilling government legislation related to chemical management, occupational health and safety, and environmental impact. In the following meetings, the foundry work group and FUNDACENTRO research team will identify 'inadequate work situations'. Based on the Chemical Toolkit, improvement measures will be proposed. Afterwards, a survey will verify the efficiency of those measures in the control of hazards and consequently on the management of chemicals. This step is now in progress.

  10. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    Science.gov (United States)

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

    A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior-generated, peer-elicited 'menu' of values-based indicators, which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Merzari, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Y. Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Thomas, J. W. [Argonne National Lab. (ANL), Argonne, IL (United States); Obabko, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Jain, Rajeev [Argonne National Lab. (ANL), Argonne, IL (United States); Mahadevan, Vijay [Argonne National Lab. (ANL), Argonne, IL (United States); Tautges, Timothy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Solberg, Jerome [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferencz, Robert Mark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Whitesides, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-21

    This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.
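
    The essence of such coupled simulation is iterating the physics modules to a self-consistent state. A toy Picard (fixed-point) iteration between a "neutronics" and a "thermal-expansion" stand-in, with invented feedback coefficients:

    ```python
    def power(expansion):
        """Toy neutronics: core power falls as radial expansion increases leakage."""
        return 100.0 / (1.0 + 5.0 * expansion)

    def expand(p):
        """Toy thermo-mechanics: expansion grows with power (arbitrary units)."""
        return 1e-3 * p

    # Picard iteration: pass fields back and forth until they stop changing.
    p = 100.0
    for it in range(1, 51):
        p_new = power(expand(p))
        if abs(p_new - p) < 1e-10:
            break
        p = p_new

    print(f"converged in {it} iterations: P = {p_new:.4f}, expansion = {expand(p_new):.5f}")
    ```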

  12. ImTK: an open source multi-center information management toolkit

    Science.gov (United States)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development, while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  13. An evaluation of a toolkit for the early detection, management, and control of carbapenemase-producing Enterobacteriaceae: a survey of acute hospital trusts in England.

    Science.gov (United States)

    Coope, C M; Verlander, N Q; Schneider, A; Hopkins, S; Welfare, W; Johnson, A P; Patel, B; Oliver, I

    2018-03-09

    Following hospital outbreaks of carbapenemase-producing Enterobacteriaceae (CPE), Public Health England published a toolkit in December 2013 to promote the early detection, management, and control of CPE colonization and infection in acute hospital settings. To examine awareness, uptake, implementation and usefulness of the CPE toolkit and identify potential barriers and facilitators to its adoption in order to inform future guidance. A cross-sectional survey of National Health Service (NHS) acute trusts was conducted in May 2016. Descriptive analyses and multivariable regression models were conducted, and narrative responses were analysed thematically and informed using behaviour change theory. Most (92%) acute trusts had a written CPE plan. Fewer (75%) reported consistent compliance with screening and isolation of CPE risk patients. Lower prioritization and weaker senior management support for CPE prevention were associated with poorer compliance. Awareness of the CPE toolkit was high, and all trusts with patients infected or colonized with CPE had used the toolkit either as provided (32%), or to inform (65%) their own local CPE plan. Despite this, many respondents (80%) did not believe that the CPE toolkit guidance offered an effective means to prevent CPE or was practical to follow. CPE prevention and control requires robust infection prevention and control (IPC) measures. Successful implementation can be hindered by a complex set of factors related to their practical execution, insufficient resources and a lack of confidence in the effectiveness of the guidance. Future CPE guidance would benefit from substantive user involvement, processes for ongoing feedback, and regular guidance updates. Copyright © 2018 The Healthcare Infection Society. All rights reserved.

  14. Toolkit for local decision makers aims to strengthen environmental sustainability

    CSIR Research Space (South Africa)

    Murambadoro, M

    2011-11-01

    Members of the South African Risk and Vulnerability Atlas were involved in a meeting aimed at the development of a toolkit towards improved integration of climate change into local government's integrated development planning (IDP) process....

  15. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  16. The IGUANA interactive graphics toolkit with examples from CMS and D0

    International Nuclear Information System (INIS)

    Alverson, G.; Osborne, I.; Taylor, L.; Tuura, L.

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. The authors describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. The authors demonstrate the use of IGUANA with several applications built for CMS and D0

  17. Synergia CUDA: GPU-accelerated accelerator modeling package

    International Nuclear Information System (INIS)

    Lu, Q; Amundson, J

    2014-01-01

    Synergia is a parallel, 3-dimensional space-charge particle-in-cell accelerator modeling code. We present our work porting the purely MPI-based version of the code to a hybrid of CPU and GPU computing kernels. The hybrid code uses the CUDA platform in the same framework as the pure MPI solution. We have implemented a lock-free collaborative charge-deposition algorithm for the GPU, as well as other optimizations, including local communication avoidance for GPUs, a customized FFT, and fine-tuned memory access patterns. On a small GPU cluster (up to 4 Tesla C1070 GPUs), our benchmarks exhibit both superior peak performance and better scaling than a CPU cluster with 16 nodes and 128 cores. We also compare the code performance on different GPU architectures, including C1070 Tesla and K20 Kepler.
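
    The deposition kernel at the heart of such a particle-in-cell code can be illustrated with a generic cloud-in-cell sketch (this is not Synergia source code; names and units are ours). On a GPU, the scatter-add below is exactly the step that must become an atomic or lock-free collaborative update, since many threads deposit into the same grid cells concurrently:

    ```python
    # Generic 1-D cloud-in-cell charge deposition (illustrative, not Synergia).
    import numpy as np

    def deposit_cic(positions, charge, grid_min, dx, n_cells):
        """Deposit particle charge onto a 1-D grid with linear weighting."""
        rho = np.zeros(n_cells)
        s = (positions - grid_min) / dx      # position in grid units
        left = np.floor(s).astype(int)       # index of left grid node
        w_right = s - left                   # linear weight to the right node
        # np.add.at accumulates correctly for repeated indices -- the CPU
        # analogue of the per-thread atomicAdd a CUDA deposition kernel needs.
        np.add.at(rho, left, charge * (1.0 - w_right))
        np.add.at(rho, left + 1, charge * w_right)
        return rho / dx

    rho = deposit_cic(np.random.uniform(0.0, 1.0, 10000), 1e-9, 0.0, 0.01, 101)
    ```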

  18. Improving safety on rural local and tribal roads safety toolkit.

    Science.gov (United States)

    2014-08-01

    Rural roadway safety is an important issue for communities throughout the country and presents a challenge for state, local, and Tribal agencies. The Improving Safety on Rural Local and Tribal Roads Safety Toolkit was created to help rural local ...

  19. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    Science.gov (United States)

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. Contact: chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.
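
    As a minimal stand-in for the positive-samples-only idea (not PEA's actual PSOL algorithm, and the feature vectors below are hypothetical), one can train a one-class model on known CMR sites and score unlabeled candidates for CMR-likeness:

    ```python
    # One-class stand-in for positive-samples-only classification (illustrative).
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    positives = rng.normal(1.0, 0.5, size=(200, 8))    # features of known m6A sites
    candidates = rng.normal(0.0, 1.0, size=(1000, 8))  # unlabeled candidate sites

    model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(positives)
    scores = model.decision_function(candidates)       # higher = more CMR-like
    print(f"{(scores > 0).sum()} of {len(candidates)} candidates flagged as CMR-like")
    ```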

  20. Business plans--tips from the toolkit 6.

    Science.gov (United States)

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  1. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30s format is well-suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website; the duration of their session on the website; and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access and use of the U.S. Climate Resilience Toolkit.

  2. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  3. Transition Toolkit 3.0: Meeting the Educational Needs of Youth Exposed to the Juvenile Justice System. Third Edition

    Science.gov (United States)

    Clark, Heather Griller; Mathur, Sarup; Brock, Leslie; O'Cummings, Mindee; Milligan, DeAngela

    2016-01-01

    The third edition of the National Technical Assistance Center for the Education of Neglected or Delinquent Children and Youth's (NDTAC's) "Transition Toolkit" provides updated information on existing policies, practices, strategies, and resources for transition that build on field experience and research. The "Toolkit" offers…

  4. A GIS Software Toolkit for Monitoring Areal Snow Cover and Producing Daily Hydrologic Forecasts using NASA Satellite Imagery, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts. This toolkit will be...

  5. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  6. An extended car-following model considering the acceleration derivative in some typical traffic environments

    Science.gov (United States)

    Zhou, Tong; Chen, Dong; Liu, Weining

    2018-03-01

    Based on the full velocity difference and acceleration car-following model, an extended car-following model is proposed by considering the vehicle’s acceleration derivative. The stability condition is given by applying control theory. Considering some typical traffic environments, the results of theoretical analysis and numerical simulation show that the extended model reproduces the acceleration of a string of vehicles more realistically than previous models in the starting process, the stopping process, and under sudden braking. Meanwhile, traffic jams occur more easily when the coefficient of the vehicle’s acceleration derivative increases, as shown by the space-time evolution of the flow. The results confirm that the vehicle’s acceleration derivative plays an important role in the traffic jamming transition and the evolution of traffic congestion.
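
    A schematic form of such an extension, written in notation assumed here rather than taken verbatim from the paper, adds a jerk (acceleration-derivative) term to the full velocity difference and acceleration family of models:

    ```latex
    \begin{equation}
    \frac{dv_n(t)}{dt}
      = \alpha\bigl[V(\Delta x_n(t)) - v_n(t)\bigr]
      + \lambda\,\Delta v_n(t)
      + \kappa\,a_{n+1}(t)
      + \gamma\,\frac{da_{n+1}(t)}{dt}
    \end{equation}
    % \Delta x_n and \Delta v_n are the headway and velocity difference to the
    % preceding vehicle n+1, V(.) is the optimal velocity function, and \gamma
    % weights the acceleration derivative whose increase promotes jamming.
    ```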

  7. Three-dimensional electromagnetic model of the pulsed-power Z-pinch accelerator

    Directory of Open Access Journals (Sweden)

    D. V. Rose

    2010-01-01

    A three-dimensional, fully electromagnetic model of the principal pulsed-power components of the 26-MA ZR accelerator [D. H. McDaniel et al., in Proceedings of the 5th International Conference on Dense Z-Pinches (AIP, New York, 2002), p. 23] has been developed. This large-scale simulation model tracks the evolution of electromagnetic waves through the accelerator’s intermediate-storage capacitors, laser-triggered gas switches, pulse-forming lines, water switches, triplate transmission lines, and water convolute to the vacuum insulator stack. The insulator-stack electrodes are coupled to a transmission-line circuit model of the four-level magnetically insulated vacuum-transmission-line section and double-post-hole convolute. The vacuum-section circuit model is terminated by a one-dimensional self-consistent dynamic model of an imploding z-pinch load. The simulation results are compared with electrical measurements made throughout the ZR accelerator, and are in good agreement with the data, especially for times until peak load power. This modeling effort demonstrates that 3D electromagnetic models of large-scale, multiple-module, pulsed-power accelerators are now computationally tractable. This, in turn, presents new opportunities for simulating the operation of existing pulsed-power systems used in a variety of high-energy-density-physics and radiographic applications, as well as even higher-power next-generation accelerators before they are constructed.

  8. Dynamic Model for the Z Accelerator Vacuum Section Based on Transmission Line Code

    Institute of Scientific and Technical Information of China (English)

    呼义翔; 雷天时; 吴撼宇; 郭宁; 韩娟娟; 邱爱慈; 王亮平; 黄涛; 丛培天; 张信军; 李岩; 曾正中; 孙铁平

    2011-01-01

    The transmission-line-circuit model of the Z accelerator, developed originally by W. A. Stygar, P. A. Corcoran, et al., is revised. The revised model uses different calculations for the electron loss and flow impedance in the magnetically insulated transmission line system of the Z accelerator before and after magnetic insulation is established. By including electron pressure and zero electric field at the cathode, a closed set of equations is obtained at each time step, and the dynamic shunt resistance (used to represent any electron loss to the anode) and flow impedance are solved for; these have been incorporated into the transmission line code for simulations of the vacuum section of the Z accelerator. Finally, the results are discussed in comparison with earlier findings to show the effectiveness and limitations of the model.

  9. Modelling of control system architecture for next-generation accelerators

    International Nuclear Information System (INIS)

    Liu, Shi-Yao; Kurokawa, Shin-ichi

    1990-01-01

    Functional, hardware and software system architectures define the fundamental structure of control systems. Modelling is a protocol for describing system architecture used in system design. This paper reviews the various modelling approaches adopted over the past ten years and suggests a new modelling approach for next-generation accelerators. (author)

  10. The Identification of Potential Resilient Estuary-based Enterprises to Encourage Economic Empowerment in South Africa: a Toolkit Approach

    Directory of Open Access Journals (Sweden)

    Rebecca Bowd

    2012-09-01

    It has been argued that ecosystem services can be used as the foundation to provide economic opportunities to empower the disadvantaged. The Ecosystem Services Framework (ESF) approach for poverty alleviation, which balances resource conservation and human resource use, has received much attention in the literature. However, few projects have successfully achieved both conservation and economic objectives. This is partly due to there being a hiatus between theory and practice, due to the absence of tools that help make the transition from conceptual frameworks and theory to practical integration of ecosystem services into decision making. To address this hiatus, an existing conceptual framework for analyzing the robustness of social-ecological systems was translated into a practical toolkit to help understand the complexity of social-ecological systems (SES). The toolkit can be used by a diversity of stakeholders as a decision-making aid for assessing ecosystem services supply and demand and associated enterprise opportunities. The toolkit is participatory and combines a generic "top-down" scientific approach with a case-specific "bottom-up" approach. It promotes a shared understanding of the utilization of ecosystem services, which is the foundation of identifying resilient enterprises. The toolkit comprises four steps: (i) ecosystem services supply and demand assessment; (ii) roles identification; (iii) enterprise opportunity identification; and (iv) enterprise risk assessment. It was tested at two estuary study sites. Implementation of the toolkit requires the populating of preprogrammed Excel worksheets through the holding of workshops attended by stakeholders associated with the ecosystems. It was concluded that for an enterprise to be resilient, it must be resilient at an external SES level, which the toolkit addresses, and at an internal business functioning level, e.g., social dynamics among personnel, skills, and literacy

  11. Evaluating Teaching Development Activities in Higher Education: A Toolkit

    Science.gov (United States)

    Kneale, Pauline; Winter, Jennie; Turner, Rebecca; Spowart, Lucy; Hughes, Jane; McKenna, Colleen; Muneer, Reema

    2016-01-01

    This toolkit is developed as a resource for providers of teaching-related continuing professional development (CPD) in higher education (HE). It focuses on capturing the longer-term value and impact of CPD for teachers and learners, and moving away from immediate satisfaction measures. It is informed by the literature on evaluating higher…

  12. Accelerated oxygen-induced retinopathy is a reliable model of ischemia-induced retinal neovascularization.

    Science.gov (United States)

    Villacampa, Pilar; Menger, Katja E; Abelleira, Laura; Ribeiro, Joana; Duran, Yanai; Smith, Alexander J; Ali, Robin R; Luhmann, Ulrich F; Bainbridge, James W B

    2017-01-01

    Retinal ischemia and pathological angiogenesis cause severe impairment of sight. Oxygen-induced retinopathy (OIR) in young mice is widely used as a model to investigate the underlying pathological mechanisms and develop therapeutic interventions. We compared directly the conventional OIR model (exposure to 75% O2 from postnatal day (P) 7 to P12) with an alternative, accelerated version (85% O2 from P8 to P11). We found that accelerated OIR induces similar pre-retinal neovascularization but greater retinal vascular regression that recovers more rapidly. The extent of retinal gliosis is similar but neuroretinal function, as measured by electroretinography, is better maintained in the accelerated model. We found no systemic or maternal morbidity in either model. Accelerated OIR offers a safe, reliable and more rapid alternative model in which pre-retinal neovascularization is similar but retinal vascular regression is greater.

  13. Modelling of post-irradiation accelerated repopulation in squamous cell carcinomas

    International Nuclear Information System (INIS)

    Marcu, L; Doorn, T van; Olver, I

    2004-01-01

    The mechanisms postulated to be responsible for the accelerated repopulation of squamous cell carcinomas during radiotherapy are the loss of asymmetry of stem cell division, acceleration of stem cell division, abortive division and/or recruitment of the non-cycling cell with proliferative capacity. Although accelerated repopulation was observed with recruitment and accelerated cell cycles, it was not sufficient to cause an observable change to the survival curve. However, modelling the loss of asymmetry in stem cell division has reshaped the curve with a 'growth' shoulder. Cell recruitment was not found to be a major contributor to accelerated tumour repopulation. A more significant contribution was provided through the multiplication of surviving tumour stem cells during radiotherapy, by reducing their cell cycle time, and due to loss of asymmetry of stem cell division
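
    The division-asymmetry mechanism can be illustrated with a toy Monte Carlo sketch (the parameters and assumptions below are ours, not the authors' model): each surviving stem cell divides symmetrically into two stem cells with probability p_sym, and asymmetrically otherwise, so even a small loss of asymmetry inflates the stem-cell pool roughly as (1 + p_sym)^generations:

    ```python
    # Toy simulation of stem-cell pool growth under loss of division asymmetry.
    import random

    def grow(n_stem, generations, p_sym, seed=1):
        """Return the stem-cell count after repeated divisions.

        p_sym = 0 is fully asymmetric division (pool stays constant);
        p_sym > 0 models loss of asymmetry (pool grows).
        """
        random.seed(seed)
        for _ in range(generations):
            n_stem = sum(2 if random.random() < p_sym else 1 for _ in range(n_stem))
        return n_stem

    for p_sym in (0.0, 0.1, 0.3):
        print(f"p_sym={p_sym}: {grow(100, 10, p_sym)} stem cells after 10 divisions")
    ```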

  14. Modeling high-Power Accelerators Reliability-SNS LINAC (SNS-ORNL); MAX LINAC (MYRRHA)

    International Nuclear Information System (INIS)

    Pitigoi, A. E.; Fernandez Ramos, P.

    2013-01-01

    Improving reliability has recently become a very important objective in the field of particle accelerators. Particle accelerators in operation are constantly undergoing modifications, and improvements are implemented using new technologies, more reliable components, or redundant schemes (to obtain higher reliability, robustness, more power, etc.). Within the MAX project, a reliability model of the SNS (Spallation Neutron Source) LINAC has been developed and an analysis of the accelerator systems' reliability performed using the Risk Spectrum reliability analysis software. The analysis results have been evaluated by comparison with SNS operational data. Results and conclusions are presented in this paper, oriented towards identifying design weaknesses and providing recommendations for improving the reliability of the MYRRHA linear accelerator. The SNS reliability model developed for the MAX preliminary design phase indicates possible avenues for further investigation that could be needed to improve the reliability of high-power accelerators, in view of the future reliability targets of ADS accelerators.

  15. Development of an evidence-informed leisure time physical activity resource for adults with spinal cord injury: the SCI Get Fit Toolkit.

    Science.gov (United States)

    Arbour-Nicitopoulos, K P; Martin Ginis, K A; Latimer-Cheung, A E; Bourne, C; Campbell, D; Cappe, S; Ginis, S; Hicks, A L; Pomerleau, P; Smith, K

    2013-06-01

    Objective: To systematically develop an evidence-informed leisure time physical activity (LTPA) resource for adults with spinal cord injury (SCI). Setting: Canada. Methods: The Appraisal of Guidelines, Research and Evaluation (AGREE) II protocol was used to develop a toolkit to teach and encourage adults with SCI how to make smart and informed choices about being physically active. A multidisciplinary expert panel appraised the evidence and generated specific recommendations for the content of the toolkit. Pilot testing was conducted to refine the toolkit's presentation. Results: Recommendations emanating from the consultation process were that the toolkit be a brief, evidence-based resource that contains images of adults with tetraplegia and paraplegia, and links to more detailed online information. The content of the toolkit should include the physical activity guidelines (PAGs) for adults with SCI, activities tailored to manual and power chair users, the benefits of LTPA, and strategies to overcome common LTPA barriers for adults with SCI. The inclusion of action plans and safety tips was also recommended. Conclusion: These recommendations have resulted in the development of an evidence-informed LTPA resource to assist adults with SCI in meeting the PAGs. This toolkit will have important implications for consumers, health care professionals and policy makers for encouraging LTPA in the SCI community.

  16. Implications of accelerator experiments for models of the Kolar Gold Mine particles

    Energy Technology Data Exchange (ETDEWEB)

    Sarma, K V.L. [Tata Inst. of Fundamental Research, Bombay (India); Wolfenstein, L [Carnegie-Mellon Univ., Pittsburgh, Pa. (USA)

    1976-03-01

    The significance of accelerator searches for the new particles discovered in the Kolar Gold Mine experiments depends on the characteristics of the models of these particles. Models that could give cosmic ray neutrinos a great advantage over accelerator neutrinos are presented. The new particles should be produced in e⁺e⁻ colliding beams, but the cross-section is model dependent.

  17. Mathematical model of two-phase flow in accelerator channel

    Directory of Open Access Journals (Sweden)

    О.Ф. Нікулін

    2010-01-01

    The problem of two-phase flow composed of an energy-carrier phase (Newtonian liquid) and a solid fine-dispersed phase (particles) in a counter jet mill accelerator channel is considered. The mathematical model rests on the supposition that the phases interact with each other as independent substances by means of aerodynamic forces under adiabatic flow conditions. The model is represented as a system of differential equations of order 11. The equations are derived from basic physical principles for cross-section-averaged quantities. The mathematical model can be used to estimate any kinematic and thermodynamic flow characteristics, to solve parameter optimization problems, and to determine the transfer functions that arise in counter jet mill accelerator channel design.
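
    The phase-interaction supposition can be written schematically as follows (our notation, not the paper's): each particle is driven by the aerodynamic drag of the carrier liquid, which loses the corresponding momentum:

    ```latex
    \begin{equation}
    m_p \frac{du_p}{dt}
      = \tfrac{1}{2}\, C_D\, \rho\, A_p\, (u - u_p)\,\lvert u - u_p \rvert
    \end{equation}
    % u and u_p are the cross-section-averaged carrier and particle velocities,
    % \rho is the carrier density, A_p the particle cross-section, and C_D the
    % drag coefficient; an equal and opposite term enters the carrier-phase
    % momentum equation.
    ```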

  18. A Gateway MultiSite recombination cloning toolkit.

    Directory of Open Access Journals (Sweden)

    Lena K Petersen

    The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two, three, and four fragment Gateway MultiSite recombination cloning whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org).

  19. A model for particle acceleration in lower hybrid collapse

    International Nuclear Information System (INIS)

    Retterer, J.M.

    1997-01-01

    A model for particle acceleration during the nonlinear collapse of lower hybrid waves is described. Using the Musher-Sturman wave equation to describe the effects of nonlinear processes and a velocity diffusion equation for the particle velocity distribution, the model self-consistently describes the exchange of energy between the fields and the particles in the local plasma. Two-dimensional solutions are presented for the modulational instability of a plane wave and the collapse of a cylindrical wave packet. These calculations were motivated by sounding rocket observations in the vicinity of auroral arcs in the Earth's ionosphere, which have revealed the existence of large-amplitude lower-hybrid wave packets associated with ions accelerated to energies of 100 eV. The scaling of the sizes of these wave packets is consistent with the theory of lower-hybrid collapse and the observed lower-hybrid field amplitudes are adequate to accelerate the ionospheric ions to the observed energies

  20. A General Accelerated Degradation Model Based on the Wiener Process.

    Science.gov (United States)

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
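
    A common general form for such Wiener-process ADT models, in notation assumed here, combines a stress-dependent drift with a monotone time-scale transformation:

    ```latex
    \begin{equation}
    X(t) = x_0 + \mu(s)\,\Lambda(t;\gamma) + \sigma_B\, B\bigl(\Lambda(t;\gamma)\bigr)
    \end{equation}
    % \mu(s) is the drift under acceleration stress s (e.g., an Arrhenius or
    % power law), \Lambda(t;\gamma) is a monotone time transformation capturing
    % nonlinear degradation paths (\Lambda(t)=t recovers the linear model),
    % B(.) is standard Brownian motion, and unit-to-unit variation enters by
    % randomizing \mu across units.
    ```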

  1. A General Accelerated Degradation Model Based on the Wiener Process

    Directory of Open Access Journals (Sweden)

    Le Liu

    2016-12-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.

  2. Measuring acceptance of an assistive social robot: a suggested toolkit

    NARCIS (Netherlands)

    Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B.

    2009-01-01

    The human robot interaction community is multidisciplinary by nature and has members from social science to engineering backgrounds. In this paper we aim to provide human robot developers with a straightforward toolkit to evaluate users' acceptance of assistive social robots they are designing or

  3. Pydpiper: A Flexible Toolkit for Constructing Novel Registration Pipelines

    Directory of Open Access Journals (Sweden)

    Miriam eFriedel

    2014-07-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available pipeline framework that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  4. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    Science.gov (United States)

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
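
    Innovation (2), the elimination of duplicate stages, can be illustrated by keying each stage on its command line so identical stages collapse onto one entry; the sketch below is hypothetical and does not reproduce Pydpiper's actual API:

    ```python
    # Hypothetical duplicate-stage elimination (not Pydpiper's real API).
    class Pipeline:
        def __init__(self):
            self._stages = {}          # command string -> stage record

        def add_stage(self, cmd):
            """Register a stage; duplicate commands collapse onto one entry."""
            if cmd not in self._stages:
                self._stages[cmd] = {"cmd": cmd, "done": False}
            return self._stages[cmd]

        def run(self):
            for stage in self._stages.values():
                if not stage["done"]:
                    print("running:", stage["cmd"])  # placeholder for execution
                    stage["done"] = True

    p = Pipeline()
    p.add_stage("register source1.mnc target.mnc xfm1.xfm")
    p.add_stage("register source1.mnc target.mnc xfm1.xfm")  # duplicate, ignored
    p.add_stage("register source2.mnc target.mnc xfm2.xfm")
    p.run()  # runs two unique stages, not three
    ```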

  5. A New Car-Following Model considering Driving Characteristics and Preceding Vehicle’s Acceleration

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-01-01

    In the past decades, many improved car-following models based on the full velocity difference (FVD) model have been developed, but these models do not consider the acceleration of the leading vehicle. Some of them consider the individual anticipation behavior of drivers, but they either do not quantitatively determine the types of driving or artificially divide the driving types rather than deriving them from actual traffic data. In this paper, drivers’ driving styles are first categorized based on actual traffic data via data mining and a clustering algorithm. Secondly, a new car-following model based on the FVD model is developed, taking into account individual anticipation effects and the acceleration of the leading vehicle. The effect of driving characteristics and the leading vehicle’s acceleration on car-following behavior is further analyzed via numerical simulation. The results show that considering the acceleration of the preceding vehicle in the model improves the stability of traffic flow, and that different driving characteristics have different influences on the stability of traffic flow.
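
    The first step described above, categorizing driving styles from trajectory data, can be sketched with a generic clustering pass; the features and synthetic data below are illustrative stand-ins, not the paper's dataset:

    ```python
    # Illustrative clustering of driver styles from trajectory-derived features.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # Columns: mean headway (m), mean |acceleration| (m/s^2), reaction delay (s)
    features = np.vstack([
        rng.normal([35.0, 0.6, 1.5], [4.0, 0.1, 0.2], (50, 3)),  # cautious
        rng.normal([20.0, 1.2, 1.0], [3.0, 0.2, 0.2], (50, 3)),  # moderate
        rng.normal([10.0, 2.0, 0.7], [2.0, 0.3, 0.1], (50, 3)),  # aggressive
    ])

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
    for k, center in enumerate(km.cluster_centers_):
        print(f"style {k}: headway={center[0]:.1f} m, |a|={center[1]:.2f} m/s^2")
    ```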

  6. Physiologically based pharmacokinetic toolkit to evaluate environmental exposures: Applications of the dioxin model to study real life exposures

    Energy Technology Data Exchange (ETDEWEB)

    Emond, Claude, E-mail: claude.emond@biosmc.com [BioSimulation Consulting Inc, Newark, DE (United States); Ruiz, Patricia; Mumtaz, Moiz [Division of Toxicology and Human Health Sciences, Agency for Toxic Substances and Disease Registry, Atlanta, GA (United States)

    2017-01-15

    Chlorinated dibenzo-p-dioxins (CDDs) are a series of mono- to octa-chlorinated homologous chemicals commonly referred to as polychlorinated dioxins. One of the most potent, well-known, and persistent members of this family is 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). As part of translational research to make computerized models accessible to health risk assessors, we present a Berkeley Madonna recoded version of the human physiologically based pharmacokinetic (PBPK) model used by the U.S. Environmental Protection Agency (EPA) in the recent dioxin assessment. This model incorporates CYP1A2 induction, which is an important metabolic vector that drives dioxin distribution in the human body, and it uses a variable elimination half-life that is body burden dependent. To evaluate the model accuracy, the recoded model predictions were compared with those of the original published model. The simulations performed with the recoded model matched well with those of the original model. The recoded model was then applied to available data sets of real life exposure studies. The recoded model can describe acute and chronic exposures and can be useful for interpreting human biomonitoring data as part of an overall dioxin and/or dioxin-like compounds risk assessment. - Highlights: • The best available dioxin PBPK model for interpreting human biomonitoring data is presented. • The original PBPK model was recoded from acslX to the Berkeley Madonna (BM) platform. • Comparisons were made of the accuracy of the recoded model with the original model. • The model is a useful addition to the ATSDR's BM-based PBPK toolkit that supports risk assessors. • The application of the model to real-life exposure data sets is illustrated.
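
    The body-burden-dependent elimination half-life highlighted above can be caricatured in a one-compartment model; the sketch below, including all parameter values, is an assumption of ours and not the recoded EPA/ATSDR model:

    ```python
    # One-compartment caricature of burden-dependent TCDD elimination
    # (illustrative; all parameter values are assumptions).
    import numpy as np
    from scipy.integrate import solve_ivp

    def burden_dependent_halflife(burden, t_half_lo=2555.0, t_half_hi=5840.0,
                                  k_m=50.0):
        """Effective half-life (days) shortens as body burden (ng/kg) rises,
        mimicking CYP1A2 induction."""
        frac = burden / (burden + k_m)
        return t_half_hi - (t_half_hi - t_half_lo) * frac

    def dburden_dt(t, y, intake):
        k_elim = np.log(2.0) / burden_dependent_halflife(y[0])
        return [intake - k_elim * y[0]]   # intake in ng/kg/day

    sol = solve_ivp(dburden_dt, (0.0, 365.0 * 30), [0.0], args=(0.5e-3,),
                    max_step=30.0)
    print(f"body burden after 30 years: {sol.y[0, -1]:.3f} ng/kg")
    ```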

  7. Community petascale project for accelerator science and simulation: Advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  8. A Model of RHIC Using the Unified Accelerator Libraries

    Energy Technology Data Exchange (ETDEWEB)

    Pilat, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tepikian, S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Trahern, C. G. [Brookhaven National Lab. (BNL), Upton, NY (United States); Malitsky, N. [Cornell Univ., Ithaca, NY (United States)

    1998-01-01

    The Unified Accelerator Library (UAL) is an object oriented and modular software environment for accelerator physics which comprises an accelerator object model for the description of the machine (SMF, for Standard Machine Format), a collection of Physics Libraries, and a Perl interface that provides a homogeneous shell for integrating and managing these components. Currently available physics libraries include TEAPOT++, a collection of C++ physics modules conceptually derived from TEAPOT, and DNZLIB, a differential algebra package for map generation. This software environment has been used to build a flat model of RHIC which retains the hierarchical lattice description while assigning specific characteristics to individual elements, such as measured field harmonics. A first application of the model and of the simulation capabilities of UAL has been the study of RHIC stability in the presence of Siberian snakes and spin rotators. The building blocks of RHIC snakes and rotators are helical dipoles, unconventional devices that cannot be modeled by traditional accelerator physics codes and have been implemented in UAL as Taylor maps. Section 2 describes the RHIC data stores, Section 3 the RHIC SMF format, and Section 4 the RHIC-specific Perl interface (RHIC Shell). Section 5 explains how the RHIC SMF and UAL have been used to study the RHIC dynamic behavior and presents detuning and dynamic aperture results. If the reader is not familiar with the motivation and characteristics of UAL, we include in the Appendix a useful overview paper. An example of a complete set of Perl scripts for RHIC simulation can also be found in the Appendix.

  9. Lessons learned on the Ground Test Accelerator control system

    International Nuclear Information System (INIS)

    Kozubal, A.J.; Weiss, R.E.

    1994-01-01

    When we initiated the control system design for the Ground Test Accelerator (GTA), we envisioned a system that would be flexible enough to handle the changing requirements of an experimental project. This control system would use a developers' toolkit to reduce the cost and time to develop applications for GTA, and through the use of open standards, the system would accommodate unforeseen requirements as they arose. Furthermore, we would attempt to demonstrate on GTA a level of automation far beyond that achieved by existing accelerator control systems. How well did we achieve these goals? What were the stumbling blocks to deploying the control system, and what assumptions did we make about requirements that turned out to be incorrect? In this paper we look at the process of developing a control system that evolved into what is now the "Experimental Physics and Industrial Control System" (EPICS). Also, we assess the impact of this system on the GTA project, as well as the impact of GTA on EPICS. The lessons learned on GTA will be valuable for future projects

  10. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  11. Mocapy++ - A toolkit for inference and learning in dynamic Bayesian networks

    Directory of Open Access Journals (Sweden)

    Hamelryck Thomas

    2010-03-01

    Background: Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations). Results: The program package is freely available under the GNU General Public Licence (GPL) from SourceForge http://sourceforge.net/projects/mocapy. The package contains the source for building the Mocapy++ library, several usage examples and the user manual. Conclusions: Mocapy++ is especially suitable for constructing probabilistic models of biomolecular structure, due to its support for directional statistics. In particular, it supports the Kent distribution on the sphere and the bivariate von Mises distribution on the torus. These distributions have proven useful to formulate probabilistic models of protein and RNA structure in atomic detail.

  12. An Ethical Toolkit for Food Companies: Reflection on its Use

    NARCIS (Netherlands)

    Deblonde, M.K.; Graaff, R.; Brom, F.W.A.

    2007-01-01

    Nowadays many debates are going on that relate to the agricultural and food sector. It looks as if present technological and organizational developments within the agricultural and food sector are badly geared to societal needs and expectations. In this article we briefly present a toolkit for moral

  13. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia.

    Science.gov (United States)

    Haines, Seena L; Summa, Maria A; Peeters, Michael J; Dy-Boarman, Eliza A; Boyle, Jaclyn A; Clifford, Kalin M; Willson, Megan N

    2017-09-01

    The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. Eighteen institutions provided materials; five provided materials describing didactic coursework; over fifteen provided materials for academia-focused Advanced Pharmacy Practice Experiences (APPEs), while one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. Pharmacy faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents with the academic pillars of teaching, scholarship and service is critical for the future success of the academy. Published by Elsevier Inc.

  14. A Toolkit For Storage Qos Provisioning For Data-Intensive Applications

    Directory of Open Access Journals (Sweden)

    Renata Słota

    2012-01-01

    This paper describes a programming toolkit developed in the PL-Grid project, named QStorMan, which supports storage QoS provisioning for data-intensive applications in distributed environments. QStorMan exploits knowledge-oriented methods for matching storage resources to non-functional requirements, which are defined for a data-intensive application. In order to support various usage scenarios, QStorMan provides two interfaces: programming libraries and a web portal. The interfaces allow the requirements to be defined either directly in an application source code or by using an intuitive graphical interface. The first way provides finer granularity, e.g., each portion of data processed by an application can define a different set of requirements. The second method is aimed at supporting legacy applications, whose source code cannot be modified. The toolkit has been evaluated using synthetic benchmarks and the production infrastructure of PL-Grid, in particular its storage infrastructure, which utilizes the Lustre file system.

  15. Finite difference time domain modelling of particle accelerators

    International Nuclear Information System (INIS)

    Jurgens, T.G.; Harfoush, F.A.

    1989-03-01

    Finite Difference Time Domain (FDTD) modelling has been successfully applied to a wide variety of electromagnetic scattering and interaction problems for many years. Here the method is extended to incorporate the modelling of wake fields in particle accelerators. Algorithmic comparisons are made to existing wake field codes, such as MAFIA T3. 9 refs., 7 figs
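
    The leapfrog field update that such solvers generalize can be shown in one dimension; this is a generic textbook Yee sketch in normalized units, not the wake-field code itself:

    ```python
    # Minimal 1-D Yee/FDTD update loop (generic sketch, normalized units).
    import numpy as np

    nz, nt = 200, 400
    c, dz = 1.0, 1.0
    dt = 0.5 * dz / c                 # Courant-stable time step
    ez = np.zeros(nz)                 # E on integer grid points
    hy = np.zeros(nz - 1)             # H on half-integer grid points

    for n in range(nt):
        hy += (dt / dz) * (ez[1:] - ez[:-1])            # update H from curl E
        ez[1:-1] += (dt / dz) * (hy[1:] - hy[:-1])      # update E from curl H
        ez[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

    print("peak |Ez| after propagation:", np.abs(ez).max())
    ```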

  16. Srijan: a graphical toolkit for sensor network macroprogramming

    OpenAIRE

    Pathak , Animesh; Gowda , Mahanth K.

    2009-01-01

    Macroprogramming is an application development technique for wireless sensor networks (WSNs) where the developer specifies the behavior of the system, as opposed to that of the constituent nodes. In this proposed demonstration, we would like to present Srijan, a toolkit that enables application development for WSNs in a graphical manner using data-driven macroprogramming. It can be used in various stages of application development, viz. i) specification of application ...

  17. Mocapy++ - a toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  18. Final Report for 'Modeling Electron Cloud Diagnostics for High-Intensity Proton Accelerators'

    International Nuclear Information System (INIS)

    Veitzer, Seth A.

    2009-01-01

    Electron clouds in accelerators such as the ILC degrade beam quality and limit operating efficiency. The need to mitigate electron clouds has a direct impact on the design and operation of these accelerators, translating into increased cost and reduced performance. Diagnostic techniques for measuring electron clouds in accelerating cavities are needed to provide an assessment of electron cloud evolution and mitigation. Accurate numerical modeling of these diagnostics is needed to validate the experimental techniques. In this Phase I, we developed detailed numerical models of microwave propagation through electron clouds in accelerating cavities with geometries relevant to existing and future high-intensity proton accelerators such as Project X and the ILC. Our numerical techniques and simulation results from the Phase I showed that there was a high probability of success in measuring both the evolution of electron clouds and the effects of non-uniform electron density distributions in Phase II.
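
    Such transmission diagnostics typically rest on the standard low-density phase-shift estimate below (a textbook result, not a formula quoted from the report): a wave traversing the cloud accumulates a phase advance relative to the empty structure of approximately

    ```latex
    \begin{equation}
    \Delta\phi \;\approx\; \frac{1}{2c}\int \frac{\omega_p^2(z)}{\sqrt{\omega^2 - \omega_c^2}}\, dz,
    \qquad
    \omega_p^2 = \frac{n_e e^2}{\varepsilon_0 m_e}
    \end{equation}
    % n_e is the electron-cloud density, \omega the carrier frequency, and
    % \omega_c the cutoff frequency of the beam pipe acting as a waveguide;
    % measuring \Delta\phi versus time tracks cloud buildup and decay.
    ```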

  19. Modeling of ion acceleration through drift and diffusion at interplanetary shocks

    Science.gov (United States)

    Decker, R. B.; Vlahos, L.

    1986-01-01

    A test particle simulation designed to model ion acceleration through drift and diffusion at interplanetary shocks is described. The technique consists of integrating along exact particle orbits in a system where the angle between the shock normal and mean upstream magnetic field, the level of magnetic fluctuations, and the energy of injected particles can assume a range of values. The technique makes it possible to study time-dependent shock acceleration under conditions not amenable to analytical techniques. To illustrate the capability of the numerical model, proton acceleration was considered under conditions appropriate for interplanetary shocks at 1 AU, including large-amplitude transverse magnetic fluctuations derived from power spectra of both ambient and shock-associated MHD waves.
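
    A test-particle integrator of the kind described is commonly built around the Boris push; the sketch below uses normalized units and illustrative fields, not the shock and turbulence model of the paper:

    ```python
    # Standard Boris push for a test particle in given E and B fields
    # (normalized units; fields here are illustrative).
    import numpy as np

    def boris_push(x, v, e_field, b_field, qm, dt, steps):
        """Advance a particle through E(x) and B(x) with the Boris scheme."""
        for _ in range(steps):
            e, b = e_field(x), b_field(x)
            v_minus = v + 0.5 * qm * dt * e           # first half electric kick
            t = 0.5 * qm * dt * b                     # magnetic rotation vector
            s = 2.0 * t / (1.0 + np.dot(t, t))
            v_prime = v_minus + np.cross(v_minus, t)
            v_plus = v_minus + np.cross(v_prime, s)   # rotated velocity
            v = v_plus + 0.5 * qm * dt * e            # second half electric kick
            x = x + v * dt
        return x, v

    uniform_b = lambda x: np.array([0.0, 0.0, 1.0])
    no_e = lambda x: np.zeros(3)
    x, v = boris_push(np.zeros(3), np.array([1.0, 0.0, 0.0]), no_e, uniform_b,
                      qm=1.0, dt=0.05, steps=200)
    print("gyration preserved |v| =", np.linalg.norm(v))  # ~1.0 in pure B field
    ```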

  20. Characteristics of the magnetic wall reflection model on ion acceleration in gas-puff z pinch

    International Nuclear Information System (INIS)

    Nishio, M.; Takasugi, K.

    2013-01-01

    The magnetic wall reflection model was examined via numerical simulation of particle trajectories. The model applies to ions accelerated by a current-independent mechanism. The trajectory calculation showed an angle dependency of the highest velocities of the accelerated particles. This characteristic is specific to the magnetic wall reflection model and not to other current-independent acceleration mechanisms. Thomson parabola measurements of accelerated ions produced in the gas-puff z-pinch experiments were carried out to verify the angle dependency. (author)

  1. The "Pesticides and Farmworker Health Toolkit" : An Innovative Model for Developing an Evidence-Informed Program for a Low-Literacy, Latino Immigrant Audience

    Science.gov (United States)

    LePrevost, Catherine E.; Storm, Julia F.; Asuaje, Cesar R.; Cope, W. Gregory

    2014-01-01

    Migrant and seasonal farmworkers are typically Spanish-speaking, Latino immigrants with limited formal education and low literacy skills and, as such, are a vulnerable population. We describe the development of the "Pesticides and Farmworker Health Toolkit", a pesticide safety and health curriculum designed to communicate to farmworkers…

  2. Numerical modeling of accelerated, pre-compressed CTs in RACE

    International Nuclear Information System (INIS)

    Eddleman, J.L.; Hammer, J.H.; Hartman, C.W.; Logan, B.G.; McLean, H.S.; Molvik, A.W.

    1990-01-01

    Numerical modeling of accelerated compact toroids in the RACE experiment has motivated the development and application of a wide range of computational tools. These tools have included the zero-dimensional RAC code for fast parameter and design studies, and the two-dimensional, Eulerian, axisymmetric, magneto-hydrodynamic code, HAM, used to model plasma ring formation in magnetized plasma guns and acceleration in straight cylindrical electrodes. Extension of the RACE geometry to include converging conical electrodes motivated the development of a new two-dimensional, Lagrangian, axisymmetric, magnetohydrodynamic code, TRAC. The code includes optional initialization of the ring magnetic fields to a Taylor-equilibrium profile as well as a self-consistent external capacitor-bank driving circuit. Stability of initial field configurations with toroidal mode number > 0 may also be determined. The new code is particularly suited for predicting the behavior of accelerated plasma rings in arbitrarily shaped conical electrodes, since the restriction to a rectilinear mesh is removed. In particular, application of the code to the new pre-compression geometry in the RACE experiment is discussed and compared with experimental results.

  3. A patient and public involvement (PPI) toolkit for meaningful and flexible involvement in clinical trials - a work in progress.

    Science.gov (United States)

    Bagley, Heather J; Short, Hannah; Harman, Nicola L; Hickey, Helen R; Gamble, Carrol L; Woolfall, Kerry; Young, Bridget; Williamson, Paula R

    2016-01-01

    Funders of research are increasingly requiring researchers to involve patients and the public in their research. Patient and public involvement (PPI) in research can potentially help researchers make sure that the design of their research is relevant, that it is participant friendly and ethically sound. Using and sharing PPI resources can benefit those involved in undertaking PPI, but existing PPI resources are not used consistently and this can lead to duplication of effort. This paper describes how we are developing a toolkit to support clinical trials teams in a clinical trials unit. The toolkit will provide a key 'off the shelf' resource to support trial teams with limited resources, in undertaking PPI. Key activities in further developing and maintaining the toolkit are to:
    ● listen to the views and experience of both research teams and patient and public contributors who use the tools;
    ● modify the tools based on our experience of using them;
    ● identify the need for future tools;
    ● update the toolkit based on any newly identified resources that come to light;
    ● raise awareness of the toolkit; and
    ● work in collaboration with others to either develop or test out PPI resources in order to reduce duplication of work in PPI.
    Background Patient and public involvement (PPI) in research is increasingly a funder requirement due to the potential benefits in the design of relevant, participant friendly, ethically sound research. The use and sharing of resources can benefit PPI, but available resources are not consistently used leading to duplication of effort. This paper describes a developing toolkit to support clinical trials teams to undertake effective and meaningful PPI.
    Methods The first phase in developing the toolkit was to describe which PPI activities should be considered in the pathway of a clinical trial and at what stage these activities should take place. This pathway was informed through review of the type and timing of PPI activities within

  4. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    Science.gov (United States)

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
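
    For a single phase delivering n fractions of size d to a voxel, the standard linear-quadratic BED is n·d·(1 + d/(α/β)), and the "true" multi-phase distribution sums the per-phase BEDs voxel by voxel. A minimal NumPy sketch of that calculation (array shapes, doses, and the α/β value are illustrative; the toolkit's approximate formulas are not reproduced):

```python
import numpy as np

def phase_bed(total_dose, n_fractions, alpha_beta):
    """Voxel-wise BED for one phase: D * (1 + d / (alpha/beta)), d = D/n."""
    d = total_dose / n_fractions          # per-fraction dose, per voxel
    return total_dose * (1.0 + d / alpha_beta)

def multiphase_bed(phase_doses, phase_fractions, alpha_beta):
    """Sum per-phase BEDs voxel by voxel (the 'true' multi-phase BED)."""
    return sum(phase_bed(D, n, alpha_beta)
               for D, n in zip(phase_doses, phase_fractions))

# hypothetical primary (50 Gy / 25 fx) plus boost (16 Gy / 8 fx), alpha/beta = 3 Gy
primary = np.full((4, 4), 50.0)   # stand-in dose matrices from a TPS export
boost = np.full((4, 4), 16.0)
print(multiphase_bed([primary, boost], [25, 8], 3.0))
```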

  5. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  6. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  7. DUL Radio: A light-weight, wireless toolkit for sketching in hardware

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2011-01-01

    -mobile prototyping where fast reaction is needed (e.g. in controlling sound). The target audiences include designers, students, artists etc. with minimal programming and hardware skills. This presentation covers our motivations for creating the toolkit, specifications, test results, comparison to related products...

  8. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  9. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.
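
    Of the modules listed in the two records above, forward uncertainty propagation is the easiest to illustrate: draw the uncertain inputs from their distributions, run the model once per draw, and summarize the output spread. A generic Monte Carlo sketch (the model function and input distributions are placeholders, not part of the KAERI toolkit):

```python
import numpy as np

def propagate(model, input_dists, n_samples=10_000, seed=0):
    """Monte Carlo forward propagation: returns output mean and std.
    input_dists: list of callables, each drawing one input sample."""
    rng = np.random.default_rng(seed)
    outputs = np.empty(n_samples)
    for i in range(n_samples):
        outputs[i] = model([d(rng) for d in input_dists])
    return outputs.mean(), outputs.std(ddof=1)

# placeholder model: a toy response as a function of two uncertain inputs
model = lambda u: 600.0 + 50.0 * u[0] + 20.0 * u[1] ** 2
dists = [lambda r: r.normal(0, 1), lambda r: r.uniform(-1, 1)]
print(propagate(model, dists))
```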

  10. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    Science.gov (United States)

    Jones, Rachael M; Nicas, Mark

    2006-03-01

    COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labor Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always small. Because the exposure bands are not occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.
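
    The metric itself is a single ratio: the lowest airborne concentration that produced the toxicological endpoint in experimental animals, divided by the highest concentration the exposure band permits. A one-line sketch with invented numbers:

```python
def minimal_margin(min_effect_conc, exposure_band_upper):
    """Minimal margin of safety: lowest effect concentration (mg/m^3)
    divided by the upper bound of the permitted exposure band (mg/m^3)."""
    return min_effect_conc / exposure_band_upper

# hypothetical hazard band: effects seen at 50 mg/m^3, band permits up to 1 mg/m^3
print(minimal_margin(50.0, 1.0))   # margin of 50
```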

  11. Linear accelerator modeling: development and application

    International Nuclear Information System (INIS)

    Jameson, R.A.; Jule, W.D.

    1977-01-01

    Most of the parameters of a modern linear accelerator can be selected by simulating the desired machine characteristics in a computer code and observing how the parameters affect the beam dynamics. The code PARMILA is used at LAMPF for the low-energy portion of linacs. Collections of particles can be traced with a free choice of input distributions in six-dimensional phase space. Random errors are often included in order to study the tolerances which should be imposed during manufacture or in operation. An outline is given of the modifications made to the model, the results of experiments which indicate the validity of the model, and the use of the model to optimize the longitudinal tuning of the Alvarez linac

  12. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    Full Text Available On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.

  13. Designing a Portable and Low Cost Home Energy Management Toolkit

    NARCIS (Netherlands)

    Keyson, D.V.; Al Mahmud, A.; De Hoogh, M.; Luxen, R.

    2013-01-01

    In this paper we describe the design of a home energy and comfort management system. The system has three components such as a smart plug with a wireless module, a residential gateway and a mobile app. The combined system is called a home energy management and comfort toolkit. The design is inspired

  14. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  15. Sierra Toolkit Manual Version 4.48.

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Toolkit Team

    2018-03-01

    This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.

  16. RAVE-a Detector-independent vertex reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, Wolfgang [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at; Mitaroff, Winfried; Moser, Fabian [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)

    2007-10-21

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  17. RAVE-a Detector-independent vertex reconstruction toolkit

    International Nuclear Information System (INIS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-01-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available

  18. Water Security Toolkit User Manual: Version 1.3

    Science.gov (United States)

    The Water Security Toolkit (WST) is a suite of tools that help provide the information necessary to make good decisions resulting in the minimization of further human exposure to contaminants, and the maximization of the effectiveness of intervention strategies. WST assists in the evaluation of multiple response actions in order to select the most beneficial consequence management strategy. It includes hydraulic and water quality modeling software and optimization methodologies to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove or destroy contaminants, (5) locations in the network to take grab sample to confirm contamination or cleanup and (6) valves to close in order to isolate contaminated areas of the network.

  19. The CEBAF accelerator control system: migrating from a TACL to an EPICS based system

    International Nuclear Information System (INIS)

    Watson, W.A. III; Barker, David; Bickley, Matthew; Gupta, Pratik; Johnson, R.P.

    1994-01-01

    CEBAF is in the process of migrating its accelerator and experimental hall control systems to one based upon EPICS, a control system toolkit developed by a collaboration among several DOE laboratories in the US. The new system design interfaces existing CAMAC hardware via a CAMAC serial highway to VME-based I/O controllers running the EPICS software; future additions and upgrades will for the most part go directly into VME. The decision to use EPICS followed difficulties in scaling the CEBAF-developed TACL system to full machine operation. TACL and EPICS share many design concepts, facilitating the conversion of software from one toolkit to the other. In particular, each supports graphical entry of algorithms built up from modular code, graphical displays with a display editor, and a client-server architecture with name-based I/O. During the migration, TACL and EPICS will interoperate through a socket-based I/O gateway. As part of a collaboration with other laboratories, CEBAF will add relational database support for system management and high level applications support. Initial experience with EPICS is presented, along with a plan for the full migration which is expected to be finished next year. ((orig.))

  20. FATES: a flexible analysis toolkit for the exploration of single-particle mass spectrometer data

    Science.gov (United States)

    Sultana, Camille M.; Cornwell, Gavin C.; Rodriguez, Paul; Prather, Kimberly A.

    2017-04-01

    Single-particle mass spectrometer (SPMS) analysis of aerosols has become increasingly popular since its invention in the 1990s. Today many iterations of commercial and lab-built SPMSs are in use worldwide. However, supporting analysis toolkits for these powerful instruments are outdated, have limited functionality, or are versions that are not available to the scientific community at large. In an effort to advance this field and allow better communication and collaboration between scientists, we have developed FATES (Flexible Analysis Toolkit for the Exploration of SPMS data), a MATLAB toolkit easily extensible to an array of SPMS designs and data formats. FATES was developed to minimize the computational demands of working with large data sets while still allowing easy maintenance, modification, and utilization by novice programmers. FATES permits scientists to explore, without constraint, complex SPMS data with simple scripts in a language popular for scientific numerical analysis. In addition FATES contains an array of data visualization graphic user interfaces (GUIs) which can aid both novice and expert users in calibration of raw data; exploration of the dependence of mass spectral characteristics on size, time, and peak intensity; and investigations of clustered data sets.

  1. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  2. Accounting for measurement error in log regression models with applications to accelerated testing.

    Science.gov (United States)

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.

  3. Accounting for measurement error in log regression models with applications to accelerated testing.

    Directory of Open Access Journals (Sweden)

    Robert Richardson

    Full Text Available In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
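
    The estimation machinery named in these two records, Iteratively Re-weighted Least Squares, alternates between fitting a weighted linear model and recomputing the weights from the current residuals. A generic IRLS sketch (the weight rule is a simple placeholder, not the paper's variance function derived from the reduced Eyring equation):

```python
import numpy as np

def irls(X, y, weight_fn, n_iter=50, tol=1e-8):
    """Iteratively Re-weighted Least Squares for y ~ X @ beta.
    weight_fn maps residuals to per-observation weights."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(n_iter):
        w = weight_fn(y - X @ beta)
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# toy data with heavy-tailed errors; a Huber-like placeholder weight rule
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.uniform(0, 1, 200)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_t(df=3, size=200)
print(irls(X, y, lambda r: 1.0 / np.maximum(np.abs(r), 0.05)))
```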

  4. Development of a RAMI model for LANSCE and high power APT accelerators

    International Nuclear Information System (INIS)

    Tallerico, P.J.

    1994-01-01

    Assessment of the reliability, availability, maintainability and inspectability (RAMI) of all high power, high cost systems is important to justify and improve the cost effectiveness of these systems. For the very large (over 100 MW) accelerator systems associated with APT, a RAMI model is very valuable in guiding the design and allocation of resources. A RAMI model of an existing machine is also valuable, since machine improvement funds must be allocated to increase the availability by the largest amount. The authors have developed a RAMI model using the critical subsystems of the LANSCE accelerator and beam delivery complex as an example, in order to evaluate its effectiveness for estimating reliability and beam availability. LAMPF and LANSCE together provide most of the features required for the accelerator and beam delivery part of a high-power APT machine, but LANSCE is pulsed rather than CW. This complex is capable of a 1-MW average power H⁻ beam, and it is the most powerful proton accelerator built in the US to date

  5. The PRIDE (Partnership to Improve Diabetes Education) Toolkit: Development and Evaluation of Novel Literacy and Culturally Sensitive Diabetes Education Materials.

    Science.gov (United States)

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L

    2016-02-01

    Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a "superior" score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. © 2015 The Author(s).

  6. An analytical reconstruction model of the spread-out Bragg peak using laser-accelerated proton beams.

    Science.gov (United States)

    Tao, Li; Zhu, Kun; Zhu, Jungao; Xu, Xiaohan; Lin, Chen; Ma, Wenjun; Lu, Haiyang; Zhao, Yanying; Lu, Yuanrong; Chen, Jia-Er; Yan, Xueqing

    2017-07-07

    With the development of laser technology, laser-driven proton acceleration provides a new method for proton tumor therapy. However, it has not been applied in practice because of the wide and decreasing energy spectrum of laser-accelerated proton beams. In this paper, we propose an analytical model to reconstruct the spread-out Bragg peak (SOBP) using laser-accelerated proton beams. Firstly, we present a modified weighting formula for protons of different energies. Secondly, a theoretical model for the reconstruction of SOBPs with laser-accelerated proton beams has been built. It can quickly calculate the number of laser shots needed for each energy interval of the laser-accelerated protons. Finally, we show the 2D reconstruction results of SOBPs for laser-accelerated proton beams and the ideal situation. The final results show that our analytical model can give an SOBP reconstruction scheme that can be used for actual tumor therapy.
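
    Reconstructing an SOBP amounts to weighting a set of pristine Bragg peaks so their sum is flat over the target depth interval. The paper's modified weighting formula is not reproduced here; the sketch below substitutes a least-squares fit over a crude analytic peak shape to show the structure of the calculation:

```python
import numpy as np

def bragg_peak(z, range_cm, width=0.3):
    """Crude pristine-peak stand-in: a Gaussian near the end of range
    plus a small flat entrance-dose plateau."""
    return np.exp(-0.5 * ((z - range_cm) / width) ** 2) + 0.02 * (z < range_cm)

z = np.linspace(0, 15, 301)                    # depth grid (cm)
ranges = np.linspace(8, 12, 9)                 # peaks spanning the target
peaks = np.stack([bragg_peak(z, r) for r in ranges])   # (n_peaks, n_depth)

target = (z >= 8) & (z <= 12)                  # want a flat SOBP over 8-12 cm
w, *_ = np.linalg.lstsq(peaks[:, target].T, np.ones(target.sum()), rcond=None)
w = np.clip(w, 0, None)                        # keep weights physical
sobp = w @ peaks
print(sobp[target].std() / sobp[target].mean())  # plateau flatness measure
```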

  7. Accelerating Atmospheric Modeling Through Emerging Multi-core Technologies

    OpenAIRE

    Linford, John Christian

    2010-01-01

    The new generations of multi-core chipset architectures achieve unprecedented levels of computational power while respecting physical and economical constraints. The cost of this power is bewildering program complexity. Atmospheric modeling is a grand-challenge problem that could make good use of these architectures if they were more accessible to the average programmer. To that end, software tools and programming methodologies that greatly simplify the acceleration of atmospheric modeling...

  8. Audit: Automated Disk Investigation Toolkit

    Directory of Open Access Journals (Sweden)

    Umit Karabiyik

    2014-09-01

    Full Text Available Software tools designed for disk analysis play a critical role today in forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this paper, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology and expert investigators. Our proof of concept design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image.

  9. Falling Less in Kansas: Development of a Fall Risk Reduction Toolkit

    Directory of Open Access Journals (Sweden)

    Teresa S. Radebaugh

    2011-01-01

    Full Text Available Falls are a serious health risk for older adults. But for those living in rural and frontier areas of the USA, the risks are higher because of limited access to health care providers and resources. This study employed a community-based participatory research approach to develop a fall prevention toolkit to be used by residents of rural and frontier areas without the assistance of health care providers. Qualitative data were gathered from both key informant interviews and focus groups with a broad range of participants. Data analysis revealed that to be effective and accepted, the toolkit should be not only evidence based but also practical, low-cost, self-explanatory, and usable without the assistance of a health care provider. Materials must be engaging, visually interesting, empowering, sensitive to reading level, and appropriate for low-vision users. These findings should be useful to other researchers developing education and awareness materials for older adults in rural areas.

  10. Evidence-based Metrics Toolkit for Measuring Safety and Efficiency in Human-Automation Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — APRIL 2016 NOTE: Principal Investigator moved to Rice University in mid-2015. Project continues at Rice with the same title (Evidence-based Metrics Toolkit for...

  11. Acceleration (Deceleration) Model Supporting Time Delays to Refresh Data

    Directory of Open Access Journals (Sweden)

    José Gerardo Carrillo González

    2018-04-01

    Full Text Available This paper proposes a mathematical model to regulate the acceleration (deceleration) applied by self-driving vehicles in car-following situations. A virtual environment is designed to test the model in different circumstances: (1) the followers decelerate in time if the leader decelerates, considering a time delay of up to 5 s to refresh data (vehicles' position coordinates) required by the model; (2) with the intention of optimizing space, the vehicles are grouped in platoons, where 3 s of time delay (to update data) is supported if the vehicles have a centre-to-centre spacing of 20 m, and a time delay of 1 s is supported at a spacing of 6 m (considering a maximum speed of 20 m/s in both cases); and (3) an algorithm is presented to manage the vehicles' priority at a traffic intersection, where the model regulates the vehicles' acceleration (deceleration) and a balance in the number of vehicles passing from each side is achieved.
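
    The first test scenario can be mimicked with a toy car-following loop in which the follower's acceleration is computed from the leader's position as it was delay_s seconds ago. The gains, spacing, and braking values below are invented, not the paper's calibrated model:

```python
import numpy as np

def simulate(n_steps=2000, dt=0.01, delay_s=1.0, target_gap=6.0, k=0.5):
    """Toy car-following loop: the follower reacts to stale leader data."""
    lag = int(delay_s / dt)
    x_lead, v_lead = 40.0, 20.0
    x_fol, v_fol = 20.0, 20.0                      # follower starts 20 m behind
    # pre-fill the delay buffer as if the leader had been cruising
    lead_history = [x_lead - v_lead * dt * i for i in range(lag + 1, -1, -1)]
    for step in range(n_steps):
        a_lead = -3.0 if step * dt > 2.0 else 0.0  # leader brakes at t = 2 s
        v_lead = max(0.0, v_lead + a_lead * dt)
        x_lead += v_lead * dt
        lead_history.append(x_lead)
        # follower sees only the leader's position from delay_s seconds ago
        x_seen = lead_history[-1 - lag]
        v_seen = (x_seen - lead_history[-2 - lag]) / dt
        gap_err = x_seen - x_fol - target_gap
        a_fol = np.clip(k * gap_err + 1.5 * (v_seen - v_fol), -5.0, 2.0)
        v_fol = max(0.0, v_fol + a_fol * dt)
        x_fol += v_fol * dt
    return x_lead - x_fol

print(simulate())   # final leader-follower spacing in metres
```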

  12. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Data can be numerical and/or character, stored in raw data files, databases, or streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems

  13. Mathematical model of accelerator output characteristics and their calculation on a computer

    International Nuclear Information System (INIS)

    Mishulina, O.A.; Ul'yanina, M.N.; Kornilova, T.V.

    1975-01-01

    A mathematical model of the output characteristics of a linear accelerator is described. The model is a system of differential equations. The presence of phase limitations is a specific feature of this problem formulation; it makes it possible to ensure higher simulation accuracy and to determine the capture coefficient. An algorithm is elaborated for computing the output characteristics based on the suggested mathematical model. The output characteristics of the accelerator are the capture coefficient, the expectation of the coordinate characterizing the average phase of the beam particles, the expectation of the coordinate characterizing the average reciprocal relative velocity of the beam particles, and the dispersions of these coordinates. Calculation methods for the accelerator output characteristics are described in detail. The computations were performed on the BESM-6 computer, with a characteristics computing time of 2 min 20 sec. The relative error of parameter computation averages 10⁻²

  14. Achievements in ISICs/SAPP collaborations for electromagnetic modeling of accelerators

    International Nuclear Information System (INIS)

    Lee Liequan; Ge Lixin; Li Zenghai; Ng, Cho; Schussman, Greg; Ko, Kwok

    2005-01-01

    SciDAC provides the unique opportunity and the resources for the Electromagnetic System Simulations (ESS) component of High Energy Physics (HEP)'s Accelerator Science and Technology (AST) project to work with researchers in the Integrated Software Infrastructure Centres (ISICs) and Scientific Application Pilot Program (SAPP) to overcome challenging barriers in computer science and applied mathematics in order to perform the large-scale simulations required to support the ongoing R and D efforts on accelerators across the Office of Science. This paper presents the resultant achievements made under SciDAC in important areas of computational science relevant to electromagnetic modelling of accelerators which include nonlinear eigensolvers, shape optimization, adaptive mesh refinement, parallel meshing, and visualization

  15. MX: A beamline control system toolkit

    Science.gov (United States)

    Lavender, William M.

    2000-06-01

    The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.

  16. 3rd International Conference on Particle Physics Beyond the Standard Model : Accelerator, Non-Accelerator and Space Approaches

    CERN Document Server

    Beyond The Desert 2002

    2003-01-01

    The third conference on particle physics beyond the Standard Model (BEYOND THE DESERT'02 - Accelerator, Non-accelerator and Space Approaches) was held during 2-7 June, 2002 in the Finnish town of Oulu, almost at the northern Arctic Circle. It was the first of the BEYOND conference series held outside Germany (CERN Courier March 2003, pp. 29-30). Traditionally the scientific programme of the BEYOND conferences, brought to life in 1997 (see CERN Courier, November 1997, pp. 16-18), covers almost all topics of modern particle physics (see contents).

  17. Comparison of pellet acceleration model results to experimentally observed penetration depths

    Energy Technology Data Exchange (ETDEWEB)

    Szepesi, T., E-mail: szepesi.tamas@gmail.co [KFKI - Research Institute for Particle and Nuclear Physics, EURATOM Association, MTA KFKI-RMKI, P.O. Box 49, H-1525 Budapest-114 (Hungary); Kalvin, S.; Kocsis, G. [KFKI - Research Institute for Particle and Nuclear Physics, EURATOM Association, MTA KFKI-RMKI, P.O. Box 49, H-1525 Budapest-114 (Hungary); Lang, P.T. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstr. 2, 85748 Garching (Germany); Senichenkov, I. [Saint Petersburg State Polytechnical University, Polytehnicheskaya 29, 195251 St. Petersburg (Russian Federation)

    2009-06-15

    Cryogenic hydrogen isotope fuelling pellets were observed to undergo strong radial acceleration in the confined plasma. The pellet acceleration is believed to originate from drift effects: the ionised part of the pellet cloud is affected by the grad-B drift, and the cloud therefore becomes polarised. The E x B drift then deforms the pellet cloud so that it can no longer follow the original flux bundle; this results in less efficient shielding on the pellet's HFS region, where the subsequently enhanced ablation pushes the pellet towards the LFS, like a rocket. In order to study this effect, a simple and a comprehensive ablation model were developed. Results from both models show quantitatively acceptable agreement with ASDEX-Upgrade experiments concerning trajectory curvature, corresponding to radial acceleration in the range of 10^4-10^7 m/s^2.

  18. OpenFOAM Modeling of Particle Heating and Acceleration in Cold Spraying

    Science.gov (United States)

    Leitz, K.-H.; O'Sullivan, M.; Plankensteiner, A.; Kestler, H.; Sigl, L. S.

    2018-01-01

    In cold spraying, a powder material is accelerated and heated in the gas flow of a supersonic nozzle to velocities and temperatures that are sufficient to obtain cohesion of the particles to a substrate. The deposition efficiency of the particles is significantly determined by their velocity and temperature. Particle velocity correlates with the amount of kinetic energy that is converted to plastic deformation and thermal heating. The initial particle temperature significantly influences the mechanical properties of the particle. Velocity and temperature of the particles have nonlinear dependence on the pressure and temperature of the gas at the nozzle entrance. In this contribution, a simulation model based on the reactingParcelFoam solver of OpenFOAM is presented and applied for an analysis of particle velocity and temperature in the cold spray nozzle. The model combines a compressible description of the gas flow in the nozzle with a Lagrangian particle tracking. The predictions of the simulation model are verified based on an analytical description of the gas flow, the particle acceleration and heating in the nozzle. Based on experimental data, the drag model according to Plessis and Masliyah is identified to be best suited for OpenFOAM modeling particle heating and acceleration in cold spraying.
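
    Along a streamline, the Lagrangian particle tracking reduces to a drag-driven momentum balance plus a convective heat balance. The sketch below uses the classical Schiller-Naumann drag and Ranz-Marshall heat-transfer correlations as generic stand-ins (the paper adopts the Plessis and Masliyah drag model, which is not reproduced); gas conditions and particle properties are illustrative:

```python
import numpy as np

def step_particle(v_p, T_p, v_g, T_g, rho_g, mu_g, d_p, rho_p, cp_p, k_g, dt):
    """One explicit step of particle velocity and temperature in a gas stream."""
    rel = v_g - v_p
    Re = rho_g * abs(rel) * d_p / mu_g
    cd = 24.0 / max(Re, 1e-9) * (1.0 + 0.15 * Re**0.687)  # Schiller-Naumann
    m_p = rho_p * np.pi * d_p**3 / 6.0
    area = np.pi * d_p**2 / 4.0                           # frontal area for drag
    a = 0.5 * rho_g * cd * area * rel * abs(rel) / m_p
    Nu = 2.0 + 0.6 * np.sqrt(Re) * 0.7 ** (1.0 / 3.0)     # Ranz-Marshall, Pr ~ 0.7
    h = Nu * k_g / d_p
    dT = h * np.pi * d_p**2 * (T_g - T_p) / (m_p * cp_p)  # full surface for heat
    return v_p + a * dt, T_p + dT * dt

# 25 um titanium particle in a hypothetical 800 m/s, 500 K nitrogen stream
v_p, T_p = 0.0, 300.0
for _ in range(20000):   # 2 ms of flight with a 0.1 us step
    v_p, T_p = step_particle(v_p, T_p, 800.0, 500.0, 0.5, 2.5e-5,
                             25e-6, 4500.0, 520.0, 0.04, 1e-7)
print(v_p, T_p)
```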

  19. VariVis: a visualisation toolkit for variation databases

    Directory of Open Access Journals (Sweden)

    Smith Timothy D

    2008-04-01

    Full Text Available Abstract Background With the completion of the Human Genome Project and recent advancements in mutation detection technologies, the volume of data available on genetic variations has risen considerably. These data are stored in online variation databases and provide important clues to the cause of diseases and potential side effects or resistance to drugs. However, the data presentation techniques employed by most of these databases make them difficult to use and understand. Results Here we present a visualisation toolkit that can be employed by online variation databases to generate graphical models of gene sequence with corresponding variations and their consequences. The VariVis software package can run on any web server capable of executing Perl CGI scripts and can interface with numerous Database Management Systems and "flat-file" data files. VariVis produces two easily understandable graphical depictions of any gene sequence and matches these with variant data. While developed with the goal of improving the utility of human variation databases, the VariVis package can be used in any variation database to enhance utilisation of, and access to, critical information.

  20. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    Science.gov (United States)

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community with the purpose being to provide targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. Copyright © 2016. Published by Elsevier Ltd.

  1. The Python ARM Radar Toolkit (Py-ART, a Library for Working with Weather Radar Data in the Python Programming Language

    Directory of Open Access Journals (Sweden)

    Jonathan J Helmus

    2016-07-01

    Full Text Available The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. The source code for the toolkit is available on GitHub and is distributed under a BSD license.
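
    A typical Py-ART session reads a radar volume into the package's common data model and renders a field with the bundled display classes. A minimal usage sketch (the file name is a placeholder, and reader support for a given format depends on the installed version):

```python
import matplotlib.pyplot as plt
import pyart

# read a radar volume into Py-ART's common radar object (file name hypothetical)
radar = pyart.io.read("example_radar_volume.nc")

# plot the lowest sweep of the reflectivity field
display = pyart.graph.RadarDisplay(radar)
plt.figure(figsize=(6, 5))
display.plot("reflectivity", sweep=0)
plt.show()
```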

  2. Back-reaction and effective acceleration in generic LTB dust models

    International Nuclear Information System (INIS)

    Sussman, Roberto A

    2011-01-01

    We provide a thorough examination of the conditions for the existence of back-reaction and an 'effective' acceleration (in the context of Buchert's averaging formalism) in regular generic spherically symmetric Lemaitre-Tolman-Bondi (LTB) dust models. By considering arbitrary spherical comoving domains, we verify rigorously the fulfillment of these conditions expressed in terms of suitable scalar variables that are evaluated at the boundary of every domain. Effective deceleration necessarily occurs in all domains in (a) the asymptotic radial range of models converging to a FLRW background, (b) the asymptotic time range of non-vacuum hyperbolic models, (c) LTB self-similar solutions and (d) near a simultaneous big bang. Accelerating domains are proven to exist in the following scenarios: (i) central vacuum regions, (ii) central (non-vacuum) density voids, (iii) the intermediate radial range of models converging to a FLRW background, (iv) the asymptotic radial range of models converging to a Minkowski vacuum and (v) domains near and/or intersecting a non-simultaneous big bang. All these scenarios occur in hyperbolic models with negative averaged and local spatial curvature, though scenarios (iv) and (v) are also possible in low density regions of a class of elliptic models in which the local spatial curvature is negative but its average is positive. Rough numerical estimates between -0.003 and -0.5 were found for the effective deceleration parameter. While the existence of accelerating domains cannot be ruled out in models converging to an Einstein-de Sitter background and in domains undergoing gravitational collapse, the conditions for this are very restrictive. The results obtained may provide important theoretical clues on the effects of back-reaction and averaging in more general non-spherical models. Communicated by L Andersson (paper)
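
    For reference, the 'effective acceleration' discussed here is defined through Buchert's averaged Raychaudhuri equation for irrotational dust, in which the kinematical back-reaction term can make the averaged scale factor accelerate. The standard form (not specific to LTB) is:

```latex
% Buchert's averaged Raychaudhuri equation for irrotational dust on a
% comoving domain D; theta is the expansion scalar, sigma^2 the shear scalar.
3\,\frac{\ddot{a}_{\mathcal{D}}}{a_{\mathcal{D}}}
   = -4\pi G\,\langle \rho \rangle_{\mathcal{D}} + \mathcal{Q}_{\mathcal{D}},
\qquad
\mathcal{Q}_{\mathcal{D}}
   = \frac{2}{3}\Bigl(\langle \theta^{2}\rangle_{\mathcal{D}}
        - \langle \theta \rangle_{\mathcal{D}}^{2}\Bigr)
     - 2\,\langle \sigma^{2}\rangle_{\mathcal{D}} .

% An 'effective acceleration' on D means the back-reaction term outweighs
% the averaged density, i.e. a negative effective deceleration parameter:
\mathcal{Q}_{\mathcal{D}} > 4\pi G\,\langle \rho \rangle_{\mathcal{D}}
\;\Longleftrightarrow\;
q_{\mathcal{D}} \equiv
   -\frac{\ddot{a}_{\mathcal{D}}\, a_{\mathcal{D}}}{\dot{a}_{\mathcal{D}}^{2}} < 0 .
```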

  3. The Montage Image Mosaic Toolkit As A Visualization Engine.

    Science.gov (United States)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has since 2003 been used to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts of large images and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples:
    1. Creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created.
    2. Integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to Javascript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed.
    3. Creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image.
    Montage is funded by the National Science Foundation under Grant Number ACI-1642453. JS

  4. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model for group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with the abstract agents, which permit the model to include social concepts (gang) or institutional concepts (school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkit to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to different scenario developments for inner-city gang recruitment.

  5. Study of particle transport in a high power spallation target for an accelerator-driven transmutation system

    International Nuclear Information System (INIS)

    Shetty, Nikhil Vittal

    2013-01-01

    AGATE is a project envisaged to demonstrate the feasibility of transmutation in a gas (helium) cooled accelerator-driven system using solid spallation target. Development of the spallation target module and assessing its safety aspects are studied in this work. According to the AGATE concept parameters, 600 MeV protons are delivered on to the segmented tungsten spallation target. The Monte Carlo toolkit Geant4 has been used in the simulation of particle transport. Binary cascade is used to simulate intra-nuclear cascades, along with the G4NDL neutron data library for low energy neutrons (<20 MeV).

  6. Study of particle transport in a high power spallation target for an accelerator-driven transmutation system

    Energy Technology Data Exchange (ETDEWEB)

    Shetty, Nikhil Vittal

    2013-01-31

    AGATE is a project envisaged to demonstrate the feasibility of transmutation in a gas (helium) cooled accelerator-driven system using solid spallation target. Development of the spallation target module and assessing its safety aspects are studied in this work. According to the AGATE concept parameters, 600 MeV protons are delivered on to the segmented tungsten spallation target. The Monte Carlo toolkit Geant4 has been used in the simulation of particle transport. Binary cascade is used to simulate intra-nuclear cascades, along with the G4NDL neutron data library for low energy neutrons (<20 MeV).

  7. Verification of mechanistic-empirical design models for flexible pavements through accelerated pavement testing.

    Science.gov (United States)

    2014-08-01

    The Midwest States Accelerated Pavement Testing Pooled Fund Program, financed by the highway departments of Kansas, Iowa, and Missouri, has supported an accelerated pavement testing (APT) project to validate several models incorporated in the NCH...

  8. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    Science.gov (United States)

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
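
    A minimal usage sketch is shown below; the input file names are placeholders, and the readfile interface mirrors Pybel's, as documented in ODDT's toolkit layer:

        # Minimal ODDT sketch; 'receptor.pdb' and 'ligands.sdf' are placeholders.
        import oddt
        from oddt.fingerprints import InteractionFingerprint

        protein = next(oddt.toolkit.readfile('pdb', 'receptor.pdb'))
        protein.protein = True   # mark the molecule as the receptor

        for ligand in oddt.toolkit.readfile('sdf', 'ligands.sdf'):
            ifp = InteractionFingerprint(ligand, protein)
            print(ligand.title, int(ifp.sum()), "contact bits")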

  9. SwingStates: adding state machines to the swing toolkit

    OpenAIRE

    Appert, Caroline; Beaudouin-Lafon, Michel

    2006-01-01

    This article describes SwingStates, a library that adds state machines to the Java Swing user interface toolkit. Unlike traditional approaches, which use callbacks or listeners to define interaction, state machines provide a powerful control structure and localize all of the interaction code in one place. SwingStates takes advantage of Java's inner classes, providing programmers with a natural syntax and making it easier to follow and debug the resulting code. SwingSta...

  10. Random Walk Model for Cell-To-Cell Misalignments in Accelerator Structures

    International Nuclear Information System (INIS)

    Stupakov, Gennady

    2000-01-01

    Due to manufacturing and construction errors, cells in accelerator structures can be misaligned relative to each other. As a consequence, the beam generates a transverse wakefield even when it passes through the structure on axis. The most important effect is the long-range transverse wakefield that deflects the bunches and causes growth of the bunch train projected emittance. In this paper, the effect of the cell-to-cell misalignments is evaluated using a random walk model that assumes that each cell is shifted by a random step relative to the previous one. The model is compared with measurements of a few accelerator structures
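
    The scaling behind the model is easy to reproduce numerically: if each cell is shifted by an independent random step of rms size sigma relative to the previous one, the rms misalignment of cell n grows like sigma times the square root of n. A short sketch with assumed parameters:

        import numpy as np

        rng = np.random.default_rng(1)
        n_cells, n_structures, sigma = 200, 5000, 5e-6   # 5 um rms step (assumed)
        steps = rng.normal(0.0, sigma, size=(n_structures, n_cells))
        offsets = np.cumsum(steps, axis=1)         # cell-to-cell random walk
        rms = offsets.std(axis=0)                  # rms misalignment per cell
        print(rms[-1], sigma * np.sqrt(n_cells))   # ~ equal: sqrt(n) growth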

  11. EURADOS intercomparison exercise on Monte Carlo modelling of a medical linear accelerator.

    Science.gov (United States)

    Caccia, Barbara; Le Roy, Maïwenn; Blideanu, Valentin; Andenna, Claudio; Arun, Chairmadurai; Czarnecki, Damian; El Bardouni, Tarek; Gschwind, Régine; Huot, Nicolas; Martin, Eric; Zink, Klemens; Zoubair, Mariam; Price, Robert; de Carlan, Loïc

    2017-01-01

    In radiotherapy, Monte Carlo (MC) methods are considered a gold standard to calculate accurate dose distributions, particularly in heterogeneous tissues. EURADOS organized an international comparison with six participants applying different MC models to a real medical linear accelerator and to one homogeneous and four heterogeneous dosimetric phantoms. The aim of this exercise was to identify, by comparison of different MC models with a complete experimental dataset, critical aspects useful for MC users in building and calibrating a simulation and performing a dosimetric analysis. Results show on average a good agreement between simulated and experimental data. However, some significant differences have been observed, especially in the presence of heterogeneities. Moreover, the results are critically dependent on the different choices of the initial electron source parameters. This intercomparison allowed the participants to identify some critical issues in MC modelling of a medical linear accelerator. Therefore, the complete experimental dataset assembled for this intercomparison will be available to all MC users, providing them with an opportunity to build and calibrate a model of a real medical linear accelerator.

  12. Modeling of a new 2D Acceleration Sensor Array using SystemC-AMS

    International Nuclear Information System (INIS)

    Markert, Erik; Dienel, Marco; Herrmann, Goeran; Mueller, Dietmar; Heinkel, Ulrich

    2006-01-01

    This paper presents an approach for modeling and simulation of a new 2D acceleration sensor array using SystemC-AMS. The sensor array consists of six single acceleration sensors with different detection axes. Each single sensor comprises four capacitive segments and one mass segment, aligned in a semicircle. The redundant sensor information is used for offset correction. Modeling of the single sensors is achieved by simplifying the sensor structure into 11 points and using analytic equations for capacitance changes, currents and torques. This model was extended with a PWM feedback circuit to keep the sensor displacement in a linear region. In this paper the single-sensor model is duplicated, considering different positions of the seismic mass and thus different detection axes for the single sensors. The measured accelerations of the sensors are merged with different weights depending on the orientation; this also reduces the calculation effort
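
    The weighted merging of the redundant readings can be pictured as a least-squares fusion: each single sensor measures the projection of the 2D acceleration onto its detection axis, and the acceleration vector is recovered from the overdetermined system. Axis angles and the noise level below are assumed for illustration:

        import numpy as np

        rng = np.random.default_rng(0)
        angles = np.deg2rad([0, 30, 60, 90, 120, 150])      # assumed detection axes
        U = np.column_stack([np.cos(angles), np.sin(angles)])  # projection matrix
        a_true = np.array([1.2, -0.4])                      # true 2D acceleration
        meas = U @ a_true + rng.normal(0, 0.05, size=6)     # noisy 1D projections
        a_est, *_ = np.linalg.lstsq(U, meas, rcond=None)    # merge (equal weights)
        print(a_est)   # close to a_true; redundancy averages out offsets and noise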

  13. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Directory of Open Access Journals (Sweden)

    Kota Kasahara

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to the complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named the ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
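
    Since the ion-binding state graph is emitted in standard GML, any graph library can produce or consume it; a sketch with networkx (the state labels here are invented, whereas IBiSA_tools derives real ones from the trajectory):

        # Writing and reading an ion-binding state graph in GML with networkx.
        import networkx as nx

        g = nx.DiGraph()
        g.add_edge("S0:cavity", "S1:site1", weight=120)   # observed transitions
        g.add_edge("S1:site1", "S2:site2", weight=95)
        g.add_edge("S2:site2", "S0:cavity", weight=90)
        nx.write_gml(g, "binding_states.gml")             # readable by Cytoscape

        h = nx.read_gml("binding_states.gml")
        print(list(h.edges(data=True)))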

  14. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Science.gov (United States)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.
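
    The geometric core of linear vertex fitting reduces to a small linear solve: given tracks modeled as straight lines (point p_i, unit direction d_i), the least-squares vertex minimizes the summed squared distances to the lines. This conceptual sketch is not the RAVE API, which wraps full Kalman-filter and robust fits:

        import numpy as np

        def vertex_fit(points, dirs):
            """Least-squares vertex of straight-line tracks (p_i, unit d_i)."""
            A = np.zeros((3, 3)); b = np.zeros(3)
            for p, d in zip(points, dirs):
                P = np.eye(3) - np.outer(d, d)   # projector orthogonal to track
                A += P; b += P @ p
            return np.linalg.solve(A, b)

        # two tracks crossing at (1, 1, 0) (toy numbers)
        pts = [np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0])]
        ds  = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
        print(vertex_fit(pts, ds))   # -> close to the true crossing point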

  15. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline (NREL)

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as being time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.

  16. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    International Nuclear Information System (INIS)

    Cirrone, G.A.P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M.G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been built. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another aim of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth dose distributions (Bragg peak and spread-out Bragg peak) and the lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.
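
    The spread-out Bragg peak mentioned above is, conceptually, a superposition of range-shifted pristine peaks with non-negative weights chosen to flatten the dose across the target depth interval. A toy illustration follows; the pristine-peak shape and dimensions are invented placeholders, not CATANA beam data:

        import numpy as np
        from scipy.optimize import nnls

        z = np.linspace(0.0, 35.0, 351)            # depth (mm), toy scale

        def pristine(z, r):
            """Toy pristine Bragg curve with range r (placeholder shape)."""
            plateau = 0.3 + 0.02 * np.clip(z, 0.0, r)
            peak = np.exp(-0.5 * ((z - r) / 1.0) ** 2)
            return np.where(z <= r + 3.0, plateau + peak, 0.0)

        ranges = np.arange(20, 31)                 # range-shifted pristine peaks
        A = np.column_stack([pristine(z, r) for r in ranges])
        target = ((z >= 20.0) & (z <= 30.0)).astype(float)   # flat dose goal
        w, _ = nnls(A, target)                     # non-negative peak weights
        sobp = A @ w
        print(sobp[(z >= 21.0) & (z <= 29.0)].std())   # small std -> flat SOBP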

  17. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    Science.gov (United States)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been built. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another aim of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth dose distributions (Bragg peak and spread-out Bragg peak) and the lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.

  18. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    Energy Technology Data Exchange (ETDEWEB)

    Cirrone, G.A.P. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Cuttone, G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Di Rosa, F. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Raffaele, L. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Russo, G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Guatelli, S. [Istituto Nazionale di Fisica Nucleare, Sezione di Genova, Via Dodecaneso 33, Genova (Italy); Pia, M.G. [Istituto Nazionale di Fisica Nucleare, Sezione di Genova, Via Dodecaneso 33, Genova (Italy)

    2006-01-15

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been built. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another aim of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth dose distributions (Bragg peak and spread-out Bragg peak) and the lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.

  19. Goodwin accelerator model revisited with fixed time delays

    Science.gov (United States)

    Matsumoto, Akio; Merlone, Ugo; Szidarovszky, Ferenc

    2018-05-01

    The dynamics of Goodwin's accelerator business cycle model are reconsidered. The model is characterized by a nonlinear accelerator and an investment time delay. The role of the nonlinearity in the birth of persistent oscillations is fully discussed in the existing literature. On the other hand, little has yet been revealed about the role of the delay. The purpose of this paper is to show that the delay really matters. In the original framework of Goodwin [6], it is first demonstrated that there is a threshold value of the delay: limit cycles arise for values smaller than the threshold, and sawtooth oscillations for larger values. In the extended framework, in which a consumption or saving delay is introduced in addition to the investment delay, three main results are demonstrated under the assumption that the investment and consumption delays have identical length. First, the dynamics with a consumption delay are basically the same as those of the single-delay model. Second, in the case of a saving delay, the steady state can coexist with the stable and unstable limit cycles in the stable case. Third, in the unstable case, there is an interval of delay in which either the limit cycle or the sawtooth oscillation emerges, depending on the choice of the constant initial function.
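
    The qualitative role of a fixed delay can be reproduced with a stylized scalar caricature, not Goodwin's exact system: a delayed negative feedback with a bounded (nonlinear-accelerator-like) response, eps * y'(t) = -y(t) - beta * tanh(y(t - tau)), integrated by Euler stepping with a history buffer. All parameter values are illustrative:

        import numpy as np

        # eps * y'(t) = -y(t) - beta * tanh(y(t - tau)); schematic caricature only
        eps, beta, tau, dt, T = 0.5, 2.0, 1.0, 0.001, 60.0
        n_lag = int(tau / dt)
        y = np.empty(int(T / dt))
        y[:n_lag + 1] = 0.1                        # constant initial function
        for i in range(n_lag, y.size - 1):
            y[i + 1] = y[i] + dt * (-y[i] - beta * np.tanh(y[i - n_lag])) / eps
        print(y[-5000:].min(), y[-5000:].max())    # sustained oscillation for this tau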

  20. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability; further developing the tools; and improving the functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  1. A framework for a teaching toolkit in entrepreneurship education.

    Science.gov (United States)

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.

  2. OGSA Globus Toolkits evaluation activity at CERN

    CERN Document Server

    Chen, D; Foster, D; Kalyaev, V; Kryukov, A; Lamanna, M; Pose, V; Rocha, R; Wang, C

    2004-01-01

    An Open Grid Service Architecture (OGSA) Globus Toolkit 3 (GT3) evaluation group has been active at CERN since GT3 became available in an early beta version (spring 2003). This activity focuses on the evaluation of the technology as promised by the OGSA/OGSI paradigm, and on GT3 in particular. The goal is to study this new technology and its implications in order to provide useful input for the large grid initiatives active in the LHC Computing Grid (LCG) project. A particular effort has been devoted to investigating performance and deployment issues, keeping in mind the LCG requirements, in particular scalability and robustness.

  3. REST: a toolkit for resting-state functional magnetic resonance imaging data processing.

    Directory of Open Access Journals (Sweden)

    Xiao-Wei Song

    Resting-state fMRI (RS-fMRI) has been drawing more and more attention in recent years. However, a publicly available, systematically integrated and easy-to-use tool for RS-fMRI data processing has been lacking. We developed a toolkit for the analysis of RS-fMRI data, namely the RESting-state fMRI data analysis Toolkit (REST). REST was developed in MATLAB with a graphical user interface (GUI). After data preprocessing with SPM or AFNI, several analytic methods can be performed in REST, including functional connectivity analysis based on linear correlation, regional homogeneity, amplitude of low frequency fluctuation (ALFF), and fractional ALFF. A few additional functions were implemented in REST, including a DICOM sorter, linear trend removal, bandpass filtering, time course extraction, regression of covariates, an image calculator, statistical analysis, a slice viewer for result visualization, multiple comparison correction, etc. REST is an open-source package and is freely available at http://www.restfmri.net.
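
    REST itself is MATLAB code, but the ALFF measure it implements is easy to state: the amplitude of low-frequency fluctuation is the summed square-rooted power of a voxel time series over roughly 0.01-0.08 Hz, and fractional ALFF normalizes by the full-band power. A conceptual Python sketch with a synthetic time series (the TR and band edges are the conventional values, not values read from REST):

        import numpy as np
        from scipy.signal import periodogram

        tr = 2.0                                   # repetition time (s), assumed
        t = np.arange(240) * tr                    # 240 volumes, synthetic series
        rng = np.random.default_rng(3)
        ts = np.sin(2 * np.pi * 0.05 * t) + rng.normal(0.0, 1.0, t.size)

        f, pxx = periodogram(ts, fs=1.0 / tr)
        band = (f >= 0.01) & (f <= 0.08)
        alff = np.sqrt(pxx[band]).sum()            # low-frequency amplitude
        falff = alff / np.sqrt(pxx[f > 0]).sum()   # fractional ALFF
        print(alff, falff)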

  4. A model for the determination of the nominal potential for a linear accelerator

    International Nuclear Information System (INIS)

    Gutt, F.; Silva, P.; Guerrero, R.; Diaz, J.; Colmenares, J.

    1998-01-01

    The objective of the present work is to find a physical-mathematical model based on the ratio of the dose percentages at 10 and 20 cm depth, at 100 cm source-surface distance and a 10 x 10 cm² field. Literature data from newly manufactured accelerators and from those in use in hospitals were utilized, which allows the model to be tested under different conditions. Our objective is only to obtain a model that verifies the nominal potential of a linear accelerator, without pretending that such a model can be used to calculate any factor for the determination of absorbed dose. (Author)
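
    The ratio in question is the familiar photon beam quality index: PDD20/PDD10 maps onto TPR20,10, for example via the common linear fit TPR20,10 = 1.2661 (PDD20/PDD10) - 0.0595 quoted in IAEA TRS-398. A sketch with assumed depth-dose values:

        def tpr_20_10(pdd20, pdd10):
            """Beam quality index from depth doses at 20 and 10 cm (TRS-398 fit)."""
            return 1.2661 * (pdd20 / pdd10) - 0.0595

        # assumed percentage depth doses for a ~6 MV photon beam (illustrative)
        print(tpr_20_10(pdd20=38.5, pdd10=66.8))   # ~0.67, typical of 6 MV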

  5. SimPackJ/S: a web-oriented toolkit for discrete event simulation

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2002-07-01

    SimPackJ/S, the JavaScript and Java version of SimPack, is a collection of JavaScript and Java libraries and executable programs for computer simulation. The main purpose of creating SimPackJ/S is to allow existing SimPack users to expand their simulation areas and to provide future users with a freeware simulation toolkit for simulating and modeling systems in web environments. One goal of this paper is to introduce SimPackJ/S. The other goal is to propose translation rules for converting C to JavaScript and Java. Most sections demonstrate the translation rules with examples. In addition, we discuss a 3D dynamic system model and give an overview of an approach to 3D dynamic systems using SimPackJ/S. We explain an interface between SimPackJ/S and the 3D language--Virtual Reality Modeling Language (VRML). This paper documents how to translate C to JavaScript and Java and how to utilize SimPackJ/S within a 3D web environment.
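
    The event-scheduling style that SimPack popularized fits in a few lines of any language. Below is a minimal Python analogue of the future-event-list pattern (illustrative only, not SimPackJ/S code):

        import heapq

        def run(fel, until):
            """Minimal future-event-list loop; events: (time, tiebreak, name, action)."""
            clock = 0.0
            while fel and clock <= until:       # may overshoot 'until' by one event
                clock, _, name, action = heapq.heappop(fel)
                action(clock)
            return clock

        fel, uid = [], iter(range(10**9))

        def arrival(t):
            print(f"{t:6.2f}  arrival")          # serve, then schedule the next one
            heapq.heappush(fel, (t + 1.5, next(uid), "arrival", arrival))

        heapq.heappush(fel, (0.0, next(uid), "arrival", arrival))
        run(fel, until=5.0)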

  6. A Teacher Tablet Toolkit to meet the challenges posed by 21st ...

    African Journals Online (AJOL)

    The course, as outcome, is presented as a Teacher Tablet Toolkit, designed to meet the challenges inherent to the 21st century rural technology enhanced teaching and learning environment. The paper documents and motivates design decisions, derived from literature and adapted through three iterations of a Design ...

  7. Dosimetry applications in GATE Monte Carlo toolkit.

    Science.gov (United States)

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  8. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    In this paper we present a tool that performs CUDA-accelerated LTL model checking. The tool exploits the parallel MAP algorithm, adjusted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL model checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss the limits of the tool and what we intend to do in the future to overcome them.
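
    The core of MAP (maximal accepting predecessors) is a propagation: every state receives the largest accepting state, under some total order, from which it is reachable, and an accepting state that is its own maximal accepting predecessor lies on an accepting cycle. The sketch below implements one propagation round sequentially; the full algorithm iterates after removing unsuccessful accepting states, and DiVinE-CUDA performs the propagation on the GPU:

        def max_accepting_predecessors(succ, accepting, order):
            """One MAP round: m[v] = largest accepting state (by 'order') that
            reaches v; an accepting v with m[v] == order[v] lies on a cycle."""
            m = {v: -1 for v in succ}
            changed = True
            while changed:                        # fixpoint propagation
                changed = False
                for u, vs in succ.items():
                    val = max(m[u], order[u] if u in accepting else -1)
                    for v in vs:
                        if val > m[v]:
                            m[v] = val
                            changed = True
            return [v for v in accepting if m[v] == order[v]]

        # toy automaton: state 2 is accepting and lies on the cycle 1 -> 2 -> 1
        succ = {0: [1], 1: [2], 2: [1], 3: []}
        print(max_accepting_predecessors(succ, {2}, {0: 0, 1: 1, 2: 2, 3: 3}))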

  9. Computational needs for modelling accelerator components

    International Nuclear Information System (INIS)

    Hanerfeld, H.

    1985-06-01

    The particle-in-cell code MASK is being used to model several different electron accelerator components. These studies are being used both to design new devices and to understand particle behavior within existing structures. Studies include the injector for the Stanford Linear Collider and the 50 megawatt klystron currently being built at SLAC. MASK is a 2D electromagnetic code which is used by SLAC both on our own IBM 3081 and on the CRAY X-MP at the NMFECC. Our experience with running MASK illustrates the need for supercomputers to continue work of the kind described. 3 refs., 2 figs

  10. Accelerated diffusion controlled creep of polycrystalline materials. Communication 1. Model of diffusion controlled creep acceleration

    International Nuclear Information System (INIS)

    Smirnova, E.S.; Chuvil'deev, V.N.

    1998-01-01

    A model is suggested which describes the influence of large-angle grain boundary migration on the diffusion-controlled creep rate in polycrystalline materials (Coble creep). The model is based on the concept that the free volume of a migrating boundary changes when dislocations distributed over the grain bulk are introduced into the boundary. Expressions are obtained for the grain boundary diffusion coefficient under conditions of boundary migration and for the parameter which characterizes the degree of Coble creep acceleration. A comparison is made between calculated and experimental data for Cd, Co and Fe

  11. Object Toolkit Version 4.3 User’s Manual

    Science.gov (United States)

    2016-12-31

    ... and with Nascap-2k. See the EPIC and Nascap-2k manuals for instructions. Most of the difficulties that users have encountered with Object Toolkit are... 12.3 Importing Components From an NX I-DEAS TMG ASCII VUFF File: Users of the NX I-DEAS TMG thermal analysis program can import the ASCII... Nascap-2k user interface. The meaning of these properties is discussed in the Nascap-2k User's Manual. [Figure 36: Detector Properties Dialog Box]

  12. The MOLGENIS toolkit : rapid prototyping of biosoftware at the push of a button

    NARCIS (Netherlands)

    Swertz, Morris A.; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K.; Kanterakis, Alexandros; Roos, Erik T.; Lops, Joris; Thorisson, Gudmundur A.; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J.; de Brock, Engbert O.; Jansen, Ritsert C.; Parkinson, Helen

    2010-01-01

    Background: There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly

  13. IChem: A Versatile Toolkit for Detecting, Comparing, and Predicting Protein-Ligand Interactions.

    Science.gov (United States)

    Da Silva, Franck; Desaphy, Jeremy; Rognan, Didier

    2018-03-20

    Structure-based ligand design requires an exact description of the topology of molecular entities under scrutiny. IChem is a software package that reflects the many contributions of our research group in this area over the last decade. It facilitates and automates many tasks (e.g., ligand/cofactor atom typing, identification of key water molecules) usually left to the modeler's choice. It therefore permits the detection of molecular interactions between two molecules in a very precise and flexible manner. Moreover, IChem enables the conversion of intricate three-dimensional (3D) molecular objects into simple representations (fingerprints, graphs) that facilitate knowledge acquisition at very high throughput. The toolkit is an ideal companion for setting up and performing many structure-based design computations. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  14. Proceedings of the Military Operational Medicine Research Program Return to Duty (RTD) Toolkit Expert Panel Workshop, 16-17 February 2017

    Science.gov (United States)

    2017-07-10

    ... Laboratory; 2 Oak Ridge Institute for Science and Education; United States Army Aeromedical Research Laboratory, Aircrew Health and Performance Division ... excluded from the RTD Toolkit Manual, 2) to identify any additional tasks and clinical assessments for inclusion in the RTD Toolkit Manual, and 3) to agree ... for Science and Education through an interagency agreement between the U.S. Department of Energy and the U.S. Army Medical Research and Materiel

  15. Optics elements for modeling electrostatic lenses and accelerator components: III. Electrostatic deflectors

    International Nuclear Information System (INIS)

    Brown, T.A.; Gillespie, G.H.

    2000-01-01

    Ion-beam optics models for simulating electrostatic prisms (deflectors) of different geometries have been developed for the envelope (matrix) computer code TRACE 3-D as a part of the development of a suite of electrostatic beamline element models which includes lenses, acceleration columns, quadrupoles and prisms. The models for electrostatic prisms are described in this paper. The electrostatic prism model options allow the first-order modeling of cylindrical, spherical and toroidal electrostatic deflectors. The application of these models in the development of ion-beam transport systems is illustrated through the modeling of a spherical electrostatic analyzer as a component of the new low-energy beamline at the Center for Accelerator Mass Spectrometry. Although initial tests following installation of the new beamline showed that the new spherical electrostatic analyzer was not behaving as predicted by these first-order models, operational conditions were found under which the analyzer now works properly as a double-focusing spherical electrostatic prism
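
    To first order, a spherical deflector acts in each transverse plane like a focusing channel of strength k = 1/rho, with rho the bend radius, which is why a 180-degree spherical analyzer is double focusing. A sketch of the corresponding 2x2 transfer matrix, reproducing the generic textbook first-order optics rather than the TRACE 3-D implementation:

        import numpy as np

        def spherical_deflector(rho, theta):
            """First-order transfer matrix (either transverse plane), k = 1/rho."""
            k, L = 1.0 / rho, rho * theta
            c, s = np.cos(k * L), np.sin(k * L)
            return np.array([[c, s / k], [-k * s, c]])

        M = spherical_deflector(rho=0.5, theta=np.pi)   # 180-degree analyzer
        print(np.round(M, 6))   # ~ -I: point-to-point imaging in both planes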

  16. J-TEXT-EPICS: An EPICS toolkit attempted to improve productivity

    International Nuclear Information System (INIS)

    Zheng, Wei; Zhang, Ming; Zhang, Jing; Zhuang, Ge

    2013-01-01

    Highlights: • Tokamak control applications can be developed in a very short period with J-TEXT-EPICS. • J-TEXT-EPICS enables users to build control applications with device-oriented functions. • J-TEXT-EPICS is fully compatible with the EPICS Channel Access protocol. • J-TEXT-EPICS can be easily extended by plug-ins and drivers. -- Abstract: The Joint Texas Experimental Tokamak (J-TEXT) team has developed a new software toolkit for building Experimental Physics and Industrial Control System (EPICS) control applications, called J-TEXT-EPICS. It aims to improve the development efficiency of control applications. With its device-oriented features, it can be used to set or obtain the configuration or status of a device as well as to invoke methods on a device. With its modularized design, its functions can be easily extended. J-TEXT-EPICS is completely compatible with the original EPICS Channel Access protocol and can be integrated into existing EPICS control systems smoothly. It is fully implemented in C#, thus it benefits from the abundant resources of the .NET Framework. The J-TEXT control system is built with this toolkit. This paper presents the design and implementation of J-TEXT-EPICS as well as its application in the J-TEXT control system

  17. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, W; Mitaroff, W; Moser, F; Pflugfelder, B; Riedel, H V [Austrian Academy of Sciences, Institute of High Energy Physics, A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at

    2008-07-15

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  18. Model-based security engineering for the internet of things

    OpenAIRE

    NEISSE RICARDO; STERI GARY; NAI FOVINO Igor; BALDINI Gianmarco; VAN HOESEL Lodewijk

    2015-01-01

    We propose in this chapter a Model-based Security Toolkit (SecKit) and methodology to address the control and protection of user data in the deployment of the Internet of Things (IoT). This toolkit takes a more general approach to security engineering, including risk analysis, establishment of aspect-specific trust relationships, and enforceable security policies. We describe the integrated metamodels used in the toolkit and the accompanying security engineering methodology for IoT systems...

  19. Doing accelerator physics using SDDS, UNIX, and EPICS

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.; Sereno, N.

    1995-01-01

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Controls System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization

  20. Needs assessment: blueprint for a nurse graduate orientation employer toolkit.

    Science.gov (United States)

    Cylke, Katherine

    2012-01-01

    Southern Nevada nurse employers are resistant to hiring new graduate nurses (NGNs) because of their difficulties in making the transition into the workplace. At the same time, employers consider nurse residencies cost-prohibitive. Therefore, an alternative strategy was developed to assist employers with increasing the effectiveness of existing NGN orientation programs. A needs assessment of NGNs, employers, and nursing educators was completed, and the results were used to develop a toolkit for employers.

  1. Virtual Accelerator for Accelerator Optics Improvement

    CERN Document Server

    Yan Yi Ton; Decker, Franz Josef; Ecklund, Stanley; Irwin, John; Seeman, John; Sullivan, Michael K; Turner, J L; Wienands, Ulrich

    2005-01-01

    Through determination of all quadrupole strengths and sextupole feed-downs by fitting quantities derivable from precision orbit measurement, one can establish a virtual accelerator that matches the real accelerator optics. These quantities (the phase advances, the Green's functions, and the coupling eigen-plane ellipse tilt angles and axis ratios) are obtained by analyzing turn-by-turn Beam Position Monitor (BPM) data with a model-independent analysis (MIA). Instead of trying to identify magnet errors, a limited number of quadrupoles are chosen for optimized strength adjustment to improve the virtual accelerator optics, and the adjustments are then applied to the real accelerator accordingly. These processes have been successfully applied to the PEP-II rings for beta-beating fixes, phase and working tune adjustments, and linear coupling reduction to improve PEP-II luminosity.
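
    The MIA step can be sketched compactly: arrange the turn-by-turn BPM readings into a turns-by-BPMs matrix and take its singular value decomposition; the two leading spatial modes span the betatron motion, from which per-BPM phases (and, with more work, Green's functions and coupling parameters) are derived. A toy version with synthetic data:

        import numpy as np

        rng = np.random.default_rng(7)
        n_turns, n_bpm, tune = 1024, 40, 0.31          # synthetic lattice values
        turns = np.arange(n_turns)[:, None]
        phase = np.linspace(0, 4 * np.pi, n_bpm)[None, :]   # mock phase advances
        beta = 1.0 + 0.3 * np.sin(3 * phase)                # mock beta beating
        B = np.sqrt(beta) * np.cos(2 * np.pi * tune * turns + phase)
        B += rng.normal(0, 0.01, B.shape)                   # BPM noise

        U, s, Vt = np.linalg.svd(B - B.mean(axis=0), full_matrices=False)
        print(s[:4])                        # two dominant values: betatron modes
        phases = np.arctan2(Vt[1], Vt[0])   # per-BPM phase, up to a common rotation
        print(phases[:5])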

  2. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of SystemVerilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  3. Solving the quasi-static field model of the pulse-line accelerator; relationship to a circuit model

    International Nuclear Information System (INIS)

    Friedman, Alex

    2005-01-01

    The Pulse-Line Ion Accelerator (PLIA) is a promising approach to high-gradient acceleration of an ion beam at high line charge density. A recent note by R. J. Briggs suggests that a 'sheath helix' model of such a system can be solved numerically in the quasi-static limit. Such a model captures the correct macroscopic behavior from first principles without the need to time-advance the full Maxwell equations on a grid. This note describes numerical methods that may be used to effect such a solution, and their connection to the circuit model that was described in an earlier note by the author. Fine detail of the fields in the vicinity of the helix wires is not obtained by this approach, but for purposes of beam dynamics simulation such detail is not generally needed
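
    The circuit-model connection can be made concrete with a discrete LC ladder, the standard lumped-element picture of a pulse line, advanced with a telegrapher's-equation leapfrog. Element values and the drive pulse below are placeholders, not PLIA parameters:

        import numpy as np

        # Lumped LC ladder advanced by leapfrog; element values are placeholders.
        n, Lc, Cc, dt, steps = 200, 1e-6, 1e-9, 1e-8, 500
        V = np.zeros(n)                  # node voltages
        I = np.zeros(n - 1)              # inductor currents between nodes
        for k in range(steps):
            V[0] = np.exp(-(((k * dt) - 2e-6) / 5e-7) ** 2)   # drive pulse
            I += dt / Lc * (V[:-1] - V[1:])          # L dI/dt = V_j - V_{j+1}
            V[1:-1] += dt / Cc * (I[:-1] - I[1:])    # C dV/dt = I_{j-1} - I_j
            V[-1] += dt / Cc * I[-1]                 # open-circuit far end
        print(V.argmax())   # pulse peak has moved roughly 95 cells down the line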

  4. Evolving the US Climate Resilience Toolkit to Support a Climate-Smart Nation

    Science.gov (United States)

    Tilmes, C.; Niepold, F., III; Fox, J. F.; Herring, D.; Dahlman, L. E.; Hall, N.; Gardiner, N.

    2015-12-01

    Communities, businesses, resource managers, and decision-makers at all levels of government need information to understand and ameliorate climate-related risks. Likewise, climate information can expose latent opportunities. Moving from climate science to social and economic decisions raises complex questions about how to communicate the causes and impacts of climate variability and change; how to characterize and quantify vulnerabilities, risks, and opportunities faced by communities and businesses; and how to make and implement "win-win" adaptation plans at local, regional, and national scales. A broad coalition of federal agencies launched the U.S. Climate Resilience Toolkit (toolkit.climate.gov) in November 2014 to help our nation build resilience to climate-related extreme events. The site's primary audience is planners and decision makers in business, resource management, and government (at all levels) who seek science-based climate information and tools to help them in their near- and long-term planning. The Executive Office of the President assembled a task force of dozens of subject experts from across the 13 agencies of the U.S. Global Change Research Program to guide the site's development. The site's ongoing evolution is driven by feedback from the target audience. For example, based on feedback, climate projections will soon play a more prominent role in the site's "Climate Explorer" tool and case studies. The site's five-step adaptation planning process is being improved to better facilitate people getting started and to provide clear benchmarks for evaluating progress along the way. In this session, we will share lessons learned from a series of user engagements around the nation and evidence that the Toolkit couples climate information with actionable decision-making processes in ways that are helping Americans build resilience to climate-related stressors.

  5. Decision support toolkit for integrated analysis and design of reclaimed water infrastructure.

    Science.gov (United States)

    Lee, Eun Jung; Criddle, Craig S; Geza, Mengistu; Cath, Tzahi Y; Freyberg, David L

    2018-05-01

    Planning of water reuse systems is a complex endeavor. We have developed a software toolkit, IRIPT (Integrated Urban Reclaimed Water Infrastructure Planning Toolkit) that facilitates planning and design of reclaimed water infrastructure for both centralized and hybrid configurations that incorporate satellite treatment plants (STPs). The toolkit includes a Pipeline Designer (PRODOT) that optimizes routing and sizing of pipelines for wastewater capture and reclaimed water distribution, a Selector (SelWTP) that assembles and optimizes wastewater treatment trains, and a Calculator (CalcBenefit) that estimates fees, revenues, and subsidies of alternative designs. For hybrid configurations, a Locator (LocSTP) optimizes siting of STPs and associated wastewater diversions by identifying manhole locations where the flowrates are sufficient to ensure that wastewater extracted and treated at an adjacent STP can generate the revenue needed to pay for treatment and delivery to customers. Practical local constraints are also applied to screen and identify STP locations. Once suitable sites are selected, System Integrator (ToolIntegrator) identifies a set of centralized and hybrid configurations that: (1) maximize reclaimed water supply, (2) maximize reclaimed water supply while also ensuring a financial benefit for the system, and (3) maximize the net financial benefit for the system. The resulting configurations are then evaluated by an Analyst (SANNA) that uses monetary and non-monetary criteria, with weights assigned to appropriate metrics by a decision-maker, to identify a preferred configuration. To illustrate the structure, assumptions, and use of IRIPT, we apply it to a case study for the city of Golden, CO. The criteria weightings provided by a local decision-maker lead to a preference for a centralized configuration in this case. The Golden case study demonstrates that IRIPT can efficiently analyze centralized and hybrid water reuse configurations and rank them

  6. The Sense-It App: A Smartphone Sensor Toolkit for Citizen Inquiry Learning

    Science.gov (United States)

    Sharples, Mike; Aristeidou, Maria; Villasclaras-Fernández, Eloy; Herodotou, Christothea; Scanlon, Eileen

    2017-01-01

    The authors describe the design and formative evaluation of a sensor toolkit for Android smartphones and tablets that supports inquiry-based science learning. The Sense-it app enables a user to access all the motion, environmental and position sensors available on a device, linking these to a website for shared crowd-sourced investigations. The…

  7. BIT: Biosignal Igniter Toolkit.

    Science.gov (United States)

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

    The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains to which they were traditionally associated with. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics, electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily bounded by the cost, and limited access to adequate support materials. In this paper we present an accessible, albeit versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprehends a set of programming APIs, a biosignal processing toolbox, and a framework for real time data acquisition and postprocessing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  8. A software toolkit for implementing low-cost virtual reality training systems

    International Nuclear Information System (INIS)

    Louka, Michael N.

    1999-04-01

    VR is a powerful technology for implementing training systems, but better tools are needed to achieve wider usage and acceptance for desktop computer-based training applications. A need has been identified for a software toolkit to support the efficient implementation of well-structured desktop VR training systems. A powerful toolkit for implementing scalable low-cost VR training applications is described in this report (author) (ml)

  9. Model- and calibration-independent test of cosmic acceleration

    International Nuclear Information System (INIS)

    Seikel, Marina; Schwarz, Dominik J.

    2009-01-01

    We present a calibration-independent test of the accelerated expansion of the universe using supernova type Ia data. The test is also model-independent in the sense that no assumptions about the content of the universe or about the parameterization of the deceleration parameter are made, and that it does not assume any dynamical equations of motion. Yet, the test assumes the universe and the distribution of supernovae to be statistically homogeneous and isotropic. A significant reduction of systematic effects, as compared to our previous, calibration-dependent test, is achieved. Accelerated expansion is detected at a significant level (4.3σ in the 2007 Gold sample, 7.2σ in the 2008 Union sample) if the universe is spatially flat. This result depends, however, crucially on supernovae with a redshift smaller than 0.1, for which the assumption of statistical isotropy and homogeneity is less well established
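
    The essence of such a kinematic test is a comparison against the coasting (empty, Milne) universe, whose luminosity distance has the closed form d_L = (c/H0) z (1 + z/2); supernovae at moderate redshift that appear fainter than this benchmark (positive residual) indicate acceleration. A sketch of the benchmark only; the distance moduli below are illustrative numbers, not the Gold or Union data:

        import numpy as np

        C_KM_S, H0 = 299792.458, 70.0       # km/s and km/s/Mpc (H0 assumed)

        def mu_coasting(z):
            """Distance modulus of an empty (Milne) universe."""
            dl = (C_KM_S / H0) * z * (1.0 + z / 2.0)   # luminosity distance, Mpc
            return 5.0 * np.log10(dl) + 25.0

        z = np.array([0.2, 0.5, 1.0])
        mu_obs = np.array([40.05, 42.35, 44.15])   # illustrative moduli only
        print(mu_obs - mu_coasting(z))   # positive residuals => acceleration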

  10. Target normal sheath acceleration analytical modeling, comparative study and developments

    International Nuclear Information System (INIS)

    Perego, C.; Batani, D.; Zani, A.; Passoni, M.

    2012-01-01

    Ultra-intense laser interaction with solid targets appears to be an extremely promising technique to accelerate ions up to several MeV, producing beams that exhibit interesting properties for many foreseen applications. Nowadays, most of the published experimental results can be theoretically explained in the framework of the target normal sheath acceleration (TNSA) mechanism proposed by Wilks et al. [Phys. Plasmas 8(2), 542 (2001)]. As an alternative to numerical simulation, various analytical or semi-analytical TNSA models have been published in recent years, each of them trying to provide predictions for some of the ion beam features, given the initial laser and target parameters. However, the problem of developing a reliable model for the TNSA process is still open, which is why the purpose of this work is to survey the present situation of TNSA modeling and experimental results by means of a quantitative comparison between measurements and theoretical predictions of the maximum ion energy. Moreover, in the light of such an analysis, some indications for the future development of the model proposed by Passoni and Lontano [Phys. Plasmas 13(4), 042102 (2006)] are then presented.

  11. Modelling accelerated degradation data using Wiener diffusion with a time scale transformation.

    Science.gov (United States)

    Whitmore, G A; Schenkelberg, F

    1997-01-01

    Engineering degradation tests allow industry to assess the potential life span of long-life products that do not fail readily under accelerated conditions in life tests. A general statistical model is presented here for performance degradation of an item of equipment. The degradation process in the model is taken to be a Wiener diffusion process with a time scale transformation. The model incorporates Arrhenius extrapolation for high stress testing. The lifetime of an item is defined as the time until performance deteriorates to a specified failure threshold. The model can be used to predict the lifetime of an item or the extent of degradation of an item at a specified future time. Inference methods for the model parameters, based on accelerated degradation test data, are presented. The model and inference methods are illustrated with a case application involving self-regulating heating cables. The paper also discusses a number of practical issues encountered in applications.
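
    The model class is straightforward to simulate: degradation follows a drifted Wiener process evaluated at the transformed time Lambda(t) = t^beta, and the lifetime is the first passage of a failure threshold. A simulation sketch with assumed parameters (the paper's inference methods are not reproduced):

        import numpy as np

        rng = np.random.default_rng(11)
        beta, drift, sigma, thresh = 1.4, 0.5, 0.3, 10.0   # assumed parameters
        t = np.linspace(0.0, 40.0, 2001)
        dlam = np.diff(t ** beta)              # increments of Lambda(t) = t^beta

        n_items = 2000
        incr = (drift * dlam
                + sigma * np.sqrt(dlam) * rng.normal(size=(n_items, dlam.size)))
        paths = np.cumsum(incr, axis=1)        # W(Lambda(t)) with drift
        crossed = paths.max(axis=1) >= thresh
        lifetimes = t[1:][(paths >= thresh).argmax(axis=1)][crossed]
        print(crossed.mean(), np.median(lifetimes))   # failed fraction, median life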

  12. Jet browser model accelerated by GPUs

    Directory of Open Access Journals (Sweden)

    Forster Richárd

    2016-12-01

    In recent decades, experimental particle physics has developed rapidly, thanks in part to the growing capacity of computers, which has made it possible to probe the structure of matter down to the level of the quark-gluon plasma of the strong interaction. Experimental evidence has supported the theoretical predictions. Since the field's inception, researchers have been interested in track reconstruction. We studied the jet browser model, which was developed for a 4π calorimeter. This method works on the measured data set, which contains the coordinates of the interaction points in the detector space, and it allows the trajectory reconstruction of the final-state particles to be examined. We keep the total energy constant, satisfying the Gauss law. Using GPUs, the evaluation of the model can be drastically accelerated: we were able to achieve up to a 223-fold speedup compared to a CPU-based parallel implementation.

  13. Does the Good Schools Toolkit Reduce Physical, Sexual and Emotional Violence, and Injuries, in Girls and Boys equally? A Cluster-Randomised Controlled Trial.

    Science.gov (United States)

    Devries, Karen M; Knight, Louise; Allen, Elizabeth; Parkes, Jenny; Kyegombe, Nambusi; Naker, Dipak

    2017-10-01

    We aimed to investigate whether the Good School Toolkit reduced emotional violence, severe physical violence, sexual violence and injuries from school staff to students, as well as emotional, physical and sexual violence between peers, in Ugandan primary schools. We performed a two-arm cluster randomised controlled trial with parallel assignment. Forty-two schools in one district were allocated to intervention (n = 21) or wait-list control (n = 21) arms in 2012. We did cross-sectional baseline and endline surveys in 2012 and 2014, and the Good School Toolkit intervention was implemented for 18 months between surveys. Analyses were by intention to treat and are adjusted for clustering within schools and for baseline school-level proportions of outcomes. The Toolkit was associated with an overall reduction in any form of violence from staff and/or peers in the past week towards both male (aOR = 0.34, 95%CI 0.22-0.53) and female students (aOR = 0.55, 95%CI 0.36-0.84). Injuries as a result of violence from school staff were also lower in male (aOR = 0.36, 95%CI 0.20-0.65) and female students (aOR = 0.51, 95%CI 0.29-0.90). Although the Toolkit seems to be effective at reducing violence in both sexes, there is some suggestion that the Toolkit may have stronger effects in boys than girls. The Toolkit is a promising intervention to reduce a wide range of different forms of violence from school staff and between peers in schools, and should be urgently considered for scale-up. Further research is needed to investigate how the intervention could engage more successfully with girls.

  14. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    OpenAIRE

    Helmus, Jonathan J; Collis, Scott M

    2016-01-01

    The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cy...
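
    Typical Py-ART usage is a short read-inspect-plot loop; a minimal sketch follows, where the input file name is a placeholder for any radar format Py-ART can read:

        # Minimal Py-ART sketch; 'radar_volume.nc' is a placeholder input file.
        import matplotlib.pyplot as plt
        import pyart

        radar = pyart.io.read("radar_volume.nc")   # format is auto-detected
        display = pyart.graph.RadarDisplay(radar)
        display.plot("reflectivity", sweep=0)      # PPI of the lowest sweep
        plt.show()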

  15. WING/WORLD: An Open Experimental Toolkit for the Design and Deployment of IEEE 802.11-Based Wireless Mesh Networks Testbeds

    Directory of Open Access Journals (Sweden)

    Daniele Miorandi

    2010-01-01

    Wireless Mesh Networks represent an interesting instance of light-infrastructure wireless networks. Due to their flexibility and resiliency to network failures, wireless mesh networks are particularly suitable for incremental and rapid deployments of wireless access networks in both metropolitan and rural areas. This paper illustrates the design and development of an open toolkit aimed at supporting the design of different solutions for wireless mesh networking by enabling real evaluation, validation, and demonstration. The resulting testbed is based on off-the-shelf hardware components and open-source software and is focused on IEEE 802.11 commodity devices. The software toolkit is based on an “open” philosophy and aims at providing the scientific community with a tool for effective and reproducible performance analysis of WMNs. The paper describes the architecture of the toolkit, and its core functionalities, as well as its potential evolutions.

  16. Local re-acceleration and a modified thick target model of solar flare electrons

    Science.gov (United States)

    Brown, J. C.; Turkmani, R.; Kontar, E. P.; MacKinnon, A. L.; Vlahos, L.

    2009-12-01

    Context: The collisional thick target model (CTTM) of solar hard X-ray (HXR) bursts has become an almost “standard model” of flare impulsive phase energy transport and radiation. However, it faces various problems in the light of recent data, particularly the high electron beam density and anisotropy it involves. Aims: We consider how photon yield per electron can be increased, and hence fast electron beam intensity requirements reduced, by local re-acceleration of fast electrons throughout the HXR source itself, after injection. Methods: We show parametrically that, if net re-acceleration rates due to e.g. waves or local current sheet electric (E) fields are a significant fraction of collisional loss rates, electron lifetimes, and hence the net radiative HXR output per electron, can be substantially increased over the CTTM values. In this local re-acceleration thick target model (LRTTM) fast electron number requirements and anisotropy are thus reduced. One specific possible scenario involving such re-acceleration is discussed, viz., a current sheet cascade (CSC) in a randomly stressed magnetic loop. Results: Combined MHD and test particle simulations show that local E fields in CSCs can efficiently accelerate electrons in the corona and re-accelerate them after injection into the chromosphere. In this HXR source scenario, rapid synchronisation and variability of impulsive footpoint emissions can still occur since primary electron acceleration is in the high Alfvén speed corona with fast re-acceleration in chromospheric CSCs. It is also consistent with the energy-dependent time-of-flight delays in HXR features. Conclusions: Including electron re-acceleration in the HXR source allows an LRTTM modification of the CTTM in which beam density and anisotropy are much reduced, and alleviates theoretical problems with the CTTM, while making it more compatible with radio and interplanetary electron numbers. The LRTTM is, however, different in some respects such as

  17. TMVA (Toolkit for Multivariate Analysis): new architectures design and implementation.

    CERN Document Server

    Zapata Mesa, Omar Andres

    2016-01-01

    The Toolkit for Multivariate Analysis (TMVA) is a package in ROOT providing machine learning algorithms for classification and regression of events in detectors. In TMVA, we are developing new high-level algorithms for multivariate analysis such as cross-validation, hyperparameter optimization and variable importance. Almost all of these algorithms are computationally expensive and designed to process large amounts of data, so it is very important to exploit parallel computing technologies to reduce processing times.
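    As an illustrative PyROOT sketch only (the input file events.root, the tree names sig/bkg and the variable names are hypothetical; options are pared down), a minimal TMVA classification setup follows the Factory/DataLoader pattern:

```python
import ROOT

# Hypothetical input file holding signal and background TTrees.
data = ROOT.TFile.Open("events.root")
sig_tree = data.Get("sig")
bkg_tree = data.Get("bkg")

out = ROOT.TFile("tmva_out.root", "RECREATE")
factory = ROOT.TMVA.Factory("TMVAClassification", out,
                            "!V:AnalysisType=Classification")
loader = ROOT.TMVA.DataLoader("dataset")
for var in ("var1", "var2"):              # hypothetical input variables
    loader.AddVariable(var, "F")
loader.AddSignalTree(sig_tree, 1.0)
loader.AddBackgroundTree(bkg_tree, 1.0)
loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random:!V")

# Book a boosted decision tree, then run the train/test/evaluate cycle.
factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT", "NTrees=200")
factory.TrainAllMethods()
factory.TestAllMethods()
factory.EvaluateAllMethods()
out.Close()
```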

  18. Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms

    Science.gov (United States)

    1998-01-01

    ...devices are exoskeletal in nature. They could be flexible, such as a glove or a suit worn by the user, or they could be rigid, such as jointed linkages... The effectiveness of using force control needs to be investigated. The MAGIC Toolkit can be used to develop sensory tasks for rehabilitative medicine...

  19. Acceleration Modes and Transitions in Pulsed Plasma Accelerators

    Science.gov (United States)

    Polzin, Kurt A.; Greve, Christine M.

    2018-01-01

    accelerators was developed by Cheng et al. The Coaxial High ENerGy (CHENG) thruster operated on the 10-microsecond timescales of pulsed plasma thrusters, but claimed high thrust density, high efficiency and low electrode erosion rates, which are more consistent with the deflagration mode of acceleration. Separate work on gas-fed pulsed plasma thrusters (PPTs) by Ziemer et al. identified two separate regimes of performance. The regime at higher mass bits (termed Mode I in that work) possessed relatively constant thrust efficiency (ratio of jet kinetic energy to input electrical energy) as a function of mass bit. In the second regime, at very low mass bits (termed Mode II), the efficiency increased with decreasing mass bit. Work by Poehlmann et al. and by Sitaraman and Raja sought to understand the performance of the CHENG thruster and the Mode I/Mode II performance in PPTs by modeling the acceleration using the Hugoniot Relation, with the detonation and deflagration modes representing two distinct sets of solutions to the relevant conservation laws. These works studied the proposal that, depending upon the values of the various controllable parameters, the accelerator would operate in either the detonation or deflagration mode. In the present work, we propose a variation on the explanation for the differences in performance between the various pulsed plasma accelerators. Instead of treating the accelerator as if it were only operating in one mode or the other during a pulse, we model the initial stage of the discharge in all cases as an accelerating current sheet (detonation mode). If the current sheet reaches the exit of the accelerator before the discharge is completed, the acceleration mode transitions to the deflagration mode found in quasi-steady MPD thrusters. This modeling method is used to demonstrate that standard gas-fed pulsed plasma accelerators, the CHENG thruster, and the quasi-steady MPD accelerator are variations of the same device, with the overall

  20. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    Science.gov (United States)

    Gamble, Ed

    2012-01-01

    As part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA), the JPL Laboratory for Reliable Software analyzed the throttle control software, applying several techniques including static analysis and logic model checking. A handful of logic models were built and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  1. Supporting safe driving with arthritis: developing a driving toolkit for clinical practice and consumer use.

    Science.gov (United States)

    Vrkljan, Brenda H; Cranney, Ann; Worswick, Julia; O'Donnell, Siobhan; Li, Linda C; Gélinas, Isabelle; Byszewski, Anna; Man-Son-Hing, Malcolm; Marshall, Shawn

    2010-01-01

    We conducted a series of focus groups to explore the information needs of clinicians and consumers related to arthritis and driving. An open coding analysis identified common themes across both consumer and clinician-based focus groups that underscored the importance of addressing driving-related concerns and the challenges associated with assessing safety. The results revealed that although driving is critical for maintaining independence and community mobility, drivers with arthritis experience several problems that can affect safe operation of a motor vehicle. Findings from this study are part of a broader research initiative that will inform the development of the Arthritis and Driving toolkit. This toolkit outlines strategies to support safe mobility for people with arthritis and will be an important resource in the coming years given the aging population.

  2. A new model for diffuse brain injury by rotational acceleration: I model, gross appearance, and astrocytosis.

    Science.gov (United States)

    Gutierrez, E; Huang, Y; Haglid, K; Bao, F; Hansson, H A; Hamberger, A; Viano, D

    2001-03-01

    Rapid head rotation is a major cause of brain damage in automobile crashes and falls. This report details a new model for rotational acceleration about the center of mass of the rabbit head. This allows the study of brain injury without translational acceleration of the head. Impact from a pneumatic cylinder was transferred to the skull surface to cause a half-sine peak acceleration of 2.1 × 10^5 rad/s^2 and 0.96-ms pulse duration. Extensive subarachnoid hemorrhages and small focal bleedings were observed in the brain tissue. A pronounced reactive astrogliosis was found 8-14 days after trauma, both as networks around the focal hemorrhages and more diffusely in several brain regions. Astrocytosis was prominent in the gray matter of the cerebral cortex, layers II-V, and in the granule cell layer and around the axons of the pyramidal neurons in the hippocampus. The nuclei of cranial nerves, such as the hypoglossal and facial nerves, also showed intense astrocytosis. The new model allows study of brain injuries from head rotation in the absence of translational influences.

  3. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    International Nuclear Information System (INIS)

    Ammazzalorso, F; Jelen, U; Bednarz, T

    2014-01-01

    We demonstrate acceleration on graphic processing units (GPU) of automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement calculation on GPU of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation. Numerical results were in agreement between the GPU implementation and native CPU implementation. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust, when recomputed in presence of various simulated positioning errors. From the point of view of performance, average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in a clinical treatment planning interactive session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, like interactive analysis manipulation (e.g. constraining of some setup) and re-execution. Finally, through OpenCL portable parallelism, the new implementation is suitable also for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.

  4. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    Science.gov (United States)

    Ammazzalorso, F.; Bednarz, T.; Jelen, U.

    2014-03-01

    We demonstrate acceleration on graphic processing units (GPU) of automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement calculation on GPU of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation. Numerical results were in agreement between the GPU implementation and native CPU implementation. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust, when recomputed in presence of various simulated positioning errors. From the point of view of performance, average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in a clinical treatment planning interactive session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, like interactive analysis manipulation (e.g. constraining of some setup) and re-execution. Finally, through OpenCL portable parallelism, the new implementation is suitable also for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.
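    The portability argument can be illustrated with a toy PyOpenCL kernel (our sketch, unrelated to the RobuR source, which is not shown here); the same program runs unchanged on a GPU device or, with a CPU platform installed, on multicore hosts:

```python
import numpy as np
import pyopencl as cl

# Toy element-wise kernel; stands in for a per-voxel robustness metric.
src = """
__kernel void scale_add(__global const float *a,
                        __global const float *b,
                        __global float *out) {
    int i = get_global_id(0);
    out[i] = 2.0f * a[i] + b[i];
}
"""

a = np.random.rand(1 << 20).astype(np.float32)
b = np.random.rand(1 << 20).astype(np.float32)

ctx = cl.create_some_context()        # picks a GPU or CPU platform
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, src).build()
prg.scale_add(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, 2 * a + b)
```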

  5. Theoretical temperature model with experimental validation for CLIC Accelerating Structures

    CERN Document Server

    AUTHOR|(CDS)2126138; Vamvakas, Alex; Alme, Johan

    Micron level stability of the Compact Linear Collider (CLIC) components is one of the main requirements to meet the luminosity goal for the future 48 km long underground linear accelerator. The radio frequency (RF) power used for beam acceleration causes heat generation within the aligned structures, resulting in mechanical movements and structural deformations. A dedicated control of the air- and water-cooling system in the tunnel is therefore crucial to improve alignment accuracy. This thesis investigates the thermo-mechanical behavior of the CLIC Accelerating Structure (AS). In CLIC, the AS must be aligned to a precision of 10 µm. The thesis shows that a relatively simple theoretical model can be used within reasonable accuracy to predict the temperature response of an AS as a function of the applied RF power. During failure scenarios or maintenance interventions, the RF power is turned off, resulting in no heat dissipation and a decrease in the overall temperature of the components. The theoretica...
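    A plausible form for such a simple model is a lumped-capacitance energy balance (a hedged reconstruction for illustration; the thesis's actual model may differ in detail):

```latex
% Lumped thermal model of an accelerating structure:
% thermal capacitance C, dissipated RF power P_RF, thermal
% resistance R_th to the cooling water at temperature T_c.
\begin{equation}
  C\,\frac{dT}{dt} = P_\mathrm{RF} - \frac{T - T_\mathrm{c}}{R_\mathrm{th}},
  \qquad
  T(t) = T_\mathrm{c}
       + R_\mathrm{th} P_\mathrm{RF}\left(1 - e^{-t/(R_\mathrm{th}C)}\right),
\end{equation}
% so the steady-state temperature rise scales linearly with the applied
% RF power, with response time constant tau = R_th * C.
```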

  6. Verification of mechanistic-empirical design models for flexible pavements through accelerated pavement testing : technical summary.

    Science.gov (United States)

    2014-08-01

    The Midwest States Accelerated Pavement Testing Pooled-Fund Program, financed by the highway departments of Kansas, Iowa, and Missouri, has supported an accelerated pavement testing (APT) project to validate several models incorporated in the NCHRP...

  7. Between structures and norms : Assessing tax increment financing for the Dutch spatial planning toolkit

    NARCIS (Netherlands)

    Root, Liz; Van Der Krabben, Erwin; Spit, Tejo

    2015-01-01

    The aim of the paper is to assess the institutional (mis)fit of tax increment financing for the Dutch spatial planning financial toolkit. By applying an institutionally oriented assessment framework, we analyse the interconnectivity of Dutch municipal finance and spatial planning structures and

  8. JACoW Decoupling CERN accelerators

    CERN Document Server

    Dworak, Andrzej

    2018-01-01

    The accelerator complex at CERN is a living system. Accelerators are being dismantled, upgraded or change their purpose; new accelerators are built. The changes do not happen overnight, but when they do, they may require profound changes across the handling systems. Central timings (CT), responsible for the sequencing and synchronization of accelerators, are good examples of such systems. This paper shows how the changes and new requirements over the past twenty years influenced the evolution of the CTs. It describes experience gained from using the Central Beam and Cycle Manager (CBCM) CT model for strongly coupled accelerators, and how it led to the design of a new Dynamic Beam Negotiation (DBN) model for the AD and ELENA accelerators, which reduces the coupling, increasing accelerator independence. The paper ends with an idea of how to merge the strong points of both models in order to create a single generic system able to efficiently handle all CERN accelerators and provide more beam time to experiments and the LHC.

  9. Accelerating transient simulation of linear reduced order models.

    Energy Technology Data Exchange (ETDEWEB)

    Thornquist, Heidi K.; Mei, Ting; Keiter, Eric Richard; Bond, Brad

    2011-10-01

    Model order reduction (MOR) techniques have been used to facilitate the analysis of dynamical systems for many years. Although existing model reduction techniques are capable of providing huge speedups in the frequency domain analysis (i.e. AC response) of linear systems, such speedups are often not obtained when performing transient analysis on the systems, particularly when coupled with other circuit components. Reduced system size, which is the ostensible goal of MOR methods, is often insufficient to improve transient simulation speed on realistic circuit problems. It can be shown that making the correct reduced order model (ROM) implementation choices is crucial to the practical application of MOR methods. In this report we investigate methods for accelerating the simulation of circuits containing ROM blocks using the circuit simulator Xyce.
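    For readers unfamiliar with projection-based MOR, the following sketch (illustrative only; Xyce is a C++ simulator and uses far more sophisticated machinery) shows the basic reduce-then-simulate pattern for a linear system dx/dt = Ax + Bu, y = Cx:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 400, 12                       # full and reduced state dimensions

# A stable random test system standing in for a linearized circuit.
A = -np.eye(n) + 0.05 * rng.standard_normal((n, n)) / np.sqrt(n)
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

# Orthonormal projection basis V. A real MOR code builds V by moment
# matching (Krylov subspaces) or POD; a random basis is for demo only.
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
Ar, Br, Cr = V.T @ A @ V, V.T @ B, C @ V

# Transient simulation of the reduced model with backward Euler:
# (I - h*Ar) x_{k+1} = x_k + h*Br*u_{k+1}
h, steps, u = 1e-2, 500, 1.0         # step size, step count, unit step input
x = np.zeros(r)
lhs = np.eye(r) - h * Ar
outputs = []
for _ in range(steps):
    x = np.linalg.solve(lhs, x + h * u * Br.ravel())
    outputs.append(float(Cr @ x))
```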

  10. Modeling of inverse Cherenkov laser acceleration with axicon laser-beam focusing

    International Nuclear Information System (INIS)

    Romea, R.D.; Kimura, W.D.

    1990-01-01

    Acceleration of free electrons by the inverse Cherenkov effect using radially polarized laser light focused through an axicon [J. P. Fontana and R. H. Pantell, J. Appl. Phys. 54, 4285 (1983)] has been studied utilizing a Monte Carlo computer simulation and further theoretical analysis. The model includes effects, such as scattering of the electrons by the gas, and diffraction and interference effects of the axicon laser beam, that were not included in the original analysis of Fontana and Pantell. Its accuracy is validated using available experimental data. The model results show that effective acceleration is possible even with the effects of scattering. Sample results are given. The analysis includes examining the issues of axicon focusing, phase errors, energy gain, phase slippage, focusing of the e beam, and emittance growth
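    For context, the phase matching at the heart of the inverse Cherenkov scheme can be stated compactly (a standard textbook relation, added here for orientation):

```latex
% Inverse Cherenkov phase matching: the laser crosses the electron beam
% at the Cherenkov angle theta_c in a gas of refractive index n, so the
% axial phase velocity of the light matches the electron speed:
\begin{equation}
  \cos\theta_c = \frac{1}{n\beta}, \qquad \beta = \frac{v_e}{c},
\end{equation}
% which is why the axicon is used: it focuses the radially polarized
% beam onto the axis at precisely this crossing angle.
```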

  11. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    Directory of Open Access Journals (Sweden)

    Lerendegui-Marco J.

    2017-01-01

    Full Text Available Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  12. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    Science.gov (United States)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  13. The Student Writing Toolkit: Enhancing Undergraduate Teaching of Scientific Writing in the Biological Sciences

    Science.gov (United States)

    Dirrigl, Frank J., Jr.; Noe, Mark

    2014-01-01

    Teaching scientific writing in biology classes is challenging for both students and instructors. This article offers and reviews several useful "toolkit" items that improve student writing. These include sentence and paper-length templates, funnelling and compartmentalisation, and preparing compendiums of corrections. In addition,…

  14. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into clinical research workflows, causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  15. Clinical Trial of a Home Safety Toolkit for Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Kathy J. Horvath

    2013-01-01

    Full Text Available This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n=60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles, and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n=48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control. Home safety was significant at P≤0.001, caregiver strain at P≤0.001, and caregiver self-efficacy at P=0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P≤0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or a related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care.

  16. Economic evaluation of the Good School Toolkit: an intervention for reducing violence in primary schools in Uganda.

    Science.gov (United States)

    Greco, Giulia; Knight, Louise; Ssekadde, Willington; Namy, Sophie; Naker, Dipak; Devries, Karen

    2018-01-01

    This paper presents the cost and cost-effectiveness of the Good School Toolkit (GST), a programme aimed at reducing physical violence perpetrated by school staff against students in Uganda. The effectiveness of the Toolkit was tested with a cluster randomised controlled trial in 42 primary schools in Luwero District, Uganda. A full economic costing evaluation and cost-effectiveness analysis were conducted alongside the trial. Both financial and economic costs were collected retrospectively from the provider's perspective to estimate total and unit costs. The total cost of setting up and running the Toolkit over the 18-month trial period is estimated at US$397 233, excluding process monitoring and evaluation (M&E) activities. The cost to run the intervention is US$7429 per school annually, or US$15 per primary school pupil annually, in the trial intervention schools. It is estimated that the intervention averted 1620 cases of past-week physical violence during the 18-month implementation period. The total cost per case of violence averted is US$244, and the annual implementation cost is US$96 per case averted during the trial. The GST is a cost-effective intervention for reducing violence against pupils in primary schools in Uganda. It compares favourably against other violence reduction interventions in the region.

  17. Development of an Online Toolkit for Measuring Commercial Building Energy Efficiency Performance -- Scoping Study

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Na

    2013-03-13

    This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.

  18. An artificial neural network model of energy expenditure using nonintegrated acceleration signals.

    Science.gov (United States)

    Rothney, Megan P; Neumann, Megan; Béziat, Ashley; Chen, Kong Y

    2007-10-01

    Accelerometers are a promising tool for characterizing physical activity patterns in free living. The major limitation in their widespread use to date has been a lack of precision in estimating energy expenditure (EE), which may be attributed to the oversimplified time-integrated acceleration signals and subsequent use of linear regression models for EE estimation. In this study, we collected biaxial raw (32 Hz) acceleration signals at the hip to develop a relationship between acceleration and minute-to-minute EE in 102 healthy adults using EE data collected for nearly 24 h in a room calorimeter as the reference standard. From each 1 min of acceleration data, we extracted 10 signal characteristics (features) that we felt had the potential to characterize EE intensity. Using these data, we developed a feed-forward/back-propagation artificial neural network (ANN) model with one hidden layer (12 x 20 x 1 nodes). Results of the ANN were compared with estimations using the ActiGraph monitor, a uniaxial accelerometer, and the IDEEA monitor, an array of five accelerometers. After training and validation (leave-one-subject out) were completed, the ANN showed significantly reduced mean absolute errors (0.29 +/- 0.10 kcal/min), mean squared errors (0.23 +/- 0.14 kcal(2)/min(2)), and difference in total EE (21 +/- 115 kcal/day), compared with both the IDEEA (P types under free-living conditions.
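    As a rough modern analogue of the paper's 12 x 20 x 1 network (illustrative only: the authors used a custom back-propagation implementation and real accelerometer features, whereas the data below are synthetic):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_minutes = 5000
X = rng.random((n_minutes, 10))            # 10 per-minute signal features
y = 1.0 + 3.0 * X[:, 0] + rng.normal(0, 0.3, n_minutes)  # EE, kcal/min

# One hidden layer of 20 units, mirroring the paper's 12x20x1 topology.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
)
model.fit(X[:4000], y[:4000])
err = np.abs(model.predict(X[4000:]) - y[4000:]).mean()
print(f"mean absolute error: {err:.3f} kcal/min")
```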

  19. Nonlinear Monte Carlo model of superdiffusive shock acceleration with magnetic field amplification

    Science.gov (United States)

    Bykov, Andrei M.; Ellison, Donald C.; Osipov, Sergei M.

    2017-03-01

    Fast collisionless shocks in cosmic plasmas convert their kinetic energy flow into the hot downstream thermal plasma with a substantial fraction of energy going into a broad spectrum of superthermal charged particles and magnetic fluctuations. The superthermal particles can penetrate into the shock upstream region producing an extended shock precursor. The cold upstream plasma flow is decelerated by the force provided by the superthermal particle pressure gradient. In high Mach number collisionless shocks, efficient particle acceleration is likely coupled with turbulent magnetic field amplification (MFA) generated by the anisotropic distribution of accelerated particles. This anisotropy is determined by fast particle transport, making the problem strongly nonlinear and multiscale. Here, we present a nonlinear Monte Carlo model of collisionless shock structure with superdiffusive propagation of high-energy Fermi accelerated particles coupled to particle acceleration and MFA, which affords a consistent description of strong shocks. A distinctive feature of the Monte Carlo technique is that it includes the full angular anisotropy of the particle distribution at all precursor positions. The model reveals that the superdiffusive transport of energetic particles (i.e., Lévy-walk propagation) generates a strong quadruple anisotropy in the precursor particle distribution. The resultant pressure anisotropy of the high-energy particles produces a nonresonant mirror-type instability that amplifies compressible wave modes with wavelengths longer than the gyroradii of the highest-energy protons produced by the shock.
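    Superdiffusive (Lévy-walk-like) transport can be contrasted with ordinary diffusion in a few lines (a schematic one-dimensional toy, not the paper's nonlinear Monte Carlo model):

```python
import numpy as np

rng = np.random.default_rng(1)

def walk(n_steps, n_walkers, alpha=None):
    """Random walk; alpha=None gives unit steps (diffusive), otherwise
    heavy-tailed Pareto step lengths ~ l^-(1+alpha) (superdiffusive)."""
    if alpha is None:
        steps = np.ones((n_walkers, n_steps))
    else:
        steps = rng.pareto(alpha, (n_walkers, n_steps)) + 1.0
    signs = rng.choice([-1.0, 1.0], (n_walkers, n_steps))
    return np.cumsum(signs * steps, axis=1)

t = np.arange(1, 2001)
for label, a in (("diffusive", None), ("Levy (alpha=1.5)", 1.5)):
    x = walk(2000, 4000, a)
    msd = (x ** 2).mean(axis=0)
    # Fit <x^2> ~ t^gamma: gamma = 1 is diffusion, gamma > 1 superdiffusion.
    gamma = np.polyfit(np.log(t[100:]), np.log(msd[100:]), 1)[0]
    print(f"{label}: gamma = {gamma:.2f}")
```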

  20. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Directory of Open Access Journals (Sweden)

    K Anderson

    Full Text Available This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  1. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Science.gov (United States)

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  2. Java advanced medical image toolkit

    International Nuclear Information System (INIS)

    Saunder, T.H.C.; O'Keefe, G.J.; Scott, A.M.

    2002-01-01

    Full text: The Java Advanced Medical Image Toolkit (jAMIT) has been developed at the Center for PET and Department of Nuclear Medicine in an effort to provide a suite of tools that can be utilised in applications required to perform analysis, processing and visualisation of medical images. jAMIT uses Java Advanced Imaging (JAI) to combine the platform independent nature of Java with the speed benefits associated with native code. The object-orientated nature of Java allows the production of an extensible and robust package which is easily maintained. In addition to jAMIT, a Medical Image I/O API called Sushi has been developed to provide access to many commonly used image formats. These include DICOM, Analyze, MINC/NetCDF, Trionix, Beat 6.4, Interfile 3.2/3.3 and Odyssey. This allows jAMIT to access data and study information contained in different medical image formats transparently. Additional formats can be added at any time without any modification to the jAMIT package. Tools available in jAMIT include 2D ROI Analysis, Palette Thresholding, Image Cropping, Image Transposition, Scaling, Maximum Intensity Projection, Image Fusion, Image Annotation and Format Conversion. Future tools may include 2D Linear and Non-linear Registration, PET SUV Calculation, 3D Rendering and 3D ROI Analysis. Applications currently using jAMIT include Antibody Dosimetry Analysis, Mean Hemispheric Blood Flow Analysis, QuickViewing of PET Studies for Clinical Training, Pharmacodynamic Modelling based on Planar Imaging, and Medical Image Format Conversion. The use of jAMIT and Sushi for scripting and analysis in Matlab v6.1 and Jython is currently being explored. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  3. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    Science.gov (United States)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.
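    For orientation, collocation here refers to direct transcription of trajectory optimization problems; a common instance is the trapezoidal defect constraint (a standard textbook form, not necessarily the exact scheme CSALT implements):

```latex
% Trapezoidal collocation: the continuous dynamics xdot = f(x,u,t) are
% enforced on each mesh interval [t_k, t_{k+1}] via defect constraints
\begin{equation}
  \zeta_k = x_{k+1} - x_k
          - \frac{h_k}{2}\Bigl(f(x_k,u_k,t_k) + f(x_{k+1},u_{k+1},t_{k+1})\Bigr)
          = 0,
\end{equation}
% turning the optimal control problem into a large sparse nonlinear
% program over the discrete states x_k and controls u_k.
```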

  4. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Quantitative techniques have been successfully employed in the verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless sensor networks. The analysis aims to find the probability of a network key being compromised at a specific time point, which results in fluctuations over time, for a specific key update method called leave-based key update. For such a problem, the use of current tools is limited in many ways...
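    Transient analysis of this kind is typically carried out on a continuous-time Markov chain; a two-state toy version (hypothetical rates, far simpler than LBTool's model) can be computed with a matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

# Two-state CTMC: state 0 = key safe, state 1 = key compromised.
lam, mu = 0.05, 0.5   # hypothetical compromise / key-update recovery rates
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])
p0 = np.array([1.0, 0.0])          # start with the key safe

for t in (1, 10, 100):
    pt = p0 @ expm(Q * t)          # transient state distribution at time t
    print(f"t={t}: P(compromised) = {pt[1]:.4f}")
```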

  5. The Multiple-Patient Simulation Toolkit: Purpose, Process, and Pilot.

    Science.gov (United States)

    Beroz, Sabrina; Sullivan, Nancy; Kramasz, Vanessa; Morgan, Patricia

    Educating nursing students to safely care for multiple patients has become an important but challenging focus for nurse educators. New graduate nurses are expected to manage care for multiple patients in a complex and multifaceted health care system. With patient safety as a priority, multiple-patient assignments are necessary in order for nursing students to learn how to effectively prioritize and delegate care. The purpose of this project was the construction of an adaptable and flexible template for the development of multiple-patient simulations. Through use, the template evolved into a toolkit, with the addition of an operational guide, a sample populated template, and a bibliography.

  6. Flame acceleration of hydrogen - air - diluent mixtures at middle scale using ENACCEF: experiments and modelling

    International Nuclear Information System (INIS)

    Fabrice Malet; Nathalie Lamoureux; Nabiha Djebaili-Chaumeix; Claude-Etienne Paillard; Pierre Pailhories; Jean-Pierre L'heriteau; Bernard Chaumont; Ahmed Bentaib

    2005-01-01

    Full text of publication follows: In the case of a hypothetical severe accident in a light water nuclear reactor, hydrogen would be produced during reactor core degradation and released into the reactor building, where it could subsequently pose a combustion hazard. Local ignition of the combustible mixture would initially produce a slow flame, which can be accelerated by turbulence. Depending on the geometry and the composition of the premixed combustible mixture, the flame can accelerate and, under some conditions, transit to detonation or be quenched after a certain distance. Flame acceleration is responsible for the generation of high pressure loads that could damage the reactor building. Moreover, the geometrical configuration is a major factor leading to flame acceleration. Thus, experimental data, notably from mid-size installations, are required to validate numerical simulations before modelling realistic scales. The ENACCEF vertical facility is a 6-meter-high acceleration tube representing a steam generator room leading to the containment dome. The setup can be equipped with obstacles of different blockage ratios and shapes in order to accelerate the flame. Depending on the geometrical characteristics of these obstacles, different flame propagation regimes can be achieved. The influence of mixture composition on flame velocity and acceleration has been investigated, and, using a steam-like diluent (40% He - 60% CO2), the influence of dilution on flame speed and acceleration has been studied. The flame front has also been recorded with ultra-fast shadowgraphy, both in the tube and at the entrance to the dome. The flame propagation is computed using the TONUS code. Based on an Euler-equation solver using structured finite volumes, it includes the CREBCOM combustion model and simulates turbulent hydrogen/air flame propagation, taking into account complex 3D geometry and reactant concentration gradients. Since

  7. The Populist Toolkit : Finnish Populism in Action 2007–2016

    OpenAIRE

    Ylä-Anttila, Tuukka

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  8. Particle acceleration in pulsar magnetospheres

    International Nuclear Information System (INIS)

    Baker, K.B.

    1978-10-01

    The structure of pulsar magnetospheres and the acceleration mechanism for charged particles in the magnetosphere were studied, using a pulsar model which required large acceleration of the particles near the surface of the star. A theorem was developed which showed that particle acceleration cannot be expected when the angle between the magnetic field lines and the rotation axis is constant (e.g. radial field lines). If this angle is not constant, however, acceleration must occur. The more realistic model of an axisymmetric neutron star with a strong dipole magnetic field aligned with the rotation axis was investigated. In this case, acceleration occurred at large distances from the surface of the star. The magnitude of the current can be determined using the model presented. In the case of nonaxisymmetric systems, the acceleration is expected to occur nearer to the surface of the star

  9. The interactive learning toolkit: technology and the classroom

    Science.gov (United States)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  10. Muon acceleration in cosmic-ray sources

    International Nuclear Information System (INIS)

    Klein, Spencer R.; Mikkelsen, Rune E.; Becker Tjus, Julia

    2013-01-01

    Many models of ultra-high energy cosmic-ray production involve acceleration in linear accelerators located in gamma-ray bursts, magnetars, or other sources. These transient sources have short lifetimes, which necessitate very high accelerating gradients, up to 10^13 keV cm^-1. At gradients above 1.6 keV cm^-1, muons produced by hadronic interactions undergo significant acceleration before they decay. This muon acceleration hardens the neutrino energy spectrum and greatly increases the high-energy neutrino flux. Using the IceCube high-energy diffuse neutrino flux limits, we set two-dimensional limits on the source opacity and matter density, as a function of accelerating gradient. These limits put strong constraints on different models of particle acceleration, particularly those based on plasma wake-field acceleration, and limit models for sources like gamma-ray bursts and magnetars.
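    The quoted 1.6 keV cm^-1 threshold can be recovered from a back-of-the-envelope comparison of the energy gained over one boosted decay length with the muon energy itself (our added check; note that the Lorentz factor cancels):

```latex
% A muon of energy E = gamma * m_mu c^2 travels ~ gamma * c * tau_mu
% before decaying. Acceleration at gradient g matters when the energy
% gained over that distance is comparable to E:
\begin{equation}
  g\,\gamma c\tau_\mu \gtrsim \gamma m_\mu c^2
  \;\Longrightarrow\;
  g \gtrsim \frac{m_\mu c^2}{c\tau_\mu}
    = \frac{105.7\ \mathrm{MeV}}{6.59\times 10^{4}\ \mathrm{cm}}
    \approx 1.6\ \mathrm{keV\,cm^{-1}}.
\end{equation}
```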

  11. The Fluka Linebuilder and Element Database: Tools for Building Complex Models of Accelerators Beam Lines

    CERN Document Server

    Mereghetti, A; Cerutti, F; Versaci, R; Vlachoudis, V

    2012-01-01

    Extended FLUKA models of accelerator beam lines can be extremely complex: heavy to manipulate, poorly versatile and prone to mismatched positioning. We developed a framework capable of creating the FLUKA model of an arbitrary portion of a given accelerator, starting from the optics configuration and a few other pieces of information provided by the user. The framework includes a builder (LineBuilder), an element database and a series of configuration and analysis scripts. The LineBuilder is a Python program aimed at dynamically assembling complex FLUKA models of accelerator beam lines: positions, magnetic fields and scorings are automatically set up, and geometry details such as apertures of collimators, tilting and misalignment of elements, beam pipes and tunnel geometries can be entered at the user's will. The element database (FEDB) is a collection of detailed FLUKA geometry models of machine elements. This framework has been widely used for recent LHC and SPS beam-machine interaction studies at CERN, and led to a dra...

  12. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    International Nuclear Information System (INIS)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J; Perl, J; Piersimoni, P; Ramos-Mendez, J; Faddegon, B

    2016-01-01

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron-scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading protein data bank 3D structural files to simulate radiation damage to proteins or nucleic acids e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  13. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    Energy Technology Data Exchange (ETDEWEB)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J [Massachusetts General Hospital & Harvard Med. School, Boston, MA (United States); Perl, J [Stanford Linear Accelerator Center, Menlo Park, CA (United States); Piersimoni, P; Ramos-Mendez, J; Faddegon, B [University of California, San Francisco, San Francisco, CA (United States)

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron-scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading protein data bank 3D structural files to simulate radiation damage to proteins or nucleic acids e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  14. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    Science.gov (United States)

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  15. Effect of an educational toolkit on quality of care: a pragmatic cluster randomized trial.

    Science.gov (United States)

    Shah, Baiju R; Bhattacharyya, Onil; Yu, Catherine H Y; Mamdani, Muhammad M; Parsons, Janet A; Straus, Sharon E; Zwarenstein, Merrick

    2014-02-01

    Printed educational materials for clinician education are one of the most commonly used approaches for quality improvement. The objective of this pragmatic cluster randomized trial was to evaluate the effectiveness of an educational toolkit focusing on cardiovascular disease screening and risk reduction in people with diabetes. All 933,789 people aged ≥40 years with diagnosed diabetes in Ontario, Canada were studied using population-level administrative databases, with additional clinical outcome data collected from a random sample of 1,592 high risk patients. Family practices were randomly assigned to receive the educational toolkit in June 2009 (intervention group) or May 2010 (control group). The primary outcome in the administrative data study, death or non-fatal myocardial infarction, occurred in 11,736 (2.5%) patients in the intervention group and 11,536 (2.5%) in the control group (p = 0.77). The primary outcome in the clinical data study, use of a statin, occurred in 700 (88.1%) patients in the intervention group and 725 (90.1%) in the control group (p = 0.26). Pre-specified secondary outcomes, including other clinical events, processes of care, and measures of risk factor control, were also not improved by the intervention. A limitation is the high baseline rate of statin prescribing in this population. The educational toolkit did not improve quality of care or cardiovascular outcomes in a population with diabetes. Despite being relatively easy and inexpensive to implement, printed educational materials were not effective. The study highlights the need for a rigorous and scientifically based approach to the development, dissemination, and evaluation of quality improvement interventions. http://www.ClinicalTrials.gov NCT01411865 and NCT01026688.

  16. PyCDT: A Python toolkit for modeling point defects in semiconductors and insulators

    Science.gov (United States)

    Broberg, Danny; Medasani, Bharat; Zimmermann, Nils E. R.; Yu, Guodong; Canning, Andrew; Haranczyk, Maciej; Asta, Mark; Hautier, Geoffroy

    2018-05-01

    Point defects have a strong impact on the performance of semiconductor and insulator materials used in technological applications, spanning microelectronics to energy conversion and storage. The nature of the dominant defect types, how they vary with processing conditions, and their impact on materials properties are central aspects that determine the performance of a material in a certain application. This information is, however, difficult to access directly from experimental measurements. Consequently, computational methods, based on electronic density functional theory (DFT), have found widespread use in the calculation of point-defect properties. Here we have developed the Python Charged Defect Toolkit (PyCDT) to expedite the setup and post-processing of defect calculations with widely used DFT software. PyCDT has a user-friendly command-line interface and provides a direct interface with the Materials Project database. This allows for setting up many charged defect calculations for any material of interest, as well as post-processing and applying state-of-the-art electrostatic correction terms. Our paper serves as a documentation for PyCDT, and demonstrates its use in an application to the well-studied GaAs compound semiconductor. We anticipate that the PyCDT code will be useful as a framework for undertaking readily reproducible calculations of charged point-defect properties, and that it will provide a foundation for automated, high-throughput calculations.
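    The central quantity such codes compute is the defect formation energy (the standard formalism in the field, which PyCDT automates term by term, including the finite-size correction):

```latex
% Formation energy of defect X in charge state q:
\begin{equation}
  E_f[X^q] = E_\mathrm{tot}[X^q] - E_\mathrm{tot}[\mathrm{bulk}]
           - \sum_i n_i \mu_i + q\,(E_F + E_\mathrm{VBM}) + E_\mathrm{corr},
\end{equation}
% where n_i atoms of chemical potential mu_i are added (n_i > 0) or
% removed (n_i < 0), E_F is the Fermi level referenced to the valence
% band maximum E_VBM, and E_corr is the electrostatic finite-size
% correction for charged supercells.
```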

  17. Development of a toolkit in Silverlight to detect physiognomies in real time - doi: 10.4025/actascitechnol.v35i1.13713

    Directory of Open Access Journals (Sweden)

    Marcelo Cabral Ghilardi

    2013-01-01

    Full Text Available Security demands, forensic practices and the identification of criminals require the detection and recognition of irises, fingerprints and faces in videos and photographs. Moreover, there is an increasing need for multiple forms of human-machine interaction. Controlling devices through body movement is both a need and a trend. For example, some computers, laptops, phones and video games currently provide interaction via their cameras, not only for face detection but also for body movements and the detection of objects. Most devices are Internet-connected, which creates an even greater range of possibilities. These technological trends have prompted the development of a toolkit for detecting faces in real time. The choice of the Silverlight framework for the development of this toolkit allows these applications to run in a web browser. The toolkit may be used for other purposes, such as face and iris recognition, body movement tracking and the monitoring of premises. An application was developed as an example and proof of concept.

  18. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    Science.gov (United States)

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.

  19. SwingStates: Adding state machines to Java and the Swing toolkit

    OpenAIRE

    Appert , Caroline; Beaudouin-Lafon , Michel

    2008-01-01

    This article describes SwingStates, a Java toolkit designed to facilitate the development of graphical user interfaces and bring advanced interaction techniques to the Java platform. SwingStates is based on the use of finite-state machines specified directly in Java to describe the behavior of interactive systems. State machines can be used to redefine the behavior of existing Swing widgets or, in combination with a new canvas widget that features a rich graphical mode...
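
    The state-machine idiom the toolkit is built on is easy to illustrate. A minimal sketch, transplanted to Python for brevity (SwingStates itself specifies machines directly in Java; the class and event names below are illustrative):

      # Minimal sketch of the widget-behavior state machine idiom,
      # not the SwingStates API.
      class DragMachine:
          def __init__(self):
              self.state = "idle"

          def handle(self, event, x=0, y=0):
              if self.state == "idle" and event == "press":
                  self.start = (x, y)
                  self.state = "dragging"
              elif self.state == "dragging" and event == "move":
                  print(f"drag from {self.start} to ({x}, {y})")
              elif self.state == "dragging" and event == "release":
                  self.state = "idle"

      m = DragMachine()
      m.handle("press", 10, 10)
      m.handle("move", 15, 12)
      m.handle("release", 20, 20)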

  20. A simulation toolkit for electroluminescence assessment in rare event experiments

    CERN Document Server

    Oliveira, C A B; Veenhof, R; Biagi, S; Monteiro, C M B; Santos, J M F dos; Ferreira, A L; Veloso, J F C A

    2011-01-01

    A good understanding of electroluminescence is a prerequisite when optimising double-phase noble gas detectors for Dark Matter searches and high-pressure xenon TPCs for neutrinoless double beta decay detection. A simulation toolkit for calculating the emission of light through electron impact on neon, argon, krypton and xenon has been developed using the Magboltz and Garfield programs. Calculated excitation and electroluminescence efficiencies, electroluminescence yield and associated statistical fluctuations are presented as a function of electric field. Good agreement with experiment and with Monte Carlo simulations has been obtained.
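
    Above the scintillation threshold, the reduced electroluminescence yield of a noble gas is, to good approximation, linear in the reduced electric field; this is the functional form such calculated yields are compared against. A minimal sketch, where the slope and threshold values are placeholders, not fitted results from this work:

      # Sketch: reduced EL yield as a linear function of reduced field above
      # threshold. Coefficients are placeholders, to be replaced by measured
      # or simulated values such as those this toolkit produces.
      def el_yield(e_over_p, slope=140.0, threshold=0.8):
          """Photons per electron per cm per bar; e_over_p in kV cm^-1 bar^-1."""
          return max(0.0, slope * (e_over_p - threshold))

      for ep in (0.5, 1.0, 2.0, 3.0):
          print(f"E/p = {ep:4.1f} kV/cm/bar -> Y/p = {el_yield(ep):7.1f}")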

  1. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    International Nuclear Information System (INIS)

    Webb, S. M.

    2011-01-01

    The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principal component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program, with more formats available by request. An overview of the most common features will be presented.

  2. Critical system issues and modeling requirements - the problem of beam energy sweep in an electron linear induction accelerator

    International Nuclear Information System (INIS)

    Turner, W.C.; Barrett, D.M.; Sampayan, S.E.

    1991-01-01

    In this paper the authors attempt to motivate the development of modeling tools for linear induction accelerator components by giving examples of performance limitations related to energy sweep. The most pressing issue is the development of an accurate model of the switching behavior of large magnetic cores at high dB/dt in the accelerator and magnetic compression modulators. Ideally one would like a model with as few parameters as possible that allows the user to choose the core geometry and magnetic material, plus perhaps a few parameters characterizing the switch model. Beyond this, the critical modeling tasks are: simulation of a magnetic compression modulator, modeling the reset dynamics of a magnetic compression modulator, modeling the loading characteristics of a linear induction accelerator cell, and modeling the electron injector current including the dynamics of feedback modulation and beam loading in an accelerator cell. Of course, in the development of these models care should be given to benchmarking them against data from experimental systems. Beyond that, one should aim for tools that have predictive power so that they can be used as design tools and not merely to replicate existing data

  3. Fourier-accelerated Langevin simulation of the frustrated XY model and simulation of the spinless and spin one-half Hubbard model

    International Nuclear Information System (INIS)

    Scheinine, A.L.

    1992-01-01

    The frustrated XY model was studied on a lattice, primarily to test the Fourier transform acceleration technique for a phase transition having more field structure than just spinwaves and vortices. Also, the spinless Hubbard model without hopping was simulated using continuous variables for the auxiliary field that mediates coupling between fermions. Finally, the spin one-half Hubbard model was studied with a technique that sampled the fermion occupation configurations. The frustrated two-dimensional XY model was simulated using the Langevin equation with Fourier transform acceleration. The speedup due to Fourier acceleration was measured for frustration one-half at the transition temperature. The unfrustrated XY model was also studied. For the frustrated case, only the long-distance spin correlation and the autocorrelation of the spin showed significant speedup. The frustrated case has Ising-like domains. It was found that Fourier acceleration speeds the evolution of spinwaves but has negligible effect on the Ising-like domains. In the Hubbard model, the fermion determinant weight factor in the partition function changes sign, causing large statistical fluctuations of observables. A technique was found for sampling configuration space using continuous auxiliary fields, despite energy barriers where the fermion determinant changes sign. For the two-dimensional spinless Hubbard model with no hopping, an exact solution was found for a 4 × 4 lattice, which could be compared with the numerical simulations. The sign problem remained, and was found to be related to the sign problem encountered when a discrete variable is used for the auxiliary field. For the spin one-half Hubbard model, a Monte Carlo simulation was done in which the fermion occupation configurations were varied. Rather than integrating out the fermions and making a numerical estimate of the sum over the auxiliary field, the auxiliary field was integrated out and a numerical estimate was made of the sum over fermion configurations
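
    The Fourier acceleration technique itself is compact: the Langevin step size is made k-dependent so that slow, long-wavelength modes take large steps while short-wavelength modes remain stable. A minimal sketch for a free scalar field on a periodic 2-D lattice; the step-size schedule and noise normalization are illustrative:

      # Sketch of one Fourier-accelerated Langevin step on a periodic 2-D
      # lattice (illustrative normalization, not the paper's exact scheme).
      import numpy as np

      def fa_langevin_step(phi, force, eps0=0.01, m2=0.1):
          L = phi.shape[0]
          k = 2 * np.pi * np.fft.fftfreq(L)
          k2 = (2 - 2 * np.cos(k))[:, None] + (2 - 2 * np.cos(k))[None, :]
          eps_k = eps0 * (k2.max() + m2) / (k2 + m2)   # big steps for small k
          noise = np.random.normal(size=phi.shape)
          dphi = np.fft.ifft2(eps_k * np.fft.fft2(force(phi))
                              + np.sqrt(2 * eps_k) * np.fft.fft2(noise)).real
          return phi + dphi

      phi = np.zeros((16, 16))
      drift = lambda p: -(4 * p - np.roll(p, 1, 0) - np.roll(p, -1, 0)
                          - np.roll(p, 1, 1) - np.roll(p, -1, 1))  # -dS/dphi
      phi = fa_langevin_step(phi, drift)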

  4. Unified accelerator libraries

    International Nuclear Information System (INIS)

    Malitsky, Nikolay; Talman, Richard

    1997-01-01

    A 'Unified Accelerator Libraries' (UAL) environment is described. Its purpose is to facilitate program modularity and inter-program and inter-process communication among heterogeneous programs. The ultimate goal is to facilitate model-based control of accelerators

  5. An MCNP-based model for the evaluation of the photoneutron dose in high energy medical electron accelerators.

    Science.gov (United States)

    Carinou, Eleutheria; Stamatelatos, Ion Evangelos; Kamenopoulou, Vassiliki; Georgolopoulou, Paraskevi; Sandilos, Panayotis

    The development of a computational model for the treatment head of a medical electron accelerator (Elekta/Philips SL-18) by the Monte Carlo code MCNP-4C2 is discussed. The model includes the major components of the accelerator head and a PMMA phantom representing the patient body. Calculations were performed for a 14 MeV electron beam impinging on the accelerator target and a 10 cm × 10 cm beam area at the isocentre. The model was used in order to predict the neutron ambient dose equivalent at the isocentre level and moreover the neutron absorbed dose distribution within the phantom. Calculations were validated against experimental measurements performed by gold foil activation detectors. The results of this study indicated that the equivalent dose at tissues or organs adjacent to the treatment field due to photoneutrons could be up to 10% of the total peripheral dose, for the specific accelerator characteristics examined. Therefore, photoneutrons should be taken into account when accurate dose calculations are required for sensitive tissues adjacent to the therapeutic X-ray beam. The method described can be extended to other accelerators and collimation configurations as well, upon specification of treatment head component dimensions, composition and nominal accelerating potential.

  6. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  7. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Full Text Available Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  8. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    Science.gov (United States)

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children…

  9. Advanced processing and simulation of MRS data using the FID appliance (FID-A)-An open source, MATLAB-based toolkit.

    Science.gov (United States)

    Simpson, Robin; Devenyi, Gabriel A; Jezzard, Peter; Hennessy, T Jay; Near, Jamie

    2017-01-01

    To introduce a new toolkit for simulation and processing of magnetic resonance spectroscopy (MRS) data, and to demonstrate some of its novel features. The FID appliance (FID-A) is an open-source, MATLAB-based software toolkit for simulation and processing of MRS data. The software is designed specifically for processing data with multiple dimensions (eg, multiple radiofrequency channels, averages, spectral editing dimensions). It is equipped with functions for importing data in the formats of most major MRI vendors (eg, Siemens, Philips, GE, Agilent) and for exporting data into the formats of several common processing software packages (eg, LCModel, jMRUI, Tarquin). This paper introduces the FID-A software toolkit and uses examples to demonstrate its novel features, namely 1) the use of a spectral registration algorithm to carry out useful processing routines automatically, 2) automatic detection and removal of motion-corrupted scans, and 3) the ability to perform several major aspects of the MRS computational workflow from a single piece of software. This latter feature is illustrated through both high-level processing of in vivo GABA-edited MEGA-PRESS MRS data, as well as detailed quantum mechanical simulations to generate an accurate LCModel basis set for analysis of the same data. All of the described processing steps resulted in a marked improvement in spectral quality compared with unprocessed data. Fitting of MEGA-PRESS data using a customized basis set resulted in improved fitting accuracy compared with a generic MEGA-PRESS basis set. The FID-A software toolkit enables high-level processing of MRS data and accurate simulation of in vivo MRS experiments. Magn Reson Med 77:23-33, 2017. © 2015 Wiley Periodicals, Inc.
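
    The spectral registration step mentioned above aligns each transient to a reference by least-squares fitting a frequency and phase shift in the time domain. A minimal sketch of that idea in Python (FID-A itself is MATLAB; function names and the synthetic signal are illustrative):

      # Sketch of spectral registration: fit (frequency, phase) so a drifted
      # FID matches a reference in the time domain (illustrative only).
      import numpy as np
      from scipy.optimize import least_squares

      def align(fid, ref, t):
          def resid(p):
              f, phi = p
              shifted = fid * np.exp(1j * (2 * np.pi * f * t + phi))
              return np.concatenate([(shifted - ref).real, (shifted - ref).imag])
          f, phi = least_squares(resid, x0=[0.0, 0.0]).x
          return fid * np.exp(1j * (2 * np.pi * f * t + phi))

      t = np.arange(2048) / 2000.0                           # 2 kHz, 2048 points
      ref = np.exp(-5 * t) * np.exp(2j * np.pi * 60 * t)     # synthetic reference
      fid = ref * np.exp(1j * (2 * np.pi * 3.5 * t + 0.2))   # drifted copy
      aligned = align(fid, ref, t)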

  10. Improving primary care for persons with spinal cord injury: Development of a toolkit to guide care.

    Science.gov (United States)

    Milligan, James; Lee, Joseph; Hillier, Loretta M; Slonim, Karen; Craven, Catharine

    2018-05-07

    To identify a set of essential components of primary care for patients with spinal cord injury (SCI) for inclusion in a point-of-practice toolkit for primary care practitioners (PCPs), and to identify the elements of SCI care that are required in primary care and those that should be the focus of specialist care. Modified Delphi consensus process; survey methodology. Primary care. Three family physicians, six specialist physicians, and five interdisciplinary health professionals completed surveys. Importance of care elements for inclusion in the toolkit (9-point scale: 1 = lowest level of importance, 9 = greatest level of importance) and identification of the most responsible physician (family physician, specialist) for completing key categories of care. Open-ended comments were solicited. There was consensus between the respondent groups on the level of importance of the various care elements. Mean importance scores were highest for autonomic dysreflexia, pain, and skin care, and lowest for preventive care, social issues, and vital signs. Although there was agreement across all respondents that family physicians should assume responsibility for assessing mental health, there was variability in who should be responsible for other care categories. Comments related to the need for shared-care approaches and capacity building, and to lack of knowledge and specialized equipment as barriers to optimal care. This study identified important components of SCI care to be included in a point-of-practice toolkit to facilitate primary care for persons with SCI.

  11. Modeling the Acceleration of Global Surface Temperature

    Science.gov (United States)

    Jones, B.

    2017-12-01

    A mathematical projection focusing on the changing rate of acceleration of global surface temperatures. Using the historical trajectory and informed expert near-term prediction, it is possible to extend this further forward, drawing a reference arc of acceleration. Presented here is an example of this technique based on data found in the Summary of Findings of A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011 and that same team's stated prediction to 2050. With this, we can project a curve showing future acceleration (slopes through 2005 are known; the later rows are the projected trend):

      Decade (midpoint)   Change in Global Land Temp (°C)   Slope
      1755                 0.000                            -
      1955                 0.600                            0.0030
      2005                 1.500                            0.0051
      2045                 3.000                            0.0375
      2095                 5.485                            0.0497
      2145                 8.895                            0.0682
      2195                13.488                            0.0919

    Observations: slopes are getting steeper, and doing so faster, in an "acceleration of the acceleration" or an "arc of acceleration". This is consistent with the non-linear accelerating feedback loops of global warming. Such projected temperatures threaten human civilization and human life. This 'thumbnail' projection is consistent with other long-term predictions based on anthropogenic greenhouse gases. This projection is low when compared with those whose forecasts include greenhouse gases released from thawing permafrost and clathrate hydrates. A reference line: this curve should be considered a point of reference. In the near term, and absent significant drawdown of greenhouse gases, my "bet" for this AGU session is that future temperatures will generally be above this reference curve. For example, the decade ending 2020: more than 1.9 °C; the decade ending 2030: more than 2.3 °C, again measured from the 1750 start point. *Caveat: the long-term curve and prediction assume that mankind does not move quickly away from high-cost fossil fuels and does not invent, mobilize and take actions drawing down greenhouse gases. Those seeking a comprehensive action plan are directed to drawdown.org
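
    The arc can be reproduced numerically. A minimal sketch fitting a smooth polynomial to the tabulated anomalies and differentiating it; the abstract does not specify its exact projection method, so this is illustrative only:

      # Sketch: fit a cubic to the tabulated anomalies and read off the
      # "arc of acceleration" from its derivative (illustrative only).
      import numpy as np

      years = np.array([1755, 1955, 2005, 2045, 2095, 2145, 2195])
      dT = np.array([0.000, 0.600, 1.500, 3.000, 5.485, 8.895, 13.488])

      poly = np.poly1d(np.polyfit(years - 1755, dT, 3))  # cubic in elapsed years
      slope = poly.deriv()

      for y in (2020, 2030, 2050):
          print(f"{y}: dT ~ {poly(y - 1755):.2f} C, slope ~ {slope(y - 1755):.4f} C/yr")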

  12. GPU-accelerated atmospheric chemical kinetics in the ECHAM/MESSy (EMAC) Earth system model (version 2.52)

    Science.gov (United States)

    Alvanos, Michail; Christoudias, Theodoros

    2017-10-01

    This paper presents an application of GPU accelerators in Earth system modeling. We focus on atmospheric chemical kinetics, one of the most computationally intensive tasks in climate-chemistry model simulations. We developed a software package that automatically generates CUDA kernels to numerically integrate atmospheric chemical kinetics in the global climate model ECHAM/MESSy Atmospheric Chemistry (EMAC), used to study climate change and air quality scenarios. A source-to-source compiler outputs a CUDA-compatible kernel by parsing the FORTRAN code generated by the Kinetic PreProcessor (KPP) general analysis tool. All Rosenbrock methods that are available in the KPP numerical library are supported. Performance evaluation, using Fermi and Pascal CUDA-enabled GPU accelerators, shows achieved speed-ups of 4.5× and 20.4×, respectively, of the kernel execution time. A node-to-node real-world production performance comparison shows a 1.75× speed-up over the non-accelerated application using the KPP three-stage Rosenbrock solver. We provide a detailed description of the code optimizations used to improve the performance, including memory optimizations, control code simplification, and reduction of idle time. The accuracy and correctness of the accelerated implementation are evaluated by comparing the output of the accelerated kernel with that of the CPU-only code; the median relative difference is found to be less than 0.000000001%. The approach followed, including the computational workload division, and the developed GPU solver code can potentially be used as the basis for hardware acceleration of numerous geoscientific models that rely on KPP for atmospheric chemical kinetics applications.
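
    The numerical core that the generated kernels parallelize, one grid cell per thread, is a linearly implicit Rosenbrock step for a stiff ODE system. A minimal single-stage (Rosenbrock-Euler) sketch in Python; KPP's three-stage solver adds further stages and coefficients, and the toy kinetics below are an assumption for illustration:

      # Sketch: one linearly implicit (Rosenbrock-Euler) step for dy/dt = f(y),
      # the per-cell work that the generated CUDA kernels perform in parallel.
      import numpy as np

      def rosenbrock_euler_step(f, jac, y, h):
          A = np.eye(len(y)) - h * jac(y)   # solve (I - h*J) k = f(y)
          k = np.linalg.solve(A, f(y))
          return y + h * k

      # Toy 2-species stiff kinetics: fast loss of y0, slow loss of y1
      f = lambda y: np.array([-1e3 * y[0], 1e3 * y[0] - 1.0 * y[1]])
      jac = lambda y: np.array([[-1e3, 0.0], [1e3, -1.0]])
      y = rosenbrock_euler_step(f, jac, np.array([1.0, 0.0]), h=1e-3)
      print(y)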

  13. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator.

    Science.gov (United States)

    Drewes, Rich; Zou, Quan; Goodman, Philip H

    2009-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.

  14. Control system modelling for superconducting accelerator

    International Nuclear Information System (INIS)

    Czarski, T.; Pozniak, K.; Romaniuk, R.

    2006-01-01

    Digital control of superconducting cavities for a linear accelerator is presented. The LLRF (Low Level Radio Frequency) system for the FLASH project at DESY is introduced. An FPGA-based controller supported by a MATLAB system was developed to investigate the novel firmware implementation. An algebraic model in the complex domain is proposed for analyzing the system. A calibration procedure for the signal path is considered for multi-channel control. Identification of the system parameters is carried out by application of the least squares method. Control tables, Feed-Forward and Set-Point, are determined for the required cavity performance, according to the recognized process. The feedback loop is tuned by fitting a complex gain of a corrector unit. An adaptive control algorithm is applied for the feed-forward and feedback modes. Experimental results are presented for representative cavity operation. (orig.)
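
    For a single channel, the least-squares identification step described above reduces to estimating a complex gain (amplitude scaling plus phase rotation) from measured drive and response samples. A minimal sketch; the numbers are synthetic and this is not the FLASH firmware:

      # Sketch: least-squares identification of a complex channel gain from
      # I/Q samples (synthetic data, illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)
      u = rng.normal(size=200) + 1j * rng.normal(size=200)   # drive samples
      g_true = 0.8 * np.exp(1j * 0.3)                        # unknown gain
      y = g_true * u + 0.01 * (rng.normal(size=200) + 1j * rng.normal(size=200))

      g_est = np.vdot(u, y) / np.vdot(u, u)   # LS solution of y = g * u
      print(abs(g_est), np.angle(g_est))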

  15. Development and application of a generic CFD toolkit covering the heat flows in combined solid-liquid systems with emphasis on the thermal design of HiLumi superconducting magnets

    Science.gov (United States)

    Bozza, Gennaro; Malecha, Ziemowit M.; Van Weelderen, Rob

    2016-12-01

    The main objective of this work is to develop a robust multi-region numerical toolkit for the modeling of heat flows in combined solid-liquid systems, specifically heat transfer in complex cryogenic system geometries involving superfluid helium. The incentive originates from the need to support the design of superconducting magnets in the framework of the HiLumi-LHC project (Brüning and Rossi, 2015) [1]. The intent is, instead of solving heat flows in restricted domains, to be able to model a full magnet section in one go, including all relevant construction details as accurately as possible. The toolkit was applied to the so-called MQXF quadrupole magnet design. Parametrisation studies were used to find a compromise between thermal design and electro-mechanical construction constraints. The cooling performance is evaluated in terms of the temperature margin of the magnets under full steady-state heat load conditions and in terms of the maximal sustainable load. We also present the transient response to pulse heat loads of varying duration and power, and the system response to time-varying cold source temperatures.

  16. The Special Educator's Toolkit: Everything You Need to Organize, Manage, and Monitor Your Classroom

    Science.gov (United States)

    Golden, Cindy

    2012-01-01

    Overwhelmed special educators: Reduce your stress and support student success with this practical toolkit for whole-classroom organization. A lifesaver for special educators in any K-12 setting, this book-and-CD set will help teachers expertly manage everything, from schedules and paperwork to student supports and behavior plans. Cindy Golden, a…

  17. College Access and Success for Students Experiencing Homelessness: A Toolkit for Educators and Service Providers

    Science.gov (United States)

    Dukes, Christina

    2013-01-01

    This toolkit serves as a comprehensive resource on the issue of higher education access and success for homeless students, including information on understanding homeless students, assisting homeless students in choosing a school, helping homeless students pay for application-related expenses, assisting homeless students in finding financial aid…

  18. Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications

    NARCIS (Netherlands)

    Gonsalves, Atish; Ternier, Stefaan; De Vries, Fred; Specht, Marcus

    2013-01-01

    Gonsalves, A., Ternier, S., De Vries, F., & Specht, M. (2012, 16-18 October). Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications. Presentation given at the 11th World Conference on Mobile and Contextual Learning (mLearn 2012), Helsinki, Finland.

  19. Models of f(R) cosmic acceleration that evade solar system tests

    International Nuclear Information System (INIS)

    Hu, Wayne; Sawicki, Ignacy

    2007-01-01

    We study a class of metric-variation f(R) models that accelerates the expansion without a cosmological constant and satisfies both cosmological and solar-system tests in the small-field limit of the parameter space. Solar-system tests alone place only weak bounds on these models, since the additional scalar degree of freedom is locked to the high-curvature general-relativistic prediction across more than 25 orders of magnitude in density, out through the solar corona. This agreement requires that the galactic halo be of sufficient extent to maintain the galaxy at high curvature in the presence of the low-curvature cosmological background. If the galactic halo and local environment in f(R) models do not have substantially deeper potentials than expected in ΛCDM, then cosmological field amplitudes |f_R| ≳ 10^-6 will cause the galactic interior to evolve to low curvature during the acceleration epoch. Viability of large-deviation models therefore rests on the structure and evolution of the galactic halo, requiring cosmological simulations of f(R) models, and not directly on solar-system tests. Even small deviations that conservatively satisfy both galactic and solar-system constraints can still be tested by future, percent-level measurements of the linear power spectrum, while they remain undetectable to cosmological-distance measures. Although we illustrate these effects in a specific class of models, the requirements on f(R) are phrased in a nearly model-independent manner
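
    For reference, the class of models studied here (the Hu-Sawicki form) can be written as below, with m² = κ²ρ̄₀/3 setting the curvature scale; the high-curvature expansion shows why f_R is driven to tiny values in dense environments. These expressions are standard for this model family and are reproduced here as a sketch, not as a transcription of the paper:

      f(R) = -m^2 \frac{c_1 (R/m^2)^n}{c_2 (R/m^2)^n + 1},
      \qquad
      f_R \equiv \frac{df}{dR} \approx -\frac{c_1 n}{c_2^2}
          \left(\frac{m^2}{R}\right)^{n+1}
      \quad (R \gg m^2)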

  20. Cyber security awareness toolkit for national security: an approach to South Africa's cyber security policy implementation

    CSIR Research Space (South Africa)

    Phahlamohlaka, LJ

    2011-05-01

    Full Text Available The aim of this paper is to propose an approach that South Africa could follow in implementing its proposed cyber security policy. The paper proposes a Cyber Security Awareness Toolkit that is underpinned by key National Security imperatives...

  1. Nuclear reaction models - source term estimation for safety design in accelerators

    International Nuclear Information System (INIS)

    Nandy, Maitreyee

    2013-01-01

    Accelerator driven subcritical systems (ADSS) employ proton-induced spallation reactions at a few GeV. Safety design of these systems involves source term estimation in two steps: multiple fragmentation of the target and n+γ emission through a fast process, followed by statistical decay of the primary fragments. The prompt radiation field is estimated in the framework of quantum molecular dynamics (QMD) theory, intra-nuclear cascade or Monte Carlo calculations. A few nuclear reaction model codes used for this purpose are QMD, JQMD, Bertini, INCL4 and PHITS, followed by statistical decay codes like ABLA, GEM, GEMINI, etc. In the case of electron accelerators, photons and photoneutrons dominate the prompt radiation field. The high-energy photon yield through bremsstrahlung is estimated in the framework of the Born approximation, while photoneutron production is calculated using giant dipole resonance and quasi-deuteron formation cross sections. In this talk, hybrid and exciton pre-equilibrium (PEQ) models and the QMD formalism will be discussed briefly

  2. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

    This paper introduces Accelerator Toolbox (AT), a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB, a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks
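
    The core abstraction of such lattice-modeling toolkits is tracking particle coordinates element by element through transfer maps. A minimal matrix-tracking sketch, written in Python for brevity (AT itself runs in MATLAB, and this is the generic technique, not AT's API):

      # Sketch: linear tracking of a 6-vector (x, x', y, y', dl, delta)
      # through a thin-lens FODO cell (illustrative, not the AT API).
      import numpy as np

      def drift(length):
          m = np.eye(6)
          m[0, 1] = m[2, 3] = length
          return m

      def thin_quad(k1l):
          m = np.eye(6)
          m[1, 0] = -k1l   # horizontal focusing
          m[3, 2] = +k1l   # vertical defocusing
          return m

      fodo = [thin_quad(0.5), drift(1.0), thin_quad(-0.5), drift(1.0)]
      r = np.array([1e-3, 0.0, 1e-3, 0.0, 0.0, 0.0])
      for element in fodo:
          r = element @ r
      print(r)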

  3. A Toolkit For CryoSat Investigations By The ESRIN EOP-SER Altimetry Team

    Science.gov (United States)

    Dinardo, Salvatore; Bruno, Lucas; Benveniste, Jerome

    2013-12-01

    The scope of this work is to present the new tool for the exploitation of CryoSat data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The tool framework is composed of two separate components: the first handles data collection and management, the second is the processing toolkit. The CryoSat FBR (Full Bit Rate) data is downlinked uncompressed from the satellite, containing un-averaged individual echoes. This data is made available on the Kiruna CalVal server in a 10-day rolling archive. Daily at ESRIN, all the CryoSat FBR data in SAR and SARin mode (around 30 gigabytes) are downloaded, catalogued, and archived on local ESRIN EOP-SER workstations. As of March 2013, the total amount of FBR data is over 9 terabytes, with CryoSat acquisition dates spanning January 2011 to February 2013 (with some gaps). This archive was built by merging partial datasets available at ESTEC and NOAA, which have been kindly made available to the EOP-SER team. On-demand access to this low-level data is restricted to expert users with validated ESA P.I. credentials. Currently the main users of the archiving functionality are the team members of the project CP4O (STSE - CryoSat Plus for Ocean), CNES and NOAA. The second component of the service is the processing toolkit. On the EOP-SER workstations there is internally and independently developed software able to process the FBR data in SAR/SARin mode to generate multi-looked echoes (Level 1B) and subsequently to re-track them in SAR and SARin mode (Level 2) over the open ocean, exploiting the SAMOSA model and other internally developed models. The processing segment is used for research and development purposes, supporting the development contracts awarded, confronting the deliverables to ESA, on-site demonstrations/training for selected users, cross-comparison against third-party products (CLS/CNES CPP products, for instance), preparation

  4. GPU-accelerated atmospheric chemical kinetics in the ECHAM/MESSy (EMAC) Earth system model (version 2.52)

    Directory of Open Access Journals (Sweden)

    M. Alvanos

    2017-10-01

    Full Text Available This paper presents an application of GPU accelerators in Earth system modeling. We focus on atmospheric chemical kinetics, one of the most computationally intensive tasks in climate-chemistry model simulations. We developed a software package that automatically generates CUDA kernels to numerically integrate atmospheric chemical kinetics in the global climate model ECHAM/MESSy Atmospheric Chemistry (EMAC), used to study climate change and air quality scenarios. A source-to-source compiler outputs a CUDA-compatible kernel by parsing the FORTRAN code generated by the Kinetic PreProcessor (KPP) general analysis tool. All Rosenbrock methods that are available in the KPP numerical library are supported. Performance evaluation, using Fermi and Pascal CUDA-enabled GPU accelerators, shows achieved speed-ups of 4.5× and 20.4×, respectively, of the kernel execution time. A node-to-node real-world production performance comparison shows a 1.75× speed-up over the non-accelerated application using the KPP three-stage Rosenbrock solver. We provide a detailed description of the code optimizations used to improve the performance, including memory optimizations, control code simplification, and reduction of idle time. The accuracy and correctness of the accelerated implementation are evaluated by comparing the output of the accelerated kernel with that of the CPU-only code; the median relative difference is found to be less than 0.000000001%. The approach followed, including the computational workload division, and the developed GPU solver code can potentially be used as the basis for hardware acceleration of numerous geoscientific models that rely on KPP for atmospheric chemical kinetics applications.

  5. Complement inhibition accelerates regeneration in a model of peripheral nerve injury

    NARCIS (Netherlands)

    Ramaglia, Valeria; Tannemaat, Martijn Rudolf; de Kok, Maryla; Wolterman, Ruud; Vigar, Miriam Ann; King, Rosalind Helen Mary; Morgan, Bryan Paul; Baas, Frank

    2009-01-01

    Complement (C) activation is a crucial event in peripheral nerve degeneration but its effect on the subsequent regeneration is unknown. Here we show that genetic deficiency of the sixth C component, C6, accelerates axonal regeneration and recovery in a rat model of sciatic nerve injury. Foot-flick

  6. Numerical modeling of time domain 3-D problems in accelerator physics

    International Nuclear Information System (INIS)

    Harfoush, F.A.; Jurgens, T.G.

    1990-06-01

    Time domain analysis is relevant in particle accelerators to study the electromagnetic field interaction of a moving source particle on a lagging test particle as the particles pass an accelerating cavity or some other structure. These fields are called wake fields. The travelling beam inside a beam pipe may undergo more complicated interactions with its environment due to the presence of other irregularities like wires, thin slots, joints and other types of obstacles. Analytical solution of such problems is impossible, and one has to resort to a numerical method. In this paper we present results of our first attempt to model these problems in 3-D using our finite difference time domain (FDTD) code. 10 refs., 9 figs

  7. Pressure fluctuation caused by moderate acceleration

    Science.gov (United States)

    Tagawa, Yoshiyuki; Kurihara, Chihiro; Kiyama, Akihito

    2017-11-01

    Pressure fluctuation caused by acceleration of a liquid column is observed in various important technologies, e.g. water hammer in a pipeline. The magnitude of the fluctuation can be estimated by two different approaches: when the duration of the acceleration is much shorter than the time for a pressure wave to travel the length of the liquid column, e.g. sudden valve closure in a long pipe, the Joukowsky equation applies. In contrast, if the acceleration duration is much longer, the liquid is modeled as a rigid column, ignoring the compressibility of the fluid. However, many practical cases lie between these two extremes. In this study we propose a model describing the pressure fluctuation when the duration of the acceleration is of the same order as the propagation time of a pressure wave, i.e. under moderate acceleration. The novel model considers both the temporal and spatial evolution of the pressure propagation as well as the gradual pressure rise during the acceleration. We conduct experiments in which we impose acceleration on a liquid while varying the length of the liquid column, the acceleration duration, and the properties of the liquids. The ratio between the acceleration duration and the propagation time is in the range 0.02-2. The model agrees well with the measurement results. JSPS KAKENHI Grant Numbers 26709007 and 17H01246.
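
    The two limiting estimates named above bracket the moderate-acceleration regime studied here. A minimal sketch comparing them, with illustrative numbers:

      # Sketch: Joukowsky (water-hammer) limit vs. rigid-column limit for a
      # liquid column accelerated over a finite duration (numbers illustrative).
      rho = 1000.0      # water density, kg/m^3
      c = 1480.0        # speed of sound in water, m/s
      L = 2.0           # liquid column length, m
      dv = 1.0          # velocity change, m/s
      tau = 0.5e-3      # duration of the acceleration, s

      t_wave = L / c                   # pressure-wave transit time
      dp_joukowsky = rho * c * dv      # valid when tau << t_wave
      dp_rigid = rho * L * (dv / tau)  # valid when tau >> t_wave

      print(f"transit time {t_wave*1e3:.2f} ms, tau/t_wave = {tau/t_wave:.2f}")
      print(f"Joukowsky: {dp_joukowsky/1e6:.2f} MPa, "
            f"rigid column: {dp_rigid/1e6:.2f} MPa")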

  8. Characterizing a Model of Coronal Heating and Solar Wind Acceleration Based on Wave Turbulence.

    Science.gov (United States)

    Downs, C.; Lionello, R.; Mikic, Z.; Linker, J.; Velli, M.

    2014-12-01

    Understanding the nature of coronal heating and solar wind acceleration is a key goal in solar and heliospheric research. While there have been many theoretical advances in both topics, including suggestions that they may be intimately related, the inherent scale coupling and complexity of these phenomena limit our ability to construct models that test them on a fundamental level for realistic solar conditions. At the same time, there is an ever-increasing impetus to improve our space weather models, and incorporating treatments for these processes that capture their basic features while remaining tractable is an important goal. With this in mind, I will give an overview of our exploration of a wave-turbulence-driven (WTD) model for coronal heating and solar wind acceleration based on low-frequency Alfvénic turbulence. Here we attempt to bridge the gap between theory and practical modeling by exploring this model in 1D HD and multi-dimensional MHD contexts. The key questions that we explore are: What properties must the model possess to be a viable model for coronal heating? What is the influence of the magnetic field topology (open, closed, rapidly expanding)? And can we simultaneously capture coronal heating and solar wind acceleration with such a quasi-steady formulation? Our initial results suggest that a WTD-based formulation performs adequately for a variety of solar and heliospheric conditions, while significantly reducing the number of free parameters when compared to empirical heating and solar wind models. The challenges, applications, and future prospects of this type of approach will also be discussed.

  9. Cyber security awareness toolkit for national security: An approach to South Africa’s cybersecurity policy implementation

    CSIR Research Space (South Africa)

    Phahlamohlaka, LJ

    2011-05-01

    Full Text Available The aim of this paper is to propose an approach that South Africa could follow in implementing its proposed Cyber security policy. The paper proposes a Cyber Security Awareness Toolkit that is underpinned by key National Security imperatives as well...

  10. Urban Teacher Academy Project Toolkit: A Guide to Developing High School Teaching Career Academies.

    Science.gov (United States)

    Berrigan, Anne; Schwartz, Shirley

    There is an urgent need not only to attract more people into the teaching profession but also to build a more diverse, highly qualified, and culturally sensitive teaching force that can meet the needs of a rapidly changing school-age population. This Toolkit takes best practices from high school teacher academies around the United States and…

  11. The auroral electron accelerator

    International Nuclear Information System (INIS)

    Bryant, D.A.; Hall, D.S.

    1989-01-01

    A model of the auroral electron acceleration process is presented in which the electrons are accelerated resonantly by lower-hybrid waves. The essentially stochastic acceleration process is approximated for the purposes of computation by a deterministic model involving an empirically derived energy transfer function. The empirical function, which is consistent with all that is known of electron energization by lower-hybrid waves, allows many, possibly all, observed features of the electron distribution to be reproduced. It is suggested that the process occurs widely in both space and laboratory plasmas. (author)

  12. Software Toolkits: Practical Aspects of the Internet of Things—A Survey

    OpenAIRE

    Wang, Feng; Hu, Liang; Zhou, Jin; Wu, Yang; Hu, Jiejun; Zhao, Kuo

    2015-01-01

    The Internet of Things (IoT) is neither science fiction nor industry hype; rather it is based on solid technological advances and visions of network ubiquity that are zealously being realized. The paper serves to provide guidance regarding the practical aspects of the IoT. Such guidance is largely missing in the current literature in which the focus has been more on research problems and less on issues describing how to set up an IoT system and what software toolkits are required. This paper ...

  13. Life prediction of OLED for constant-stress accelerated degradation tests using luminance decaying model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jianping, E-mail: jpzhanglzu@163.com [College of Energy and Mechanical Engineering, Shanghai University of Electric Power, Shanghai 200090 (China); Li, Wenbin [College of Energy and Mechanical Engineering, Shanghai University of Electric Power, Shanghai 200090 (China); Cheng, Guoliang; Chen, Xiao [Shanghai Tianyi Electric Co., Ltd., Shanghai 201611 (China); Wu, Helen [School of Computing, Engineering and Mathematics, University of Western Sydney, Sydney 2751 (Australia); Herman Shen, M.-H. [Department of Mechanical and Aerospace Engineering, The Ohio State University, OH 43210 (United States)

    2014-10-15

    In order to acquire the life information of organic light emitting diodes (OLEDs), three groups of constant-stress accelerated degradation tests were performed to obtain luminance decaying data from samples, with the luminance selected as the indicator of performance degradation and the current as the test stress. A Weibull function is applied to describe the relationship between luminance decay and time, the least squares method (LSM) is employed to calculate the shape parameter and scale parameter, and the life prediction of the OLED is achieved. The numerical results indicate that the accelerated degradation tests and the luminance decaying model reveal the luminance decaying law of the OLED. The luminance decaying formula fits the test data very well, and the average error of the fitted values compared with the test data is small. Furthermore, the accuracy of the OLED life predicted by the luminance decaying model is high, which enables rapid estimation of OLED life and provides significant guidelines to help engineers make design and manufacturing decisions from the standpoint of reliability life. - Highlights: • We gain luminance decaying data by accelerated degradation tests on OLED. • The luminance decaying model objectively reveals the decaying law of OLED luminance. • The least square method (LSM) is employed to calculate Weibull parameters. • The plan designed for accelerated degradation tests proves to be feasible. • The accuracy of the OLED life and the luminance decaying fitting formula is high.
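
    The fitting step is straightforward to sketch: a Weibull-form decay L(t)/L0 = exp(-(t/η)^β) is linearized so that the shape and scale parameters follow from ordinary least squares. A minimal sketch with synthetic data (not the paper's measurements):

      # Sketch: least-squares fit of Weibull decay parameters from luminance
      # data via the linearization ln(-ln(L/L0)) = beta*ln(t) - beta*ln(eta).
      import numpy as np

      t = np.array([100.0, 200, 400, 800, 1600, 3200])       # hours
      lum = np.array([0.99, 0.97, 0.94, 0.88, 0.78, 0.62])   # L/L0, synthetic

      x = np.log(t)
      y = np.log(-np.log(lum))
      beta, intercept = np.polyfit(x, y, 1)
      eta = np.exp(-intercept / beta)
      print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")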

  14. Life prediction of OLED for constant-stress accelerated degradation tests using luminance decaying model

    International Nuclear Information System (INIS)

    Zhang, Jianping; Li, Wenbin; Cheng, Guoliang; Chen, Xiao; Wu, Helen; Herman Shen, M.-H.

    2014-01-01

    In order to acquire the life information of organic light emitting diodes (OLEDs), three groups of constant-stress accelerated degradation tests were performed to obtain luminance decaying data from samples, with the luminance selected as the indicator of performance degradation and the current as the test stress. A Weibull function is applied to describe the relationship between luminance decay and time, the least squares method (LSM) is employed to calculate the shape parameter and scale parameter, and the life prediction of the OLED is achieved. The numerical results indicate that the accelerated degradation tests and the luminance decaying model reveal the luminance decaying law of the OLED. The luminance decaying formula fits the test data very well, and the average error of the fitted values compared with the test data is small. Furthermore, the accuracy of the OLED life predicted by the luminance decaying model is high, which enables rapid estimation of OLED life and provides significant guidelines to help engineers make design and manufacturing decisions from the standpoint of reliability life. - Highlights: • We gain luminance decaying data by accelerated degradation tests on OLED. • The luminance decaying model objectively reveals the decaying law of OLED luminance. • The least square method (LSM) is employed to calculate Weibull parameters. • The plan designed for accelerated degradation tests proves to be feasible. • The accuracy of the OLED life and the luminance decaying fitting formula is high

  15. OpenDBDDAS Toolkit: Secure MapReduce and Hadoop-like Systems

    KAUST Repository

    Fabiano, Enrico

    2015-06-01

    The OpenDBDDAS Toolkit is a software framework to provide support for more easily creating and expanding dynamic big data-driven application systems (DBDDAS) that are common in environmental systems, many engineering applications, disaster management, traffic management, and manufacturing. In this paper, we describe key features needed to implement a secure MapReduce and Hadoop-like system for high performance clusters that guarantees a certain level of privacy of data from other concurrent users of the system. We also provide examples of a secure MapReduce prototype and compare it to another high performance MapReduce, MR-MPI.

  16. Two-phase flow dynamics in a model steam generator under vertical acceleration oscillation field

    International Nuclear Information System (INIS)

    Ishida, T.; Teshima, N.; Sakurai, S.

    1992-01-01

    The influence of periodically varying acceleration on hydrodynamic response has been studied experimentally using an experimental rig which models a marine reactor subject to vertical motion. The effect on the primary loop is small, but the effect on the secondary loop is large. The variables of the secondary loop, such as circulation flow rate and water level, oscillate with the acceleration. The variation of gains in the frequency response is analysed. The variations of the flow in the secondary loop and of the downcomer water level increase in proportion to the acceleration. The effect of the flow resistance in the secondary loop on the two-phase flow dynamics is clarified. (7 figures) (Author)

  17. The Exoplanet Characterization ToolKit (ExoCTK)

    Science.gov (United States)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exist within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well maintained exoplanet characterization toolkit.

  18. Use of Remote Sensing Data to Enhance the National Weather Service (NWS) Storm Damage Toolkit

    Science.gov (United States)

    Jedlovec, Gary; Molthan, Andrew; White, Kris; Burks, Jason; Stellman, Keith; Smith, Matthew

    2012-01-01

    SPoRT is improving the use of near real-time satellite data in response to severe weather events and other disasters, supported through NASA's Applied Sciences Program. A planned interagency collaboration will support NOAA's Damage Assessment Toolkit, with spinoff opportunities to support other entities such as USGS and FEMA.

  19. ELEGANT: A flexible SDDS-compliant code for accelerator simulation

    International Nuclear Information System (INIS)

    Borland, M.

    2000-01-01

    ELEGANT (ELEctron Generation ANd Tracking) is the principal accelerator simulation code used at the Advanced Photon Source (APS) for circular and one-pass machines. Capabilities include 6-D tracking using matrices up to third order, canonical integration, and numerical integration. Standard beamline elements are supported, as well as coherent synchrotron radiation, wakefields, rf elements, kickers, apertures, scattering, and more. In addition to tracking with and without errors, ELEGANT performs optimization of tracked properties, as well as computation and optimization of Twiss parameters, radiation integrals, matrices, and floor coordinates. Orbit/trajectory, tune, and chromaticity correction are supported. ELEGANT is fully compliant with the Self Describing Data Sets (SDDS) file protocol, and hence uses the SDDS Toolkit for pre- and post-processing. This permits users to prepare scripts to run the code in a flexible and automated fashion. It is particularly well suited to multistage simulation and concurrent simulation on many workstations. Several examples of complex projects performed with ELEGANT are given, including top-up safety analysis of the APS and design of the APS bunch compressor
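
    For readers unfamiliar with the protocol, a minimal sketch of an SDDS ASCII page: a header declares named parameters and columns, and the data section lists the parameter values, the row count, then the rows. The names and values below are illustrative, written from the general SDDS documentation rather than from actual ELEGANT output:

      SDDS1
      &parameter name=Step, type=long &end
      &column name=s, type=double, units=m &end
      &column name=betax, type=double, units=m &end
      &data mode=ascii &end
      1
      3
      0.000000e+00  1.000000e+01
      1.000000e+00  1.250000e+01
      2.000000e+00  1.520000e+01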

  20. Qualitative models of magnetic field accelerated propagation in a plasma due to the Hall effect

    International Nuclear Information System (INIS)

    Kukushkin, A.B.; Cherepanov, K.V.

    2000-01-01

    Two qualitatively new models of accelerated magnetic field propagation (relative to normal diffusion) in a plasma due to the Hall effect are developed within the framework of electron magnetohydrodynamics. The first model is based on a simple hydrodynamic approach which, in particular, reproduces a number of known theoretical results. The second makes it possible to obtain an exact analytical description of the basic characteristics of the accelerated magnetic field propagation in an inhomogeneous isothermal plasma, namely, the magnetic field front and its effective width
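
    For context, the governing equation behind both models is the electron-MHD induction equation, in which a Hall term supplements resistive diffusion; in an inhomogeneous plasma the Hall term acts like a convection of the field, which is the source of the faster-than-diffusive penetration. The form below (Gaussian units) is quoted from the general EMHD literature, not from this paper:

      \frac{\partial \mathbf{B}}{\partial t}
        = -\frac{c}{4\pi e}\,\nabla \times
          \left[ \frac{(\nabla \times \mathbf{B}) \times \mathbf{B}}{n_e} \right]
        + \frac{c^2}{4\pi\sigma}\,\nabla^2 \mathbf{B}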