WorldWideScience

Sample records for toolkit enabling wider

  1. JAVA Stereo Display Toolkit

    Science.gov (United States)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid-crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and it can simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that handles only the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, a 3D cursor, or overlays, all of which can be built using this toolkit.
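    The color-anaglyph simulation described above follows a simple channel-composition rule. The sketch below is illustrative only (Python/NumPy, not the toolkit's Java/Swing API): it combines the red band of the left-eye image with the green and blue bands of the right-eye image; the array names and shapes are assumptions.

```python
# Illustrative sketch of the color-anaglyph composition described above.
# Not the toolkit's Java API; assumes 8-bit RGB images as NumPy arrays.
import numpy as np

def compose_color_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
    """left_rgb, right_rgb: (H, W, 3) uint8 arrays for the two stereo views."""
    if left_rgb.shape != right_rgb.shape:
        raise ValueError("stereo pair must have matching dimensions")
    anaglyph = np.empty_like(left_rgb)
    anaglyph[..., 0] = left_rgb[..., 0]     # red channel from the left view
    anaglyph[..., 1:] = right_rgb[..., 1:]  # green/blue channels from the right view
    return anaglyph

if __name__ == "__main__":
    left = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    right = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    print(compose_color_anaglyph(left, right).shape)  # (480, 640, 3)
```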

  2. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  3. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Full Text Available Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  4. Solar Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    NREL is working on a Solar Integration National Dataset (SIND) Toolkit to enable researchers to perform U.S. regional solar generation integration studies. It will provide modeled, coherent subhourly solar power data.

  5. Local Safety Toolkit: Enabling safe communities of opportunity

    CSIR Research Space (South Africa)

    Holtmann, B

    2010-08-31

    Full Text Available remain inadequate to achieve safety. The Local Safety Toolkit supports a strategy for a Safe South Africa through the implementation of a model for a Safe Community of Opportunity. The model is the outcome of work undertaken over the course of the past...

  6. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and it supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.
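    The kind of model-versus-observation comparison such a verification environment automates can be sketched in a few lines. The example below is illustrative only (not LVT's configuration or interface); the soil-moisture arrays are synthetic assumptions.

```python
# Illustrative sketch of basic verification metrics (bias, RMSE, correlation)
# for co-located model output and observations; not LVT's own interface.
import numpy as np

def verification_metrics(model: np.ndarray, obs: np.ndarray) -> dict:
    valid = ~np.isnan(model) & ~np.isnan(obs)    # ignore missing observations
    m, o = model[valid], obs[valid]
    return {
        "bias": float(np.mean(m - o)),
        "rmse": float(np.sqrt(np.mean((m - o) ** 2))),
        "corr": float(np.corrcoef(m, o)[0, 1]),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    obs = rng.uniform(0.1, 0.4, 365)             # synthetic daily soil moisture
    model = obs + rng.normal(0.02, 0.03, 365)    # synthetic model output with a wet bias
    print(verification_metrics(model, obs))
```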

  7. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  8. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  9. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  10. A software toolkit for implementing low-cost virtual reality training systems

    International Nuclear Information System (INIS)

    Louka, Michael N.

    1999-04-01

    VR is a powerful technology for implementing training systems but better tools are needed to achieve wider usage and acceptance for desktop computer-based training applications. A need has been identified for a software tool kit to support the efficient implementation of well-structured desktop VR training systems. A powerful toolkit for implementing scalable low-cost VR training applications is described in this report (author) (ml)

  11. An Overview of the Geant4 Toolkit

    CERN Document Server

    Apostolakis, John

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and choices among different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. App...

  12. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  13. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  14. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Directory of Open Access Journals (Sweden)

    Juan Mateu

    2015-08-01

    Full Text Available In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  15. The Knowledge Translation Toolkit: Bridging the Know–Do Gap: A ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-06-06

    Jun 6, 2011 ... It presents the theories, tools, and strategies required to encourage and enable ... Toolkit: Bridging the Know–Do Gap: A Resource for Researchers ... violence, and make digital platforms work for inclusive development.

  16. Tribal Green Building Toolkit

    Science.gov (United States)

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  17. Geant4 - A Simulation Toolkit

    International Nuclear Information System (INIS)

    2002-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  18. GEANT4 A Simulation toolkit

    CERN Document Server

    Agostinelli, S; Amako, K; Apostolakis, John; Araújo, H M; Arce, P; Asai, M; Axen, D A; Banerjee, S; Barrand, G; Behner, F; Bellagamba, L; Boudreau, J; Broglia, L; Brunengo, A; Chauvie, S; Chuma, J; Chytracek, R; Cooperman, G; Cosmo, G; Degtyarenko, P V; Dell'Acqua, A; De Paola, G O; Dietrich, D D; Enami, R; Feliciello, A; Ferguson, C; Fesefeldt, H S; Folger, G; Foppiano, F; Forti, A C; Garelli, S; Giani, S; Giannitrapani, R; Gibin, D; Gómez-Cadenas, J J; González, I; Gracía-Abríl, G; Greeniaus, L G; Greiner, W; Grichine, V M; Grossheim, A; Gumplinger, P; Hamatsu, R; Hashimoto, K; Hasui, H; Heikkinen, A M; Howard, A; Hutton, A M; Ivanchenko, V N; Johnson, A; Jones, F W; Kallenbach, Jeff; Kanaya, N; Kawabata, M; Kawabata, Y; Kawaguti, M; Kelner, S; Kent, P; Kodama, T; Kokoulin, R P; Kossov, M; Kurashige, H; Lamanna, E; Lampen, T; Lara, V; Lefébure, V; Lei, F; Liendl, M; Lockman, W; Longo, F; Magni, S; Maire, M; Mecking, B A; Medernach, E; Minamimoto, K; Mora de Freitas, P; Morita, Y; Murakami, K; Nagamatu, M; Nartallo, R; Nieminen, P; Nishimura, T; Ohtsubo, K; Okamura, M; O'Neale, S W; O'Ohata, Y; Perl, J; Pfeiffer, A; Pia, M G; Ranjard, F; Rybin, A; Sadilov, S; Di Salvo, E; Santin, G; Sasaki, T; Savvas, N; Sawada, Y; Scherer, S; Sei, S; Sirotenko, V I; Smith, D; Starkov, N; Stöcker, H; Sulkimo, J; Takahata, M; Tanaka, S; Chernyaev, E; Safai-Tehrani, F; Tropeano, M; Truscott, P R; Uno, H; Urbàn, L; Urban, P; Verderi, M; Walkden, A; Wander, W; Weber, H; Wellisch, J P; Wenaus, T; Williams, D C; Wright, D; Yamada, T; Yoshida, H; Zschiesche, D

    2003-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  19. Geant4 - A Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Dennis H

    2002-08-09

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  20. Application experiences with the Globus toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids" [14]. The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  1. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
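    The network-analysis side of such a toolkit can be illustrated with a small sketch using networkx from the scientific Python ecosystem the abstract refers to. The adjacency matrix below is synthetic, and reading the Connectome File Format itself is not shown; this is not the toolkit's own API.

```python
# Illustrative sketch of graph metrics on a synthetic connectome-like network;
# networkx stands in for the network analysis packages mentioned above.
import numpy as np
import networkx as nx

rng = np.random.default_rng(42)
n_regions = 20
weights = rng.uniform(0.0, 1.0, (n_regions, n_regions))
upper = np.triu(weights > 0.7, k=1)                    # sparse undirected connections
G = nx.from_numpy_array((upper | upper.T).astype(int))

print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("mean degree:", sum(dict(G.degree()).values()) / G.number_of_nodes())
print("mean clustering:", nx.average_clustering(G))
```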

  2. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Full Text Available Abstract Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  3. An Overview of the GEANT4 Toolkit

    International Nuclear Information System (INIS)

    Apostolakis, John; CERN; Wright, Dennis H.

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and choices among different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  4. Perl Template Toolkit

    CERN Document Server

    Chamberlain, Darren; Cross, David; Torkington, Nathan; Diaz, tatiana Apandi

    2004-01-01

    Among the many different approaches to "templating" with Perl--such as Embperl, Mason, HTML::Template, and hundreds of other lesser known systems--the Template Toolkit is widely recognized as one of the most versatile. Like other templating systems, the Template Toolkit allows programmers to embed Perl code and custom macros into HTML documents in order to create customized documents on the fly. But unlike the others, the Template Toolkit is as facile at producing HTML as it is at producing XML, PDF, or any other output format. And because it has its own simple templating language, templates

  5. Srijan: a graphical toolkit for sensor network macroprogramming

    OpenAIRE

    Pathak , Animesh; Gowda , Mahanth K.

    2009-01-01

    International audience; Macroprogramming is an application development technique for wireless sensor networks (WSNs) where the developer specifies the behavior of the system, as opposed to that of the constituent nodes. In this proposed demonstration, we would like to present Srijan, a toolkit that enables application development for WSNs in a graphical manner using data-driven macroprogramming. It can be used in various stages of application development, viz. i) specification of application ...

  6. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We compare hand-crafted custom code to polylithic and monolithic toolkit-based solutions. Polylithic toolkits follow a design philosophy similar to 3D scene graphs supported by toolkits including Java3D and OpenInventor...

  7. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.

  8. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
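    For context, the kind of PyRosetta script that such a GUI front-ends can be very short. The sketch below is a hedged example, not part of the PyRosetta Toolkit itself: it assumes a licensed PyRosetta installation and a local file named input.pdb, and exact import paths can differ between PyRosetta releases.

```python
# Minimal sketch of a PyRosetta scoring script of the sort the GUI wraps.
# Assumes a licensed PyRosetta install and a local "input.pdb" (hypothetical file).
import pyrosetta

pyrosetta.init("-mute all")                      # start Rosetta with quiet logging
pose = pyrosetta.pose_from_pdb("input.pdb")      # load a structure into a Pose
scorefxn = pyrosetta.get_fa_scorefxn()           # default full-atom score function
print("total residues:", pose.total_residue())
print("score (REU):", scorefxn(pose))
```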

  9. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Full Text Available Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit comprises user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.
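    The core filtering idea, keeping reads whose average base quality clears a threshold, can be sketched briefly. The example below is illustrative only and written in Python, whereas the toolkit itself is implemented in Perl; the file names and cutoff are arbitrary assumptions.

```python
# Illustrative sketch of mean-quality read filtering for FASTQ data
# (Phred+33 encoding assumed); not the NGS QC Toolkit's own code.
def mean_phred(quality_line: str, offset: int = 33) -> float:
    scores = [ord(c) - offset for c in quality_line]
    return sum(scores) / len(scores)

def filter_fastq(path_in: str, path_out: str, min_mean_q: float = 20.0) -> int:
    kept = 0
    with open(path_in) as fin, open(path_out, "w") as fout:
        while True:
            record = [fin.readline() for _ in range(4)]   # a FASTQ record is 4 lines
            if not record[0]:                             # end of file
                break
            if mean_phred(record[3].rstrip("\n")) >= min_mean_q:
                fout.writelines(record)
                kept += 1
    return kept

if __name__ == "__main__":
    print("reads kept:", filter_fastq("reads.fastq", "reads_hq.fastq"))
```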

  10. The Sense-It App: A Smartphone Sensor Toolkit for Citizen Inquiry Learning

    Science.gov (United States)

    Sharples, Mike; Aristeidou, Maria; Villasclaras-Fernández, Eloy; Herodotou, Christothea; Scanlon, Eileen

    2017-01-01

    The authors describe the design and formative evaluation of a sensor toolkit for Android smartphones and tablets that supports inquiry-based science learning. The Sense-it app enables a user to access all the motion, environmental and position sensors available on a device, linking these to a website for shared crowd-sourced investigations. The…

  11. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit is a toolkit of research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims to communicate best practices in conserving biodiversity and sustaining ecosystem services to potential users and to promote the wise use of aquatic resources, improve livelihoods and enhance policy information.

  12. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  13. Supporting LGBT Communities: Police ToolKit

    OpenAIRE

    Vasquez del Aguila, Ernesto; Franey, Paul

    2013-01-01

    This toolkit provides police forces with practical educational tools, which can be used as part of a comprehensive LGBT strategy centred on diversity, equality, and non-discrimination. These materials are based on lessons learned through real life policing experiences with LGBT persons. The Toolkit is divided into seven scenarios where police awareness of LGBT issues has been identified as important. The toolkit employs a practical, scenario-based, problem-solving approach to help police offi...

  14. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT)

    Directory of Open Access Journals (Sweden)

    Mair Frances

    2010-10-01

    Full Text Available Abstract Background The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well documented problems with implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT) which aims to summarise and synthesise new and existing research on implementation of e-Health initiatives, and present it to senior managers in a user-friendly format. Results The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to implementation of e-Health initiatives with qualitative data derived from interviews of "implementers", that is, people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit - a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls Conclusions The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled, and to determine its predictive potential for future implementations.

  15. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We describe Jazz (a polylithic toolkit) and Piccolo (a monolithic toolkit), each of which we built to support interactive 2D structured graphics applications in general, and Zoomable User Interface applications in particular...

  16. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    2011-09-06

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.  Created: 9/6/2011 by Office of Infectious Diseases, Office of the Director (OD).   Date Released: 9/7/2011.

  17. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    International Nuclear Information System (INIS)

    Rit, S; Vila Oliva, M; Sarrut, D; Brousmiche, S; Labarbe, R; Sharp, G C

    2014-01-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is checked daily with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short-scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is accessible either through command-line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  18. MX: A beamline control system toolkit

    Science.gov (United States)

    Lavender, William M.

    2000-06-01

    The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.

  19. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
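    The parameter-variation workflow the abstract describes (run the same simulation with one physics parameter perturbed, then compare the resulting observable across variants) can be sketched generically. The example below is schematic only: simulate is a hypothetical stand-in for a full Geant4 run, not part of the toolkit.

```python
# Schematic sketch of the variant-comparison idea: perturb one model parameter,
# rerun, and use the spread of the observable as an uncertainty estimate.
# `simulate` is a hypothetical placeholder, not a Geant4 or toolkit call.
import statistics

def simulate(cross_section_scale: float) -> float:
    """Hypothetical observable (e.g., a mean shower depth) vs. a scale factor."""
    return 10.0 * cross_section_scale ** 0.5

nominal = simulate(1.0)
variants = [simulate(s) for s in (0.90, 0.95, 1.05, 1.10)]   # parameter variations
spread = statistics.pstdev(variants)
print(f"nominal = {nominal:.3f}, variant spread = {spread:.3f}")
```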

  20. Wind Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    The Wind Integration National Dataset (WIND) Toolkit is an update and expansion of the Eastern Wind Integration Data Set and Western Wind Integration Data Set. It supports the next generation of wind integration studies.

  1. A qualitative study of clinic and community member perspectives on intervention toolkits: "Unless the toolkit is used it won't help solve the problem".

    Science.gov (United States)

    Davis, Melinda M; Howk, Sonya; Spurlock, Margaret; McGinnis, Paul B; Cohen, Deborah J; Fagnan, Lyle J

    2017-07-18

    Intervention toolkits are common products of grant-funded research in public health and primary care settings. Toolkits are designed to address the knowledge translation gap by speeding implementation and dissemination of research into practice. However, few studies describe characteristics of effective intervention toolkits and their implementation. Therefore, we conducted this study to explore what clinic and community-based users want in intervention toolkits and to identify the factors that support application in practice. In this qualitative descriptive study we conducted focus groups and interviews with a purposive sample of community health coalition members, public health experts, and primary care professionals between November 2010 and January 2012. The transdisciplinary research team used thematic analysis to identify themes and a cross-case comparative analysis to explore variation by participant role and toolkit experience. Ninety-six participants representing primary care (n = 54, 56%) and community settings (n = 42, 44%) participated in 18 sessions (13 focus groups, five key informant interviews). Participants ranged from those naïve through expert in toolkit development; many reported limited application of toolkits in actual practice. Participants wanted toolkits targeted at the right audience and demonstrated to be effective. Well-organized toolkits, often with a quick-start guide, with tools that were easy to tailor and apply were desired. Irrespective of perceived quality, participants experienced with practice change emphasized that leadership, staff buy-in, and facilitative support were essential for intervention toolkits to be translated into changes in clinic or public health practice. Given the emphasis on toolkits in supporting implementation and dissemination of research and clinical guidelines, studies are warranted to determine when and how toolkits are used. Funders, policy makers, researchers, and leaders in primary care and

  2. Energy retrofit analysis toolkits for commercial buildings: A review

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Hong, Tianzhen; Piette, Mary Ann; Taylor-Lange, Sarah C.

    2015-01-01

    Retrofit analysis toolkits can be used to optimize energy or cost savings from retrofit strategies, accelerating the adoption of ECMs (energy conservation measures) in buildings. This paper provides an up-to-date review of the features and capabilities of 18 energy retrofit toolkits, including ECMs and the calculation engines. The fidelity of the calculation techniques, a driving component of retrofit toolkits, was evaluated. An evaluation of the issues that hinder effective retrofit analysis in terms of accessibility, usability, data requirements, and the application of efficiency measures provides valuable insights into advancing the field. Following this review, several general conclusions were drawn: (1) toolkits developed primarily in the private sector use empirical, data-driven methods or benchmarking to provide ease of use; (2) almost all of the toolkits that used EnergyPlus or DOE-2 were freely accessible but suffered from complexity, longer data input and simulation run time; (3) in general, there appeared to be a fine line between having too much detail, resulting in a long analysis time, or too little detail, which sacrificed modeling fidelity. These insights provide an opportunity to enhance the design and development of existing and new retrofit toolkits in the future. - Highlights: • Retrofit analysis toolkits can accelerate the adoption of energy efficiency measures. • A comprehensive review of 19 retrofit analysis toolkits was conducted. • Retrofit toolkits have diverse features, data requirements and computing methods. • Empirical data-driven, normative and detailed energy modeling methods are used. • Identified immediate areas for improvement for retrofit analysis toolkits

  3. SIGKit: Software for Introductory Geophysics Toolkit

    Science.gov (United States)

    Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.

    2017-12-01

    The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
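    The two-layer refraction model mentioned above (matching picked first arrivals) reduces to taking the earlier of the direct wave and the head wave refracted along the layer interface. The sketch below illustrates that calculation; the velocities and thickness are example values, not data from the paper.

```python
# Illustrative two-layer refraction first-arrival model: the first arrival is
# the minimum of the direct-wave and head-wave travel times. Example values only.
import numpy as np

def first_arrival(x, v1=500.0, v2=1500.0, h=5.0):
    """x: offsets (m); v1, v2: layer velocities (m/s); h: thickness of layer 1 (m)."""
    t_direct = x / v1
    t_head = x / v2 + 2.0 * h * np.sqrt(1.0 / v1**2 - 1.0 / v2**2)
    return np.minimum(t_direct, t_head)

offsets = np.linspace(1.0, 60.0, 13)
for x, t in zip(offsets, first_arrival(offsets)):
    print(f"offset {x:5.1f} m -> first arrival {1000.0 * t:6.1f} ms")
```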

  4. The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results

    Science.gov (United States)

    Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee

    2016-01-01

    Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. In this poster, I describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well.This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.
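    A spectral index of the kind the toolkit computes is essentially a ratio of fluxes in two wavelength windows. The sketch below is illustrative only and does not use SPLAT's own API; the windows and the synthetic spectrum are arbitrary assumptions.

```python
# Illustrative spectral-index calculation: ratio of median flux in two
# wavelength windows of a (synthetic) near-infrared spectrum. Not SPLAT's API.
import numpy as np

def spectral_index(wave, flux, numerator=(1.55, 1.60), denominator=(1.47, 1.52)):
    """wave in microns; flux in arbitrary units; windows are (lo, hi) tuples."""
    num = np.median(flux[(wave >= numerator[0]) & (wave <= numerator[1])])
    den = np.median(flux[(wave >= denominator[0]) & (wave <= denominator[1])])
    return num / den

wave = np.linspace(0.9, 2.4, 500)        # synthetic wavelength grid (microns)
flux = 1.0 + 0.3 * np.sin(4.0 * wave)    # synthetic smooth spectrum
print(f"index = {spectral_index(wave, flux):.3f}")
```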

  5. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  6. Network Science Research Laboratory (NSRL) Discrete Event Toolkit

    Science.gov (United States)

    2016-01-01

    ARL-TR-7579, January 2016, US Army Research Laboratory. Report describing the Network Science Research Laboratory (NSRL) Discrete Event Toolkit, by Theron Trout and Andrew J Toth, Computational and Information Sciences Directorate, ARL.

  7. Measuring employee satisfaction in new offices - the WODI toolkit

    NARCIS (Netherlands)

    Maarleveld, M.; Volker, L.; van der Voordt, Theo

    2009-01-01

    Purpose: This paper presents a toolkit to measure employee satisfaction and perceived labour productivity as affected by different workplace strategies. The toolkit is being illustrated by a case study of the Dutch Revenue Service.
    Methodology: The toolkit has been developed by a review of

  8. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  9. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  10. Transportation librarian's toolkit

    Science.gov (United States)

    2007-12-01

    The Transportation Librarians Toolkit is a product of the Transportation Library Connectivity pooled fund study, TPF-5(105), a collaborative, grass-roots effort by transportation libraries to enhance information accessibility and professional expert...

  11. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared and Kolmogorov-Smirnov to less well-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the usage of the standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. This Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms and the computational features of the Toolkit, and present the code validation
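
    The Toolkit itself is a C++ component library, so the snippet below is only a stand-in: a minimal sketch of the same kind of two-sample goodness-of-fit comparison (Kolmogorov-Smirnov and Anderson-Darling) using SciPy; the data and seed are illustrative.

    ```python
    # Stand-in sketch (not the Toolkit itself): two-sample goodness-of-fit tests
    # of the families listed above, using SciPy.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    observed = rng.normal(loc=0.0, scale=1.0, size=500)   # e.g. reconstructed data
    reference = rng.normal(loc=0.0, scale=1.0, size=500)  # e.g. simulated expectation

    # Two-sample Kolmogorov-Smirnov test
    ks_stat, ks_p = stats.ks_2samp(observed, reference)

    # Anderson-Darling k-sample test (generally more powerful in the tails)
    ad_result = stats.anderson_ksamp([observed, reference])

    print(f"KS: D={ks_stat:.3f}, p={ks_p:.3f}")
    print(f"Anderson-Darling: A2={ad_result.statistic:.3f}")
    ```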

  12. A Modular Toolkit for Distributed Interactions

    Directory of Open Access Journals (Sweden)

    Julien Lange

    2011-10-01

    Full Text Available We discuss the design, architecture, and implementation of a toolkit which supports some theories for distributed interactions. The main design principles of our architecture are flexibility and modularity. Our main goal is to provide an easily extensible workbench to encompass current algorithms and incorporate future developments of the theories. With the help of some examples, we illustrate the main features of our toolkit.

  13. Integrated Systems Health Management (ISHM) Toolkit

    Science.gov (United States)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  14. Microsoft BizTalk ESB Toolkit 2.1

    CERN Document Server

    Benito, Andrés Del Río

    2013-01-01

    A practical guide to the architecture and features that make up the services and components of the ESB Toolkit. This book is for experienced BizTalk developers, administrators, and architects, as well as IT managers and BizTalk business analysts. Knowledge and experience with the Toolkit are not a requirement.

  15. An Industrial Physics Toolkit

    Science.gov (United States)

    Cummings, Bill

    2004-03-01

    Physicists possess many skills highly valued in industrial companies. However, with the exception of a decreasing number of positions in long range research at large companies, job openings in industry rarely say "Physicist Required." One key to a successful industrial career is to know what subset of your physics skills is most highly valued by a given industry and to continue to build these skills while working. This combination of skills from both academic and industrial experience becomes your "Industrial Physics Toolkit" and is a transferable resource when you change positions or companies. This presentation will describe how to build and sell your own "Industrial Physics Toolkit" using concrete examples from the speaker's industrial experience.

  16. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing

    DEFF Research Database (Denmark)

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai

    2018-01-01

    engineering approaches for boosting known and discovering novel natural products. In order to facilitate genome editing in actinomycetes, we developed a high-efficiency CRISPR-Cas9 toolkit for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification......, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector...... construction process. The application of this toolkit was successfully demonstrated by perturbation of genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes....

  17. phylo-node: A molecular phylogenetic toolkit using Node.js.

    Science.gov (United States)

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I developed phylo-node, a stable and scalable toolkit built on Node.js that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  18. WING/WORLD: An Open Experimental Toolkit for the Design and Deployment of IEEE 802.11-Based Wireless Mesh Networks Testbeds

    Directory of Open Access Journals (Sweden)

    Daniele Miorandi

    2010-01-01

    Full Text Available Wireless Mesh Networks represent an interesting instance of light-infrastructure wireless networks. Due to their flexibility and resiliency to network failures, wireless mesh networks are particularly suitable for incremental and rapid deployments of wireless access networks in both metropolitan and rural areas. This paper illustrates the design and development of an open toolkit aimed at supporting the design of different solutions for wireless mesh networking by enabling real evaluation, validation, and demonstration. The resulting testbed is based on off-the-shelf hardware components and open-source software and is focused on IEEE 802.11 commodity devices. The software toolkit is based on an “open” philosophy and aims at providing the scientific community with a tool for effective and reproducible performance analysis of WMNs. The paper describes the architecture of the toolkit, and its core functionalities, as well as its potential evolutions.

  19. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  20. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    Full Text Available of such a toolkit facilitates and increases productivity during subsequent tool development: “develop once and use many times”. The concept of an extendible toolkit lends itself naturally to the open-source philosophy, where the toolkit user-base develops...

  1. The self-describing data sets file protocol and Toolkit

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.

    1995-01-01

    The Self-Describing Data Sets (SDDS) file protocol continues to be used extensively in commissioning the Advanced Photon Source (APS) accelerator complex. SDDS protocol has proved useful primarily due to the existence of the SDDS Toolkit, a growing set of about 60 generic commandline programs that read and/or write SDDS files. The SDDS Toolkit is also used extensively for simulation postprocessing, giving physicists a single environment for experiment and simulation. With the Toolkit, new SDDS data is displayed and subjected to complex processing without developing new programs. Data from EPICS, lab instruments, simulation, and other sources are easily integrated. Because the SDDS tools are commandline-based, data processing scripts are readily written using the user's preferred shell language. Since users work within a UNIX shell rather than an application-specific shell or GUI, they may add SDDS-compliant programs and scripts to their personal toolkits without restriction or complication. The SDDS Toolkit has been run under UNIX on SUN OS4, HP-UX, and LINUX. Application of SDDS to accelerator operation is being pursued using Tcl/Tk to provide a GUI

  2. Implementing a user-driven online quality improvement toolkit for cancer care.

    Science.gov (United States)

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  3. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    Science.gov (United States)

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. Access the

  4. The DLESE Evaluation Toolkit Project

    Science.gov (United States)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  5. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
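
    As a loose illustration of the integration the toolkit aims at, the sketch below combines a probabilistic leak-frequency model with a deterministic consequence model into a single risk figure; every function form and number is a placeholder, not part of the toolkit.

    ```python
    # Illustrative sketch only: frequency x consequence summed over leak scenarios,
    # the general QRA pattern the toolkit integrates. All models are placeholders.
    import math

    def leak_frequency_per_year(hole_diameter_mm: float) -> float:
        """Hypothetical frequency model: smaller holes leak more often (placeholder fit)."""
        return 1e-2 * math.exp(-hole_diameter_mm / 10.0)

    def harm_probability(hole_diameter_mm: float, distance_m: float) -> float:
        """Hypothetical deterministic consequence model (e.g. a jet-flame hazard range)."""
        hazard_range_m = 2.0 * math.sqrt(hole_diameter_mm)   # placeholder physics
        return 1.0 if distance_m < hazard_range_m else 0.0

    def individual_risk(distance_m: float, hole_sizes_mm=(1.0, 5.0, 10.0)) -> float:
        """Sum frequency x consequence over a set of representative leak scenarios."""
        return sum(
            leak_frequency_per_year(d) * harm_probability(d, distance_m)
            for d in hole_sizes_mm
        )

    print(f"Risk at 5 m: {individual_risk(5.0):.2e} /yr")
    ```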

  6. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    Directory of Open Access Journals (Sweden)

    Morley Chris

    2008-03-01

    Full Text Available Abstract Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
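
    A minimal usage sketch along the lines described above, assuming the OpenBabel Python bindings are installed (in OpenBabel 3.x Pybel is imported from the openbabel package); file and molecule choices are illustrative.

    ```python
    # Minimal Pybel sketch, assuming the OpenBabel Python bindings are installed.
    from openbabel import pybel

    # Parse a molecule from a SMILES string and inspect simple properties.
    ethanol = pybel.readstring("smi", "CCO")
    print(ethanol.molwt, ethanol.formula)

    # Iterate over all molecules in a file (filename is illustrative).
    # for mol in pybel.readfile("sdf", "compounds.sdf"):
    #     print(mol.title, mol.molwt)

    # Fingerprints and Tanimoto similarity via the | operator.
    propanol = pybel.readstring("smi", "CCCO")
    print(ethanol.calcfp() | propanol.calcfp())

    # Drop down to the underlying OpenBabel OBMol when needed.
    obmol = ethanol.OBMol
    print(obmol.NumAtoms())
    ```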

  7. An open source toolkit for medical imaging de-identification

    International Nuclear Information System (INIS)

    Rodriguez Gonzalez, David; Carpenter, Trevor; Wardlaw, Joanna; Hemert, Jano I. van

    2010-01-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users. (orig.)
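
    The toolkit itself is not shown here; as a hedged illustration of the same tag-level de-identification idea, the sketch below uses pydicom with a small hand-picked attribute list. A real policy covers far more attributes (see DICOM PS3.15) and may also require pixel-level removal of burned-in annotations.

    ```python
    # Illustrative sketch only (not the toolkit described above): tag-level DICOM
    # de-identification with pydicom, using a deliberately short attribute list.
    import pydicom

    def anonymise(in_path: str, out_path: str, subject_id: str = "SUBJ-0001") -> None:
        ds = pydicom.dcmread(in_path)

        # Replace direct identifiers with a study-specific pseudonym.
        ds.PatientName = subject_id
        ds.PatientID = subject_id

        # Drop a few obviously identifying attributes if present (illustrative list).
        for keyword in ("PatientBirthDate", "PatientAddress", "OtherPatientIDs"):
            if keyword in ds:
                delattr(ds, keyword)

        # Private tags frequently carry identifying vendor data.
        ds.remove_private_tags()

        ds.save_as(out_path)

    # anonymise("original.dcm", "anonymised.dcm")  # file names are illustrative
    ```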

  8. Enabling eHealth as a Pathway for Patient Engagement: a Toolkit for Medical Practice.

    Science.gov (United States)

    Graffigna, Guendalina; Barello, Serena; Triberti, Stefano; Wiederhold, Brenda K; Bosio, A Claudio; Riva, Giuseppe

    2014-01-01

    Academic and managerial interest in patient engagement is growing rapidly, and engagement is becoming a necessary tool for researchers, clinicians and policymakers worldwide to manage the increasing burden of chronic conditions. The concept of patient engagement calls for a reframing of healthcare organizations' models and approaches to care. This also requires innovations that facilitate exchanges between patients and the healthcare system. eHealth, namely the use of new communication technologies to provide healthcare, has proved to be a promising way to innovate healthcare organizations and to improve exchanges between patients and health providers. However, little attention has yet been devoted to how best to design eHealth tools in order to engage patients in their care. eHealth tools have to be designed according to the specific unmet needs and priorities of patients in the different phases of the engagement process. Building on the Patient Engagement model and on the Positive Technology paradigm, we suggest a toolkit of phase-specific technological resources, highlighting their specific potential for fostering the patient engagement process.

  9. J-TEXT-EPICS: An EPICS toolkit attempted to improve productivity

    International Nuclear Information System (INIS)

    Zheng, Wei; Zhang, Ming; Zhang, Jing; Zhuang, Ge

    2013-01-01

    Highlights: • Tokamak control applications can be developed in a very short period with J-TEXT-EPICS. • J-TEXT-EPICS enables users to build control applications with device-oriented functions. • J-TEXT-EPICS is fully compatible with the EPICS Channel Access protocol. • J-TEXT-EPICS can be easily extended by plug-ins and drivers. -- Abstract: The Joint Texas Experimental Tokamak (J-TEXT) team has developed a new software toolkit for building Experimental Physics and Industrial Control System (EPICS) control applications called J-TEXT-EPICS. It aims to improve the development efficiency of control applications. With device-oriented features, it can be used to set or obtain the configuration or status of a device as well as invoke methods on a device. With its modularized design, its functions can be easily extended. J-TEXT-EPICS is fully compatible with the original EPICS Channel Access protocol and can be integrated into existing EPICS control systems smoothly. It is fully implemented in C#, and thus benefits from the abundant resources of the .NET Framework. The J-TEXT control system is built with this toolkit. This paper presents the design and implementation of J-TEXT-EPICS as well as its application in the J-TEXT control system
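
    J-TEXT-EPICS itself is a C# library, so the sketch below is only an analogy: it shows, with the pyepics bindings, the kind of device-oriented wrapper over raw Channel Access calls that the abstract describes. The PV names and the PowerSupply class are hypothetical.

    ```python
    # Hedged illustration, not J-TEXT-EPICS itself (which is written in C#):
    # a toy device-oriented wrapper over plain EPICS Channel Access calls,
    # shown with the pyepics bindings. PV names below are hypothetical.
    from epics import caget, caput

    class PowerSupply:
        """Toy device-oriented wrapper: one object per device instead of loose PVs."""

        def __init__(self, prefix: str):
            self.prefix = prefix

        @property
        def current(self) -> float:
            return caget(f"{self.prefix}:CURRENT_RB")   # readback PV (hypothetical)

        def set_current(self, amps: float) -> None:
            caput(f"{self.prefix}:CURRENT_SP", amps)     # setpoint PV (hypothetical)

    # ps = PowerSupply("JTEXT:PS01")   # requires a running IOC to actually connect
    # ps.set_current(1.5)
    # print(ps.current)
    ```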

  10. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Science.gov (United States)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  11. TRSkit: A Simple Digital Library Toolkit

    Science.gov (United States)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is a person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  12. ImTK: an open source multi-center information management toolkit

    Science.gov (United States)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  13. Communities and Spontaneous Urban Planning: A Toolkit for Urban ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    State-led urban planning is often absent, which creates unsustainable environments and hinders the integration of migrants. Communities' prospects of ... This toolkit is expected to be a viable alternative for planning urban expansion wherever it cannot be carried out through traditional means. The toolkit will be tested in ...

  14. Design-based learning in classrooms using playful digital toolkits

    NARCIS (Netherlands)

    Scheltenaar, K.J.; van der Poel, J.E.C.; Bekker, Tilde

    2015-01-01

    The goal of this paper is to explore how to implement Design Based Learning (DBL) with digital toolkits to teach 21st century skills in (Dutch) schools. It describes the outcomes of a literature study and two design case studies in which such a DBL approach with digital toolkits was iteratively

  15. The Lean and Environment Toolkit

    Science.gov (United States)

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  16. A Cas9-based toolkit to program gene expression in Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Apel, Amanda Reider; d'Espaux, Leo; Wehrs, Maren

    2017-01-01

    of these parts via a web-based tool that automates the generation of DNA fragments for integration. Our system builds upon existing gene editing methods in the thoroughness with which the parts are standardized and characterized, the types and number of parts available and the ease with which our methodology...... can be used to perform genetic edits in yeast. We demonstrated the applicability of this toolkit by optimizing the expression of a challenging but industrially important enzyme, taxadiene synthase (TXS). This approach enabled us to diagnose an issue with TXS solubility, the resolution of which yielded

  17. Lean and Information Technology Toolkit

    Science.gov (United States)

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  18. A New GPU-Enabled MODTRAN Thermal Model for the PLUME TRACKER Volcanic Emission Analysis Toolkit

    Science.gov (United States)

    Acharya, P. K.; Berk, A.; Guiang, C.; Kennett, R.; Perkins, T.; Realmuto, V. J.

    2013-12-01

    Real-time quantification of volcanic gaseous and particulate releases is important for (1) recognizing rapid increases in SO2 gaseous emissions which may signal an impending eruption; (2) characterizing ash clouds to enable safe and efficient commercial aviation; and (3) quantifying the impact of volcanic aerosols on climate forcing. The Jet Propulsion Laboratory (JPL) has developed state-of-the-art algorithms, embedded in their analyst-driven Plume Tracker toolkit, for performing SO2, NH3, and CH4 retrievals from remotely sensed multi-spectral Thermal InfraRed spectral imagery. While Plume Tracker provides accurate results, it typically requires extensive analyst time. A major bottleneck in this processing is the relatively slow but accurate FORTRAN-based MODTRAN atmospheric and plume radiance model, developed by Spectral Sciences, Inc. (SSI). To overcome this bottleneck, SSI in collaboration with JPL, is porting these slow thermal radiance algorithms onto massively parallel, relatively inexpensive and commercially-available GPUs. This paper discusses SSI's efforts to accelerate the MODTRAN thermal emission algorithms used by Plume Tracker. Specifically, we are developing a GPU implementation of the Curtis-Godson averaging and the Voigt in-band transmittances from near line center molecular absorption, which comprise the major computational bottleneck. The transmittance calculations were decomposed into separate functions, individually implemented as GPU kernels, and tested for accuracy and performance relative to the original CPU code. Speedup factors of 14 to 30× were realized for individual processing components on an NVIDIA GeForce GTX 295 graphics card with no loss of accuracy. Due to the separate host (CPU) and device (GPU) memory spaces, a redesign of the MODTRAN architecture was required to ensure efficient data transfer between host and device, and to facilitate high parallel throughput. Currently, we are incorporating the separate GPU kernels into a

  19. Opportunities and challenges in the wider adoption of liver and interconnected microphysiological systems.

    Science.gov (United States)

    Hughes, David J; Kostrzewski, Tomasz; Sceats, Emma L

    2017-10-01

    Liver disease represents a growing global health burden. The development of in vitro liver models which allow the study of disease and the prediction of metabolism and drug-induced liver injury in humans remains a challenge. The maintenance of functional primary hepatocytes cultures, the parenchymal cell of the liver, has historically been difficult with dedifferentiation and the consequent loss of hepatic function limiting utility. The desire for longer term functional liver cultures sparked the development of numerous systems, including collagen sandwiches, spheroids, micropatterned co-cultures and liver microphysiological systems. This review will focus on liver microphysiological systems, often referred to as liver-on-a-chip, and broaden to include platforms with interconnected microphysiological systems or multi-organ-chips. The interconnection of microphysiological systems presents the opportunity to explore system level effects, investigate organ cross talk, and address questions which were previously the preserve of animal experimentation. As a field, microphysiological systems have reached a level of maturity suitable for commercialization and consequent evaluation by a wider community of users, in academia and the pharmaceutical industry. Here scientific, operational, and organizational considerations relevant to the wider adoption of microphysiological systems will be discussed. Applications in which microphysiological systems might offer unique scientific insights or enable studies currently feasible only with animal models are described, and challenges which might be addressed to enable wider adoption of the technologies are highlighted. A path forward which envisions the development of microphysiological systems in partnerships between academia, vendors and industry, is proposed. Impact statement Microphysiological systems are in vitro models of human tissues and organs. These systems have advanced rapidly in recent years and are now being

  20. Outage Risk Assessment and Management (ORAM) thermal-hydraulics toolkit

    International Nuclear Information System (INIS)

    Denny, V.E.; Wassel, A.T.; Issacci, F.; Pal Kalra, S.

    2004-01-01

    A PC-based thermal-hydraulic toolkit for use in support of outage optimization, management and risk assessment has been developed. This mechanistic toolkit incorporates simple models of key thermal-hydraulic processes which occur during an outage, such as recovery from or mitigation of outage upsets; this includes heat-up of water pools following loss of shutdown cooling, inadvertent drain down of the RCS, boiloff of coolant inventory, heatup of the uncovered core, and reflux cooling. This paper provides a list of key toolkit elements, briefly describes the technical basis and presents illustrative results for RCS transient behavior during reflux cooling, peak clad temperatures for an uncovered core and RCS response to loss of shutdown cooling. (author)
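
    Not the ORAM models themselves, but as a back-of-envelope sketch of one listed process (pool heat-up after loss of shutdown cooling), a lumped energy balance gives the time to reach saturation; all numbers are placeholders.

    ```python
    # Back-of-envelope sketch (not the ORAM models): heat-up time of a water pool
    # after loss of shutdown cooling, from a simple lumped energy balance.

    def heatup_time_hours(decay_heat_mw: float,
                          water_mass_kg: float,
                          t_initial_c: float = 40.0,
                          t_sat_c: float = 100.0,
                          cp_j_per_kg_k: float = 4186.0) -> float:
        """Time for the pool to reach saturation, ignoring losses and stratification."""
        energy_j = water_mass_kg * cp_j_per_kg_k * (t_sat_c - t_initial_c)
        return energy_j / (decay_heat_mw * 1e6) / 3600.0

    # e.g. 5 MW of decay heat into a 400 t water inventory (placeholder values)
    print(f"Time to saturation: {heatup_time_hours(5.0, 4.0e5):.1f} h")
    ```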

  1. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    Directory of Open Access Journals (Sweden)

    Hutchison Geoffrey R

    2008-12-01

    Full Text Available Abstract Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit.
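
    A minimal sketch of the common-interface idea, assuming Cinfony and the underlying toolkits are installed; the module names follow the paper (pybel for OpenBabel, rdk for the RDKit, cdk for the CDK) and may differ in later releases.

    ```python
    # Minimal Cinfony sketch, assuming the wrapped toolkits are installed.
    # Module names follow the Cinfony paper and may vary between releases.
    from cinfony import pybel, rdk, cdk

    smiles = "c1ccccc1O"  # phenol

    # The same call pattern works against each wrapped toolkit.
    mol_ob = pybel.readstring("smi", smiles)
    mol_rd = rdk.readstring("smi", smiles)
    mol_cdk = cdk.readstring("smi", smiles)

    print(mol_ob.molwt, mol_rd.molwt, mol_cdk.molwt)

    # Results from one toolkit can be combined with another's strengths,
    # e.g. RDKit descriptors alongside an OpenBabel canonical SMILES.
    print(mol_rd.calcdesc())
    print(mol_ob.write("can"))
    ```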

  2. chemf: A purely functional chemistry toolkit.

    Science.gov (United States)

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effects free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare it to existing toolkits both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with a strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type-safety achieved by Scala highly increased the reliability of our code as well as the productivity of

  3. The Revolution Continues: Newly Discovered Systems Expand the CRISPR-Cas Toolkit.

    Science.gov (United States)

    Murugan, Karthik; Babu, Kesavan; Sundaresan, Ramya; Rajan, Rakhi; Sashital, Dipali G

    2017-10-05

    CRISPR-Cas systems defend prokaryotes against bacteriophages and mobile genetic elements and serve as the basis for revolutionary tools for genetic engineering. Class 2 CRISPR-Cas systems use single Cas endonucleases paired with guide RNAs to cleave complementary nucleic acid targets, enabling programmable sequence-specific targeting with minimal machinery. Recent discoveries of previously unidentified CRISPR-Cas systems have uncovered a deep reservoir of potential biotechnological tools beyond the well-characterized Type II Cas9 systems. Here we review the current mechanistic understanding of newly discovered single-protein Cas endonucleases. Comparison of these Cas effectors reveals substantial mechanistic diversity, underscoring the phylogenetic divergence of related CRISPR-Cas systems. This diversity has enabled further expansion of CRISPR-Cas biotechnological toolkits, with wide-ranging applications from genome editing to diagnostic tools based on various Cas endonuclease activities. These advances highlight the exciting prospects for future tools based on the continually expanding set of CRISPR-Cas systems. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Provider perceptions of an integrated primary care quality improvement strategy: The PPAQ toolkit.

    Science.gov (United States)

    Beehler, Gregory P; Lilienthal, Kaitlin R

    2017-02-01

    The Primary Care Behavioral Health (PCBH) model of integrated primary care is challenging to implement with high fidelity. The Primary Care Behavioral Health Provider Adherence Questionnaire (PPAQ) was designed to assess provider adherence to essential model components and has recently been adapted into a quality improvement toolkit. The aim of this pilot project was to gather preliminary feedback on providers' perceptions of the acceptability and utility of the PPAQ toolkit for making beneficial practice changes. Twelve mental health providers working in Department of Veterans Affairs integrated primary care clinics participated in semistructured interviews to gather quantitative and qualitative data. Descriptive statistics and qualitative content analysis were used to analyze data. Providers identified several positive features of the PPAQ toolkit organization and structure that resulted in high ratings of acceptability, while also identifying several toolkit components in need of modification to improve usability. Toolkit content was considered highly representative of the PCBH model and therefore could be used as a diagnostic self-assessment of model adherence. The toolkit was considered to be high in applicability to providers regardless of their degree of prior professional preparation or current clinical setting. Additionally, providers identified several system-level contextual factors that could impact the usefulness of the toolkit. These findings suggest that frontline mental health providers working in PCBH settings may be receptive to using an adherence-focused toolkit for ongoing quality improvement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  6. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  7. Validation of Power Output for the WIND Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  8. ARC Code TI: Crisis Mapping Toolkit

    Data.gov (United States)

    National Aeronautics and Space Administration — The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve...

  9. Barriers and enablers in the implementation of a provider-based intervention to stimulate culturally appropriate hypertension education

    NARCIS (Netherlands)

    Beune, Erik J. A. J.; Haafkens, Joke A.; Bindels, Patrick J. E.

    2011-01-01

    Objective: To identify barriers and enablers influencing the implementation of an intervention to stimulate culturally appropriate hypertension education (CANE) among health care providers in primary care. Methods: The intervention was piloted in three Dutch health centers. It consists of a toolkit

  10. Sealed radioactive sources toolkit

    International Nuclear Information System (INIS)

    Mac Kenzie, C.

    2005-09-01

    The IAEA has developed a Sealed Radioactive Sources Toolkit to provide information to key groups about the safety and security of sealed radioactive sources. The key groups addressed are officials in government agencies, medical users, industrial users and the scrap metal industry. The general public may also benefit from an understanding of the fundamentals of radiation safety

  11. The ECVET toolkit customization for the nuclear energy sector

    Energy Technology Data Exchange (ETDEWEB)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von [European Commission, Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport

    2015-04-15

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed in the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. In order to observe the road map for the ECVET implementation, the toolkit customization for nuclear energy sector is required. This article describes the outcomes of the toolkit customization, based on ECVET approach, for nuclear qualifications design. The process of the toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  12. The ECVET toolkit customization for the nuclear energy sector

    International Nuclear Information System (INIS)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von

    2015-01-01

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed in the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. In order to observe the road map for the ECVET implementation, the toolkit customization for nuclear energy sector is required. This article describes the outcomes of the toolkit customization, based on ECVET approach, for nuclear qualifications design. The process of the toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  13. Multimethod evaluation of the VA's peer-to-peer Toolkit for patient-centered medical home implementation.

    Science.gov (United States)

    Luck, Jeff; Bowman, Candice; York, Laura; Midboe, Amanda; Taylor, Thomas; Gale, Randall; Asch, Steven

    2014-07-01

    Effective implementation of the patient-centered medical home (PCMH) in primary care practices requires training and other resources, such as online toolkits, to share strategies and materials. The Veterans Health Administration (VA) developed an online Toolkit of user-sourced tools to support teams implementing its Patient Aligned Care Team (PACT) medical home model. To present findings from an evaluation of the PACT Toolkit, including use, variation across facilities, effect of social marketing, and factors influencing use. The Toolkit is an online repository of ready-to-use tools created by VA clinic staff that physicians, nurses, and other team members may share, download, and adopt in order to more effectively implement PCMH principles and improve local performance on VA metrics. Multimethod evaluation using: (1) website usage analytics, (2) an online survey of the PACT community of practice's use of the Toolkit, and (3) key informant interviews. Survey respondents were PACT team members and coaches (n = 544) at 136 VA facilities. Interview respondents were Toolkit users and non-users (n = 32). For survey data, multivariable logistic models were used to predict Toolkit awareness and use. Interviews and open-text survey comments were coded using a "common themes" framework. The Consolidated Framework for Implementation Research (CFIR) guided data collection and analyses. The Toolkit was used by 6,745 staff in the first 19 months of availability. Among members of the target audience, 80 % had heard of the Toolkit, and of those, 70 % had visited the website. Tools had been implemented at 65 % of facilities. Qualitative findings revealed a range of user perspectives from enthusiastic support to lack of sufficient time to browse the Toolkit. An online Toolkit to support PCMH implementation was used at VA facilities nationwide. Other complex health care organizations may benefit from adopting similar online peer-to-peer resource libraries.

  14. Antenna toolkit

    CERN Document Server

    Carr, Joseph

    2006-01-01

    Joe Carr has provided radio amateurs and short-wave listeners with the definitive design guide for sending and receiving radio signals with Antenna Toolkit, 2nd edition. Together with the powerful suite of CD software, the reader will have a complete solution for constructing or using an antenna - bar the actual hardware! The software provides a simple Windows-based aid to carrying out the design calculations at the heart of successful antenna design. All the user needs to do is select the antenna type and set the frequency - a much more fun and less error-prone method than using a con

  15. Terrain-Toolkit

    DEFF Research Database (Denmark)

    Wang, Qi; Kaul, Manohar; Long, Cheng

    2014-01-01

    Terrain data is becoming increasingly popular both in industry and in academia. Many tools have been developed for visualizing terrain data. However, we find that (1) they usually accept very few data formats of terrain data only; (2) they do not support terrain simplification well which, as will be shown, is used heavily for query processing in spatial databases; and (3) they do not provide the surface distance operator which is fundamental for many applications based on terrain data. Motivated by this, we developed a tool called Terrain-Toolkit for terrain data which accepts a comprehensive set...
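
    As a rough illustration of what a surface distance is (not the Terrain-Toolkit algorithm, which operates on triangulated and simplified terrain), the sketch below accumulates 3-D step lengths along a path over a gridded height field; the grid and path are made up for the example.

    ```python
    # Rough illustration of a surface distance on a gridded height field:
    # accumulate 3-D step lengths (horizontal spacing plus elevation change).
    import math

    def surface_distance(heights, path, cell_size=1.0):
        """heights: 2-D list of elevations; path: list of (row, col) grid indices."""
        total = 0.0
        for (r0, c0), (r1, c1) in zip(path, path[1:]):
            horizontal = math.hypot((r1 - r0) * cell_size, (c1 - c0) * cell_size)
            vertical = heights[r1][c1] - heights[r0][c0]
            total += math.hypot(horizontal, vertical)
        return total

    heights = [
        [0.0, 1.0, 2.0],
        [0.5, 1.5, 3.0],
        [1.0, 2.0, 4.0],
    ]
    print(surface_distance(heights, [(0, 0), (1, 1), (2, 2)], cell_size=30.0))
    ```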

  16. The development of an artificial organic networks toolkit for LabVIEW.

    Science.gov (United States)

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time-consuming encoding work and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent usage of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  17. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  18. NOAA Weather and Climate Toolkit (WCT)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Weather and Climate Toolkit is an application that provides simple visualization and data export of weather and climatological data archived at NCDC. The...

  19. A User Interface Toolkit for a Small Screen Device.

    OpenAIRE

    UOTILA, ALEKSI

    2000-01-01

    The appearance of different kinds of networked mobile devices and network appliances creates special requirements for user interfaces that are not met by existing widget based user interface creation toolkits. This thesis studies the problem domain of user interface creation toolkits for portable network connected devices. The portable nature of these devices places great restrictions on the user interface capabilities. One main characteristic of the devices is that they have small screens co...

  20. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    Science.gov (United States)

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health ® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Field tests of a participatory ergonomics toolkit for Total Worker Health

    Science.gov (United States)

    Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2018-01-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. PMID:28166897

  2. Newnes electronics toolkit

    CERN Document Server

    Phillips, Geoff

    2013-01-01

    Newnes Electronics Toolkit brings together fundamental facts, concepts, and applications of electronic components and circuits, and presents them in a clear, concise, and unambiguous format, to provide a reference book for engineers. The book contains 10 chapters that discuss the following concepts: resistors, capacitors, inductors, semiconductors, circuit concepts, electromagnetic compatibility, sound, light, heat, and connections. The engineer's job does not end when the circuit diagram is completed; the design for the manufacturing process is just as important if volume production is to be

  3. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    Science.gov (United States)

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  4. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    Science.gov (United States)

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. This study describes characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use, and assesses whether such resources were beneficial for users. The analysis drew on deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  5. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, B. M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCaa, J. [3TIER by VAisala, Seattle, WA (United States)

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  6. Practical computational toolkits for dendrimers and dendrons structure design

    Science.gov (United States)

    Martinho, Nuno; Silva, Liana C.; Florindo, Helena F.; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty in utilising in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease of use in prototyping dendrimer structures, and the second toolkit was especially relevant for dendrimers of high complexity and size.
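
    As an illustration of the SMARTS-reaction approach described in this record, the short RDKit sketch below couples a hypothetical monomer to a hypothetical core with a simplified amide-formation reaction; the SMILES strings, the reaction template, and the single growth step are placeholders rather than the toolkit's own building blocks.

    ```python
    from rdkit import Chem
    from rdkit.Chem import AllChem

    # Simplified amide-coupling reaction written as reaction SMARTS; placeholder
    # chemistry to illustrate the approach, not the toolkit's actual templates.
    rxn = AllChem.ReactionFromSmarts(
        "[C:1](=[O:2])[OX2H1].[N!H0:3]>>[C:1](=[O:2])[N:3]"
    )

    core = Chem.MolFromSmiles("NCCN")           # hypothetical diamine core
    monomer = Chem.MolFromSmiles("OC(=O)CCCN")  # hypothetical acid/amine monomer

    # One growth step: couple the monomer's carboxylic acid to an amine on the core.
    products = rxn.RunReactants((monomer, core))
    grown = products[0][0]
    Chem.SanitizeMol(grown)
    print(Chem.MolToSmiles(grown))              # SMILES of the coupled product
    ```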

  7. The Populist Toolkit

    OpenAIRE

    Ylä-Anttila, Tuukka Salu Santeri

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  8. Advanced processing and simulation of MRS data using the FID appliance (FID-A)-An open source, MATLAB-based toolkit.

    Science.gov (United States)

    Simpson, Robin; Devenyi, Gabriel A; Jezzard, Peter; Hennessy, T Jay; Near, Jamie

    2017-01-01

    To introduce a new toolkit for simulation and processing of magnetic resonance spectroscopy (MRS) data, and to demonstrate some of its novel features. The FID appliance (FID-A) is an open-source, MATLAB-based software toolkit for simulation and processing of MRS data. The software is designed specifically for processing data with multiple dimensions (eg, multiple radiofrequency channels, averages, spectral editing dimensions). It is equipped with functions for importing data in the formats of most major MRI vendors (eg, Siemens, Philips, GE, Agilent) and for exporting data into the formats of several common processing software packages (eg, LCModel, jMRUI, Tarquin). This paper introduces the FID-A software toolkit and uses examples to demonstrate its novel features, namely 1) the use of a spectral registration algorithm to carry out useful processing routines automatically, 2) automatic detection and removal of motion-corrupted scans, and 3) the ability to perform several major aspects of the MRS computational workflow from a single piece of software. This latter feature is illustrated through both high-level processing of in vivo GABA-edited MEGA-PRESS MRS data, as well as detailed quantum mechanical simulations to generate an accurate LCModel basis set for analysis of the same data. All of the described processing steps resulted in a marked improvement in spectral quality compared with unprocessed data. Fitting of MEGA-PRESS data using a customized basis set resulted in improved fitting accuracy compared with a generic MEGA-PRESS basis set. The FID-A software toolkit enables high-level processing of MRS data and accurate simulation of in vivo MRS experiments. Magn Reson Med 77:23-33, 2017. © 2015 Wiley Periodicals, Inc.
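
    The spectral registration step mentioned in this record can be illustrated with a generic time-domain alignment sketch in NumPy/SciPy: each average is fitted with a frequency and phase shift that best matches a reference FID. FID-A itself is MATLAB code, and the decay constant, offsets, and sampling parameters below are invented for the example.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Generic time-domain spectral registration sketch (not FID-A's MATLAB code):
    # fit a frequency and phase shift that aligns one FID to a reference FID.
    def residual(params, fid, ref, t):
        f, phi = params
        shifted = fid * np.exp(1j * (2 * np.pi * f * t + phi))
        diff = shifted - ref
        return np.concatenate([diff.real, diff.imag])

    def register(fid, ref, t):
        fit = least_squares(residual, x0=[0.0, 0.0], args=(fid, ref, t))
        f, phi = fit.x
        return fid * np.exp(1j * (2 * np.pi * f * t + phi)), f, phi

    # Toy data: a reference FID and a copy with a 3 Hz / 0.2 rad offset.
    t = np.arange(2048) / 2000.0                       # 2 kHz spectral width
    ref = np.exp(-t * 20) * np.exp(1j * 2 * np.pi * 150 * t)
    off = ref * np.exp(-1j * (2 * np.pi * 3.0 * t + 0.2))
    aligned, f_est, phi_est = register(off, ref, t)
    print(f_est, phi_est)                              # recovers ~3 Hz and ~0.2 rad
    ```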

  9. A toolkit for promoting healthy ageing

    NARCIS (Netherlands)

    Jeroen Knevel; Aly Gruppen

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience,

  10. Fragment Impact Toolkit (FIT)

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel Wolf [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Key, Brian P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garcia, Daniel B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.

  11. Web-based Toolkit for Dynamic Generation of Data Processors

    Science.gov (United States)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, can select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data
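
    The mapping-driven conversion idea described above can be sketched in a few lines of Python; the field names and the temperature transform are hypothetical stand-ins for user-defined mappings, not part of the proposed toolkit.

    ```python
    import csv
    import io

    # Hypothetical mapping from input fields to output fields with optional
    # transforms; a sketch of the mapping-driven conversion idea only.
    mapping = {
        "date":   ("obs_date", str),
        "temp_f": ("temp_c", lambda f: round((float(f) - 32) * 5 / 9, 2)),
        "site":   ("station_id", str),
    }

    def convert(rows):
        # Apply each (output name, transform) pair to the corresponding input field.
        for row in rows:
            yield {out: fn(row[src]) for src, (out, fn) in mapping.items()}

    raw = "date,temp_f,site\n2011-06-01,95.0,NV001\n"
    reader = csv.DictReader(io.StringIO(raw))
    for converted in convert(reader):
        print(converted)  # {'obs_date': '2011-06-01', 'temp_c': 35.0, 'station_id': 'NV001'}
    ```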

  12. User's manual for the two-dimensional transputer graphics toolkit

    Science.gov (United States)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for the simulation visualizations. It supports multiple windows, double buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  13. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.
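
    A minimal NumPy sketch of a Self-Organizing Map, the technique named above, is shown below; it is a generic implementation for illustration, not the ThreatView extension or the LDRD toolkit itself, and the grid size, learning-rate schedule, and toy data are arbitrary choices.

    ```python
    import numpy as np

    # Minimal Self-Organizing Map: a generic sketch of the technique named above.
    rng = np.random.default_rng(0)

    def train_som(data, grid=(8, 8), epochs=500, lr0=0.5, sigma0=3.0):
        h, w = grid
        weights = rng.random((h, w, data.shape[1]))
        gy, gx = np.mgrid[0:h, 0:w]            # grid coordinates for the neighborhood
        for step in range(epochs):
            x = data[rng.integers(len(data))]
            dists = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(dists), dists.shape)  # best-matching unit
            frac = step / epochs
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 0.5
            # Gaussian neighborhood around the BMU pulls nearby units toward x.
            nb = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * nb[..., None] * (x - weights)
        return weights

    # Toy "document vectors" (e.g., TF-IDF rows would go here in a text application).
    docs = rng.random((200, 20))
    som = train_som(docs)
    print(som.shape)  # (8, 8, 20)
    ```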

  14. Web-Altairis: An Internet-Enabled Ground System

    Science.gov (United States)

    Miller, Phil; Coleman, Jason; Gemoets, Darren; Hughes, Kevin

    2000-01-01

    This paper describes Web-Altairis, an Internet-enabled ground system software package funded by the Advanced Automation and Architectures Branch (Code 588) of NASA's Goddard Space Flight Center. Web-Altairis supports the trend towards "lights out" ground systems, where the control center is unattended and problems are resolved by remote operators. This client/server software runs on most popular platforms and provides for remote data visualization using the rich functionality of the VisAGE toolkit. Web-Altairis also supports satellite commanding over the Internet. This paper describes the structure of Web-Altairis and VisAGE, the underlying technologies, the provisions for security, and our experiences in developing and testing the software.

  15. Commercial Building Energy Saver: An energy retrofit analysis toolkit

    International Nuclear Information System (INIS)

    Hong, Tianzhen; Piette, Mary Ann; Chen, Yixing; Lee, Sang Hoon; Taylor-Lange, Sarah C.; Zhang, Rongpeng; Sun, Kaiyu; Price, Phillip

    2015-01-01

    Highlights: • Commercial Building Energy Saver is a powerful toolkit for energy retrofit analysis. • CBES provides benchmarking, load shape analysis, and model-based retrofit assessment. • CBES covers 7 building types, 6 vintages, 16 climates, and 100 energy measures. • CBES includes a web app, API, and a database of energy efficiency performance. • CBES API can be extended and integrated with third party energy software tools. - Abstract: Small commercial buildings in the United States consume 47% of the total primary energy of the buildings sector. Retrofitting small and medium commercial buildings poses a huge challenge for owners because they usually lack the expertise and resources to identify and evaluate cost-effective energy retrofit strategies. This paper presents the Commercial Building Energy Saver (CBES), an energy retrofit analysis toolkit, which calculates the energy use of a building, identifies and evaluates retrofit measures in terms of energy savings, energy cost savings and payback. The CBES Toolkit includes a web app (APP) for end users and the CBES Application Programming Interface (API) for integrating CBES with other energy software tools. The toolkit provides a rich set of features including: (1) Energy Benchmarking providing an Energy Star score, (2) Load Shape Analysis to identify potential building operation improvements, (3) Preliminary Retrofit Analysis which uses a custom developed pre-simulated database and, (4) Detailed Retrofit Analysis which utilizes real-time EnergyPlus simulations. CBES includes 100 configurable energy conservation measures (ECMs) that encompass IAQ, technical performance and cost data, for assessing 7 different prototype buildings in 16 climate zones in California and 6 vintages. A case study of a small office building demonstrates the use of the toolkit for retrofit analysis. The development of CBES provides a new contribution to the field by providing a straightforward and uncomplicated decision

  16. Texas Team: Academic Progression and IOM Toolkit.

    Science.gov (United States)

    Reid, Helen; Tart, Kathryn; Tietze, Mari; Joseph, Nitha Mathew; Easley, Carson

    The Institute of Medicine (IOM) Future of Nursing report, identified eight recommendations for nursing to improve health care for all Americans. The Texas Team for Advancing Health Through Nursing embraced the challenge of implementing the recommendations through two diverse projects. One group conducted a broad, online survey of leadership, practice, and academia, focusing on the IOM recommendations. The other focused specifically on academic progression through the use of CABNET (Consortium for Advancing Baccalaureate Nursing Education in Texas) articulation agreements. The survey revealed a lack of knowledge and understanding of the IOM recommendations, prompting development of an online IOM toolkit. The articulation agreements provide a clear pathway for students to the RN-to-BSN degree students. The toolkit and articulation agreements provide rich resources for implementation of the IOM recommendations.

  17. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing software packages require specific formatted input files and generate output files in various types, yielding practical inconvenience. We developed a tool set, Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference software: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT provides convenience to run multiple local ancestry inference software. In addition, we evaluated the performance of local ancestry software among different supported software packages, mainly focusing on inference accuracy and computational resources used. We provided a toolkit to facilitate the use of local ancestry inference software, especially for users with limited bioinformatics background.
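
    The standardize-then-convert idea behind LAIT can be sketched as a small dispatch table of format writers; the field names and output layouts below are placeholders and do not reflect the actual input formats expected by LAMP, ELAI, or the other supported tools.

    ```python
    # Hypothetical sketch of "one standardized input, several tool-specific writers";
    # formats and fields are placeholders, not LAIT's converters.
    def write_lamp(samples, path):
        with open(path, "w") as out:
            for s in samples:
                out.write(" ".join(s["genotypes"]) + "\n")

    def write_elai(samples, path):
        with open(path, "w") as out:
            out.write(f"{len(samples)}\n")
            for s in samples:
                out.write(s["sample_id"] + "," + ",".join(s["genotypes"]) + "\n")

    WRITERS = {"LAMP": write_lamp, "ELAI": write_elai}

    def convert(samples, tool, path):
        if tool not in WRITERS:
            raise ValueError(f"unsupported tool: {tool}")
        WRITERS[tool](samples, path)

    samples = [{"sample_id": "S1", "genotypes": ["0", "1", "2"]},
               {"sample_id": "S2", "genotypes": ["2", "1", "0"]}]
    convert(samples, "LAMP", "lamp_input.txt")
    ```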

  18. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Directory of Open Access Journals (Sweden)

    Andrea Bellucci

    2017-02-01

    Full Text Available Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  19. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  20. An interactive RADIANCE toolkit for customizable CT dose monitoring and reporting.

    Science.gov (United States)

    Cook, Tessa S; Sundaram, Anand; Boonn, William W; Kim, Woojin

    2013-08-01

    The need for tools to monitor imaging-related radiation has grown dramatically in recent years. RADIANCE, a freely available open-source dose-monitoring tool, was developed in response to the need for an informatics solution in this realm. A number of open-source as well as commercial solutions have since been developed to enable radiology practices to monitor radiation dose parameters for modalities ranging from computed tomography to radiography to fluoroscopy. However, it is not sufficient to simply collect this data; it is equally important to be able to review it in the appropriate context. Most of the currently available dose-monitoring solutions have some type of reporting capability, such as a real-time dashboard or a static report. Previous versions of RADIANCE have included a real-time dashboard with pre-set screens that plot effective dose estimates according to different criteria, as well as monthly scorecards to summarize dose estimates for individuals within a radiology practice. In this work, we present the RADIANCE toolkit, a customizable reporting solution that allows users to generate reports of interest to them, summarizing a variety of metrics that can be grouped according to useful parameters. The output of the toolkit can be used for real-time dose monitoring or scheduled reporting, such as to a quality assurance committee. Making dose parameter data more accessible and more meaningful to the user promotes dose reduction efforts such as regular protocol review and optimization, and ultimately improves patient care by decreasing unnecessary radiation exposure.
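
    The kind of grouped summary such a reporting toolkit produces can be sketched with pandas; the column names, grouping criteria, and dose values below are hypothetical and unrelated to RADIANCE's schema or API.

    ```python
    import pandas as pd

    # Hypothetical dose records; columns and values are placeholders, not
    # RADIANCE's database schema.
    exams = pd.DataFrame({
        "scanner":            ["CT1", "CT1", "CT2", "CT2"],
        "protocol":           ["head", "chest", "head", "abdomen"],
        "effective_dose_msv": [1.9, 6.8, 2.1, 8.4],
    })

    # Summarize effective dose estimates per scanner and protocol, the kind of
    # grouping a scheduled QA report might use.
    report = (exams.groupby(["scanner", "protocol"])["effective_dose_msv"]
                   .agg(["count", "mean", "max"]))
    print(report)
    ```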

  1. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  2. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  3. Development of a Multimedia Toolkit for Engineering Graphics Education

    Directory of Open Access Journals (Sweden)

    Moudar Zgoul

    2009-09-01

    Full Text Available This paper focuses upon the development of a multimedia toolkit to support the teaching of an Engineering Graphics course. The project used different elements for the toolkit: animations, videos, and presentations, which were then integrated in a dedicated internet website. The purpose of using these elements is to assist the students in building and practicing the needed engineering skills at their own pace as part of an e-Learning solution. Furthermore, this kit allows students to repeat and view the processes and techniques of graphical construction and visualization as much as needed, allowing them to follow and practice on their own.

  4. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  6. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  7. VIDE: The Void IDentification and Examination toolkit

    Science.gov (United States)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at http://bitbucket.org/cosmicvoids/vide_public and http://www.cosmicvoids.net.

  8. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    Science.gov (United States)

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  9. BAT - The Bayesian Analysis Toolkit

    CERN Document Server

    Caldwell, Allen C; Kröninger, Kevin

    2009-01-01

    We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner. A goodness-of-fit criterion is presented which is intuitive and of great practical use.
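
    A generic random-walk Metropolis sampler illustrates the Markov Chain Monte Carlo access to the posterior that the record describes; the sketch below is didactic Python, not the BAT (C++/ROOT) interface, and the Gaussian model with a flat prior is an assumed toy example.

    ```python
    import numpy as np

    # Generic random-walk Metropolis sampler: a didactic sketch of MCMC-based
    # posterior sampling, not BAT's API.
    rng = np.random.default_rng(1)

    def log_posterior(mu, data, sigma=1.0):
        # Flat prior on mu, Gaussian likelihood with known sigma.
        return -0.5 * np.sum((data - mu) ** 2) / sigma ** 2

    data = rng.normal(2.0, 1.0, size=50)
    chain, mu, step = [], 0.0, 0.5
    for _ in range(20000):
        prop = mu + rng.normal(0.0, step)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < log_posterior(prop, data) - log_posterior(mu, data):
            mu = prop
        chain.append(mu)

    samples = np.array(chain[5000:])                          # drop burn-in
    print(samples.mean(), np.percentile(samples, [16, 84]))   # estimate and interval
    ```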

  10. IChem: A Versatile Toolkit for Detecting, Comparing, and Predicting Protein-Ligand Interactions.

    Science.gov (United States)

    Da Silva, Franck; Desaphy, Jeremy; Rognan, Didier

    2018-03-20

    Structure-based ligand design requires an exact description of the topology of molecular entities under scrutiny. IChem is a software package that reflects the many contributions of our research group in this area over the last decade. It facilitates and automates many tasks (e.g., ligand/cofactor atom typing, identification of key water molecules) usually left to the modeler's choice. It therefore permits the detection of molecular interactions between two molecules in a very precise and flexible manner. Moreover, IChem enables the conversion of intricate three-dimensional (3D) molecular objects into simple representations (fingerprints, graphs) that facilitate knowledge acquisition at very high throughput. The toolkit is an ideal companion for setting up and performing many structure-based design computations. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  11. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.
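
    A minimal source-mapper-actor-renderer pipeline shows typical VTK usage from Python; this sketch is independent of the backend rewrite discussed above and uses long-standing toolkit classes (the resolution, colors, and window size are arbitrary).

    ```python
    # Minimal VTK rendering pipeline: source -> mapper -> actor -> renderer -> window.
    import vtk

    sphere = vtk.vtkSphereSource()
    sphere.SetThetaResolution(32)
    sphere.SetPhiResolution(32)

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(sphere.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)
    renderer.SetBackground(0.1, 0.1, 0.2)

    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)
    window.SetSize(400, 400)

    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)
    window.Render()
    interactor.Start()
    ```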

  12. Heart Failure: Self-care to Success: Development and evaluation of a program toolkit.

    Science.gov (United States)

    Bryant, Rebecca

    2017-08-17

    The Heart Failure: Self-care to Success toolkit was developed to assist NPs in empowering patients with heart failure (HF) to improve individual self-care behaviors. This article details the evolution of this toolkit for NPs, its effectiveness with patients with HF, and recommendations for future research and dissemination strategies.

  13. Marine Debris and Plastic Source Reduction Toolkit

    Science.gov (United States)

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  14. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  15. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density on more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and at a 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, as then corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to provide users with tools to validate data of their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compares to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
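
    The error metrics listed above are straightforward to reproduce; the NumPy sketch below writes them out for reference. The released validation code itself is in R, the percent-error definition shown is one common convention, and the toy wind speeds are invented.

    ```python
    import numpy as np

    # The statistics named above, written out in NumPy for reference; the actual
    # WIND Toolkit validation code is distributed as R scripts.
    def validation_metrics(model, obs):
        err = model - obs
        bias = err.mean()
        rmse = np.sqrt((err ** 2).mean())
        crmse = np.sqrt(((err - bias) ** 2).mean())   # centered RMSE
        mae = np.abs(err).mean()
        pct_err = 100.0 * bias / obs.mean()           # one common percent-error convention
        return {"bias": bias, "rmse": rmse, "crmse": crmse,
                "mae": mae, "percent_error": pct_err}

    obs = np.array([6.2, 7.8, 5.1, 9.4])      # observed wind speeds (m/s), invented
    model = np.array([6.6, 7.1, 5.5, 10.0])   # modeled wind speeds (m/s), invented
    print(validation_metrics(model, obs))
    ```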

  16. Innovations and Challenges of Implementing a Glucose Gel Toolkit for Neonatal Hypoglycemia.

    Science.gov (United States)

    Hammer, Denise; Pohl, Carla; Jacobs, Peggy J; Kaufman, Susan; Drury, Brenda

    2018-05-24

    Transient neonatal hypoglycemia occurs most commonly in newborns who are small for gestational age, large for gestational age, infants of diabetic mothers, and late preterm infants. An exact blood glucose value has not been determined for neonatal hypoglycemia, and it is important to note that poor neurologic outcomes can occur if hypoglycemia is left untreated. Interventions that separate mothers and newborns, as well as use of formula to treat hypoglycemia, have the potential to disrupt exclusive breastfeeding. To determine whether implementation of a toolkit designed to support staff in the adaptation of the practice change for management of newborns at risk for hypoglycemia, that includes 40% glucose gel in an obstetric unit with a level 2 nursery will decrease admissions to the Intermediate Care Nursery, and increase exclusive breastfeeding. This descriptive study used a retrospective chart review for pre/postimplementation of the Management of Newborns at Risk for Hypoglycemia Toolkit (Toolkit) using a convenience sample of at-risk newborns in the first 2 days of life to evaluate the proposed outcomes. Following implementation of the Toolkit, at-risk newborns had a clinically but not statistically significant 6.5% increase in exclusive breastfeeding and a clinically but not statistically significant 5% decrease in admissions to the Intermediate Care Nursery. The Toolkit was designed for ease of staff use and to improve outcomes for the at-risk newborn. Future research includes replication at other level 2 and level 1 obstetric centers and investigation into the number of 40% glucose gel doses that can safely be administered.

  17. Integrated System Health Management Development Toolkit

    Science.gov (United States)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  18. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    Science.gov (United States)

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
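
    A minimal pypet run, following the toolkit's published quickstart example, looks roughly like the sketch below; the trajectory name, file name, and explored grid are arbitrary, and method names should be checked against the installed pypet version.

    ```python
    from pypet import Environment, cartesian_product

    # Minimal pypet sketch based on the toolkit's quickstart example; names of the
    # trajectory, HDF5 file, and parameters are arbitrary choices for illustration.
    def multiply(traj):
        traj.f_add_result('z', z=traj.x * traj.y)

    env = Environment(trajectory='demo', filename='./demo.hdf5')
    traj = env.trajectory
    traj.f_add_parameter('x', 1.0)
    traj.f_add_parameter('y', 1.0)
    # Explore a small grid of the two parameters; parameters and results are
    # stored together in the single HDF5 file named above.
    traj.f_explore(cartesian_product({'x': [1.0, 2.0, 3.0], 'y': [4.0, 5.0]}))
    env.run(multiply)
    ```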

  19. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including to GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta and event- data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon-propagation; particle identification; hit finding, track finding and fitting; electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  20. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, 1/0 control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  1. Monitoring the grid with the Globus Toolkit MDS4

    International Nuclear Information System (INIS)

    Schopf, Jennifer M; Pearlman, Laura; Miller, Neill; Kesselman, Carl; Foster, Ian; D'Arcy, Mike; Chervenak, Ann

    2006-01-01

    The Globus Toolkit Monitoring and Discovery System (MDS4) defines and implements mechanisms for service and resource discovery and monitoring in distributed environments. MDS4 is distinguished from previous similar systems by its extensive use of interfaces and behaviors defined in the WS-Resource Framework and WS-Notification specifications, and by its deep integration into essentially every component of the Globus Toolkit. We describe the MDS4 architecture and the Web service interfaces and behaviors that allow users to discover resources and services, monitor resource and service states, receive updates on current status, and visualize monitoring results. We present two current deployments to provide insights into the functionality that can be achieved via the use of these mechanisms

  2. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for VO developers. VO architecture greatly depends on Grid and Web services, consequently the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and discuss possible solutions. We introduce two efforts in the field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert a Grid service into a VO service.

  3. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-08-14

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  4. XPIWIT--an XML pipeline wrapper for the Insight Toolkit.

    Science.gov (United States)

    Bartschat, Andreas; Hübner, Eduard; Reischl, Markus; Mikut, Ralf; Stegmaier, Johannes

    2016-01-15

    The Insight Toolkit offers plenty of features for multidimensional image analysis. Current implementations, however, often suffer either from a lack of flexibility due to hard-coded C++ pipelines for a certain task or by slow execution times, e.g. caused by inefficient implementations or multiple read/write operations for separate filter execution. We present an XML-based wrapper application for the Insight Toolkit that combines the performance of a pure C++ implementation with an easy-to-use graphical setup of dynamic image analysis pipelines. Created XML pipelines can be interpreted and executed by XPIWIT in console mode either locally or on large clusters. We successfully applied the software tool for the automated analysis of terabyte-scale, time-resolved 3D image data of zebrafish embryos. XPIWIT is implemented in C++ using the Insight Toolkit and the Qt SDK. It has been successfully compiled and tested under Windows and Unix-based systems. Software and documentation are distributed under Apache 2.0 license and are publicly available for download at https://bitbucket.org/jstegmaier/xpiwit/downloads/. johannes.stegmaier@kit.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
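
    XPIWIT pipelines wire together Insight Toolkit filters; the same kind of filter chain is sketched below with SimpleITK purely to illustrate the underlying ITK operations. This is not XPIWIT's XML format, and the file names and filter parameters are placeholders.

    ```python
    import SimpleITK as sitk

    # Illustration of the kind of ITK filter chain an XPIWIT pipeline wires together
    # (Gaussian smoothing followed by Otsu thresholding). File names are placeholders.
    image = sitk.ReadImage("embryo_t001.tif")
    smoothed = sitk.DiscreteGaussian(image, 2.0)   # Gaussian smoothing, variance 2.0
    mask = sitk.OtsuThreshold(smoothed, 0, 1)      # background -> 0, foreground -> 1
    sitk.WriteImage(mask, "embryo_t001_mask.tif")
    ```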

  5. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    Science.gov (United States)

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.
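
    The graph metrics discussed in this record can be illustrated with NetworkX on a toy small-world graph; the sketch below is a generic single-threaded illustration, not PAGANI's CPU-GPU implementation, and the network size and parameters are arbitrary.

    ```python
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    # Toy small-world network standing in for a (much larger) voxel-wise brain graph.
    G = nx.connected_watts_strogatz_graph(n=500, k=10, p=0.1, seed=0)

    clustering = nx.average_clustering(G)
    path_len = nx.average_shortest_path_length(G)
    betweenness = nx.betweenness_centrality(G)
    communities = greedy_modularity_communities(G)
    mod = modularity(G, communities)

    # Report the global metrics and the five most central nodes ("hubs").
    hubs = sorted(betweenness, key=betweenness.get)[-5:]
    print(f"clustering={clustering:.3f}, path length={path_len:.3f}, "
          f"modularity={mod:.3f}, hubs={hubs}")
    ```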

  6. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control...

  7. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an easy-to-use ethnographic toolkit, Contextual Design, by a computer firm in the initial stages of the development of a health care system.

  8. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    Science.gov (United States)

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals and between 35% and 60% of new graduates change workplace during the first year. Critical to organizational success, first line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first line nursing managers. The toolkit includes the development of a framework including the conceptual building blocks of planning tools, manager interventions, retention and recruitment and professional practice models. The development of the toolkit involved conducting a review of the literature for best practices in nursing human resource planning, using a mixed method approach to data collection including a survey and extensive interviews of managers and completing a comprehensive scan of human resource practices in the participating organizations. This paper will provide an overview of the process used to develop the toolkit, a description of the toolkit contents and a reflection on the outcomes of the project.

  9. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this

  10. Viewpoint Reading Conference Recommendations in a Wider ...

    African Journals Online (AJOL)

    Viewpoint Reading Conference Recommendations in a Wider Context of Social Change. Southern African Journal of Environmental Education.

  11. Engineering control of bacterial cellulose production using a genetic toolkit and a new cellulose-producing strain

    Science.gov (United States)

    Florea, Michael; Hagemann, Henrik; Santosa, Gabriella; Micklem, Chris N.; Spencer-Milnes, Xenia; de Arroyo Garcia, Laura; Paschou, Despoina; Lazenbatt, Christopher; Kong, Deze; Chughtai, Haroon; Jensen, Kirsten; Freemont, Paul S.; Kitney, Richard; Reeve, Benjamin; Ellis, Tom

    2016-01-01

    Bacterial cellulose is a strong and ultrapure form of cellulose produced naturally by several species of the Acetobacteraceae. Its high strength, purity, and biocompatibility make it of great interest to materials science; however, precise control of its biosynthesis has remained a challenge for biotechnology. Here we isolate a strain of Komagataeibacter rhaeticus (K. rhaeticus iGEM) that can produce cellulose at high yields, grow in low-nitrogen conditions, and is highly resistant to toxic chemicals. We achieved external control over its bacterial cellulose production through development of a modular genetic toolkit that enables rational reprogramming of the cell. To further its use as an organism for biotechnology, we sequenced its genome and demonstrate genetic circuits that enable functionalization and patterning of heterologous gene expression within the cellulose matrix. This work lays the foundations for using genetic engineering to produce cellulose-based materials, with numerous applications in basic science, materials engineering, and biotechnology. PMID:27247386

  12. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Directory of Open Access Journals (Sweden)

    Robert Meyer

    2016-08-01

    Full Text Available pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
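
    A minimal sketch of the workflow described above, based on pypet's documented usage pattern (the trajectory name, parameter names and the toy objective are invented, and exact call names can differ slightly between pypet versions):

        # Minimal pypet sketch: explore a 2-D parameter grid and store results in one HDF5 file.
        from pypet import Environment, cartesian_product

        def multiply(traj):
            # Toy "simulation": the result is stored next to the parameters that produced it.
            traj.f_add_result('z', traj.x * traj.y, comment='Product of x and y')

        env = Environment(trajectory='Example', filename='./hdf5/example.hdf5',
                          comment='Parameter exploration demo')
        traj = env.trajectory
        traj.f_add_parameter('x', 1.0)
        traj.f_add_parameter('y', 1.0)
        # Cartesian product of the two parameter ranges (6 runs in total).
        traj.f_explore(cartesian_product({'x': [1.0, 2.0, 3.0], 'y': [4.0, 5.0]}))
        env.run(multiply)   # parameters and results end up together in the HDF5 file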

  13. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    Science.gov (United States)

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were amended with profile measurements made by a profilometer.

  14. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    Science.gov (United States)

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  15. Guest editors' introduction to the 4th issue of Experimental Software and Toolkits (EST-4)

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Kienle, H.M.; Mens, K.

    2014-01-01

    Experimental software and toolkits play a crucial role in computer science. Elsevier’s Science of Computer Programming special issues on Experimental Software and Toolkits (EST) provide a means for academic tool builders to get more visibility and credit for their work, by publishing a paper along

  16. Numerical relativity in spherical coordinates with the Einstein Toolkit

    Science.gov (United States)

    Mewes, Vassilios; Zlochower, Yosef; Campanelli, Manuela; Ruchlin, Ian; Etienne, Zachariah B.; Baumgarte, Thomas W.

    2018-04-01

    Numerical relativity codes that do not make assumptions on spatial symmetries most commonly adopt Cartesian coordinates. While these coordinates have many attractive features, spherical coordinates are much better suited to take advantage of approximate symmetries in a number of astrophysical objects, including single stars, black holes, and accretion disks. While the appearance of coordinate singularities often spoils numerical relativity simulations in spherical coordinates, especially in the absence of any symmetry assumptions, it has recently been demonstrated that these problems can be avoided if the coordinate singularities are handled analytically. This is possible with the help of a reference-metric version of the Baumgarte-Shapiro-Shibata-Nakamura formulation together with a proper rescaling of tensorial quantities. In this paper we report on an implementation of this formalism in the Einstein Toolkit. We adapt the Einstein Toolkit infrastructure, originally designed for Cartesian coordinates, to handle spherical coordinates, by providing appropriate boundary conditions at both inner and outer boundaries. We perform numerical simulations for a disturbed Kerr black hole, extract the gravitational wave signal, and demonstrate that the noise in these signals is orders of magnitude smaller when computed on spherical grids rather than Cartesian grids. With the public release of our new Einstein Toolkit thorns, our methods for numerical relativity in spherical coordinates will become available to the entire numerical relativity community.
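
    As a schematic illustration of the rescaling mentioned above (not drawn verbatim from the paper): for a smooth vector field the spherical coordinate components v^theta and v^phi generically diverge at the origin and on the polar axis, so the reference-metric approach evolves rescaled components of the form

        \bar{v}^{\,r} = v^{r}, \qquad
        \bar{v}^{\,\theta} = r\, v^{\theta}, \qquad
        \bar{v}^{\,\phi} = r \sin\theta \, v^{\phi},

    which remain regular at r = 0 and at theta = 0, pi; analogous powers of r and sin(theta) are factored out of higher-rank tensors.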

  17. Enabling Persistent Peace After Negotiated Settlements

    Science.gov (United States)

    2016-12-01

    [Only acknowledgment and citation fragments are recoverable from this record's excerpt. Cited work: Feix, Miroslav. "Game Theory Toolkit and Workbook for Defense Analysis Students." Master's thesis, Naval... ; ...Research 41, no. 3 (May 2004): 275–371.]

  18. A toolkit for promoting healthy ageing

    OpenAIRE

    Knevel, Jeroen; Gruppen, Aly

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience, preventing loneliness and social participation. Besides some concise background information, we offer you a great diversity of exercises per theme which can help you discuss, assess, change or strengt...

  19. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study.

    Science.gov (United States)

    Lee, Lisa; Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-10-01

    To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Questionnaire-based survey of attendees at a national ePrescribing symposium. 2013 National ePrescribing Symposium in London, UK. Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals' experiences (n = 45; 64.3%) were considered the most useful types of content. There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning.

  20. Development of a Human Physiologically Based Pharmacokinetic (PBPK) Toolkit for Environmental Pollutants

    Directory of Open Access Journals (Sweden)

    Patricia Ruiz

    2011-10-01

    Full Text Available Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that provides useful information for the protection of the public.
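
    The recoded ATSDR models themselves are not reproduced here; the following generic one-compartment sketch (with invented dose and parameter values) only illustrates the kind of kinetic calculation such a model performs:

        # Generic one-compartment kinetic sketch (illustrative only; not the ATSDR toolkit code).
        # dA_gut/dt = -k_abs * A_gut ;  dC/dt = (k_abs * A_gut - CL * C) / V
        import numpy as np
        from scipy.integrate import odeint

        def model(y, t, k_abs, CL, V):
            A_gut, C = y                        # amount at absorption site, plasma concentration
            dA = -k_abs * A_gut                 # first-order absorption
            dC = (k_abs * A_gut - CL * C) / V   # absorbed input minus clearance, scaled by volume
            return [dA, dC]

        dose, k_abs, CL, V = 100.0, 1.2, 5.0, 40.0    # hypothetical dose (mg) and parameters
        t = np.linspace(0, 24, 200)                   # hours
        sol = odeint(model, [dose, 0.0], t, args=(k_abs, CL, V))
        print('peak plasma concentration ~ %.2f mg/L' % sol[:, 1].max())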

  1. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    ... relevant to reducing air pollution from oil and natural gas production and processing. The Department of... environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to.... technologies. The Toolkit will support the President's National Export Initiative by fostering export...

  2. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Science.gov (United States)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and rather limited scalability have affected the take-up of this programming model. Significant progress has since been made in hardware and software technologies; as a result, the performance of parallel programs that use compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  3. The Best Ever Alarm System Toolkit

    International Nuclear Information System (INIS)

    Kasemir, Kay; Chen, Xihui; Danilova, Ekaterina N.

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH) as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUI), and tools to annunciate alarms or log alarm related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives the end users various ways to view alarms in tree and table, but also makes it easy to access the guidance information, the related operator displays and other CSS tools. It also allows online configuration to be simply modified from the GUI. Coupled with a good 'alarm philosophy' on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.

  4. A Genetic Toolkit for Dissecting Dopamine Circuit Function in Drosophila

    Directory of Open Access Journals (Sweden)

    Tingting Xie

    2018-04-01

    Full Text Available Summary: The neuromodulator dopamine (DA) plays a key role in motor control, motivated behaviors, and higher-order cognitive processes. Dissecting how these DA neural networks tune the activity of local neural circuits to regulate behavior requires tools for manipulating small groups of DA neurons. To address this need, we assembled a genetic toolkit that allows for an exquisite level of control over the DA neural network in Drosophila. To further refine targeting of specific DA neurons, we also created reagents that allow for the conversion of any existing GAL4 line into Split GAL4 or GAL80 lines. We demonstrated how this toolkit can be used with recently developed computational methods to rapidly generate additional reagents for manipulating small subsets or individual DA neurons. Finally, we used the toolkit to reveal a dynamic interaction between a small subset of DA neurons and rearing conditions in a social space behavioral assay. The rapid analysis of how dopaminergic circuits regulate behavior is limited by the genetic tools available to target and manipulate small numbers of these neurons. Xie et al. present genetic tools in Drosophila that allow rational targeting of sparse dopaminergic neuronal subsets and selective knockdown of dopamine signaling. Keywords: dopamine, genetics, behavior, neural circuits, neuromodulation, Drosophila

  5. X-CSIT: a toolkit for simulating 2D pixel detectors

    Science.gov (United States)

    Joy, A.; Wing, M.; Hauf, S.; Kuster, M.; Rüter, T.

    2015-04-01

    A new, modular toolkit for creating simulations of 2D X-ray pixel detectors, X-CSIT (X-ray Camera SImulation Toolkit), is being developed. The toolkit uses three sequential simulations of detector processes which model photon interactions, electron charge cloud spreading with a high charge density plasma model and common electronic components used in detector readout. In addition, because of the wide variety in pixel detector design, X-CSIT has been designed as a modular platform so that existing functions can be modified or additional functionality added if the specific design of a detector demands it. X-CSIT will be used to create simulations of the detectors at the European XFEL, including three bespoke 2D detectors: the Adaptive Gain Integrating Pixel Detector (AGIPD), Large Pixel Detector (LPD) and DePFET Sensor with Signal Compression (DSSC). These simulations will be used by the detector group at the European XFEL for detector characterisation and calibration. For this purpose, X-CSIT has been integrated into the European XFEL's software framework, Karabo. This will further make it available to users to aid with the planning of experiments and analysis of data. In addition, X-CSIT will be released as a standalone, open source version for other users, collaborations and groups intending to create simulations of their own detectors.
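
    X-CSIT itself is integrated into Karabo and is not reproduced here; the toy sketch below (all numbers invented) only illustrates the middle stage it models, i.e. spreading each photon's electron charge cloud over neighbouring pixels:

        # Toy charge-cloud sharing across a pixel grid (illustrative only; not X-CSIT code).
        import numpy as np

        rng = np.random.default_rng(0)
        n_pix, pitch, sigma = 64, 50.0, 12.0        # grid size, pixel pitch and cloud width (um, invented)
        pixels = np.zeros((n_pix, n_pix))

        for _ in range(1000):                       # 1000 absorbed photons
            x, y = rng.uniform(0, n_pix * pitch, size=2)    # absorption position
            n_e = 1000                                       # electrons per photon (invented)
            # Sample electron positions from a Gaussian cloud and bin them into pixels.
            px = (rng.normal(x, sigma, n_e) // pitch).astype(int)
            py = (rng.normal(y, sigma, n_e) // pitch).astype(int)
            ok = (px >= 0) & (px < n_pix) & (py >= 0) & (py < n_pix)
            np.add.at(pixels, (py[ok], px[ok]), 1)

        print('total collected charge (electrons):', int(pixels.sum()))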

  6. A Teacher Tablet Toolkit to meet the challenges posed by 21st century rural teaching and learning environments

    Directory of Open Access Journals (Sweden)

    Adèle Botha

    2015-11-01

    Full Text Available This article draws upon the experiences gained in participating in an Information and Communication Technology for Rural Education (ICT4RED) initiative, as part of a larger Technology for Rural Education project (TECH4RED) in Cofimvaba in the Eastern Cape Province of South Africa. The aim of this paper is to describe the conceptualisation, design and application of an innovative teacher professional development course for rural teachers, enabling them to use tablets to support teaching and learning in their classrooms. The course, as an outcome, is presented as a Teacher Tablet Toolkit, designed to meet the challenges inherent to the 21st century rural technology enhanced teaching and learning environment. The paper documents and motivates design decisions, derived from literature and adapted through three iterations of a Design Science Research process, to be incorporated in the ICT4RED Teacher Professional Development Course. The resulting course aims to equip participating teachers with a toolkit consisting of technology hardware, pragmatic pedagogical and technology knowledge and skills, and practice based experience. The significance of game design elements such as simulation and fun, technology in need rather than in case, adequate scaffolding, and a clear learning path with interim learning goals is noted.

  7. The Data Warehouse Lifecycle Toolkit

    CERN Document Server

    Kimball, Ralph; Thornthwaite, Warren; Mundy, Joy; Becker, Bob

    2011-01-01

    A thorough update to the industry standard for designing, developing, and deploying data warehouse and business intelligence systems. The world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. In that time, the data warehouse industry has reached full maturity and acceptance, hardware and software have made staggering advances, and the techniques promoted in the premiere edition of this book have been adopted by nearly all data warehouse vendors and practitioners. In addition, the term "business intelligence" emerge

  8. National eHealth strategy toolkit

    CERN Document Server

    2012-01-01

    Worldwide, the application of information and communication technologies to support national health-care services is rapidly expanding and increasingly important. This is especially so at a time when all health systems face stringent economic challenges and greater demands to provide more and better care, especially to those most in need. The National eHealth Strategy Toolkit is an expert practical guide that provides governments, their ministries and stakeholders with a solid foundation and method for the development and implementation of a national eHealth vision, action plan and monitoring fram

  9. Effects of a Short Video-Based Resident-as-Teacher Training Toolkit on Resident Teaching.

    Science.gov (United States)

    Ricciotti, Hope A; Freret, Taylor S; Aluko, Ashley; McKeon, Bri Anne; Haviland, Miriam J; Newman, Lori R

    2017-10-01

    To pilot a short video-based resident-as-teacher training toolkit and assess its effect on resident teaching skills in clinical settings. A video-based resident-as-teacher training toolkit was previously developed by educational experts at Beth Israel Deaconess Medical Center, Harvard Medical School. Residents were recruited from two academic hospitals, watched two videos from the toolkit ("Clinical Teaching Skills" and "Effective Clinical Supervision"), and completed an accompanying self-study guide. A novel assessment instrument for evaluating the effect of the toolkit on teaching was created through a modified Delphi process. Before and after the intervention, residents were observed leading a clinical teaching encounter and scored using the 15-item assessment instrument. The primary outcome of interest was the change in number of skills exhibited, which was assessed using the Wilcoxon signed-rank test. Twenty-eight residents from two academic hospitals were enrolled, and 20 (71%) completed all phases of the study. More than one third of residents who volunteered to participate reported no prior formal teacher training. After completing two training modules, residents demonstrated a significant increase in the median number of teaching skills exhibited in a clinical teaching encounter, from 7.5 (interquartile range 6.5-9.5) to 10.0 (interquartile range 9.0-11.5; P<.001). Of the 15 teaching skills assessed, there were significant improvements in asking for the learner's perspective (P=.01), providing feedback (P=.005), and encouraging questions (P=.046). Using a resident-as-teacher video-based toolkit was associated with improvements in teaching skills in residents from multiple specialties.
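
    The paired before/after comparison reported above can be reproduced on one's own observation scores with a few lines; the counts below are invented for illustration:

        # Paired before/after comparison with the Wilcoxon signed-rank test (invented data).
        from scipy.stats import wilcoxon

        skills_before = [7, 8, 6, 9, 7, 8, 10, 6, 7, 9]      # hypothetical skill counts per resident
        skills_after  = [10, 11, 9, 10, 9, 11, 12, 9, 10, 11]

        stat, p = wilcoxon(skills_before, skills_after)
        print('Wilcoxon statistic = %.1f, p = %.4f' % (stat, p))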

  10. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit

    Directory of Open Access Journals (Sweden)

    Jon Smart

    2018-02-01

    Full Text Available Introduction: Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. Methods: As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Results: Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Conclusion: Residents from across the world collaborated and convened to reach a consensus on high-yield—and potentially high-impact—lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  11. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit.

    Science.gov (United States)

    Chung, Arlene S; Smart, Jon; Zdradzinski, Michael; Roth, Sarah; Gende, Alecia; Conroy, Kylie; Battaglioli, Nicole

    2018-03-01

    Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Residents from across the world collaborated and convened to reach a consensus on high-yield-and potentially high-impact-lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  12. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    Science.gov (United States)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in Python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.
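
    ProtoMD's own interface is not shown here; the sketch below only illustrates, with MDAnalysis (one of the libraries the toolkit builds on) and placeholder file names, the kind of coarse-grained variable (per-residue centres of mass) that such a multiscale workflow extracts from a GROMACS trajectory:

        # Per-residue centres of mass from a GROMACS trajectory via MDAnalysis
        # (illustrative only; not ProtoMD's interface; file names are placeholders).
        import numpy as np
        import MDAnalysis as mda

        u = mda.Universe('topol.tpr', 'traj.xtc')
        protein = u.select_atoms('protein')

        cg_frames = []
        for ts in u.trajectory:
            # One coarse-grained "bead" per residue: its centre of mass in this frame.
            cg_frames.append(np.array([res.atoms.center_of_mass()
                                       for res in protein.residues]))

        print('frames:', len(cg_frames), 'beads per frame:', cg_frames[0].shape[0])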

  13. SlideToolkit: an assistive toolset for the histological quantification of whole slide images.

    Directory of Open Access Journals (Sweden)

    Bastiaan G L Nelissen

    Full Text Available The demand for accurate and reproducible phenotyping of a disease trait increases with the rising number of biobanks and genome wide association studies. Detailed analysis of histology is a powerful way of phenotyping human tissues. Nonetheless, purely visual assessment of histological slides is time-consuming and liable to sampling variation, optical illusions and thereby observer variation, and external validation may be cumbersome. Therefore, within our own biobank, computerized quantification of digitized histological slides is often preferred as a more precise and reproducible, and sometimes more sensitive approach. Relatively few free toolkits are, however, available for fully digitized microscopic slides, usually known as whole slide images. To meet this need, we developed the slideToolkit as a fast method to handle large quantities of low-contrast whole slide images using advanced cell-detecting algorithms. The slideToolkit has been developed for modern personal computers and high-performance clusters (HPCs) and is available as an open-source project on github.com. We here illustrate the power of slideToolkit by a repeated measurement of 303 digital slides containing CD3-stained (DAB) abdominal aortic aneurysm tissue from a tissue biobank. Our workflow consists of four consecutive steps. In the first step (acquisition), whole slide images are collected and converted to TIFF files. In the second step (preparation), files are organized. The third step (tiles) creates multiple manageable tiles to count. In the fourth step (analysis), tissue is analyzed and results are stored in a data set. Using this method, two consecutive measurements of 303 slides showed an intraclass correlation of 0.99. In conclusion, slideToolkit provides a free, powerful and versatile collection of tools for automated feature analysis of whole slide images to create reproducible and meaningful phenotypic data sets.
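
    The toolkit's own scripts are not reproduced here; the short numpy sketch below (tile size invented) merely illustrates the third step of the workflow, cutting a large decoded slide region into manageable tiles:

        # Illustrative tiling step: cut a large image array into fixed-size tiles
        # for downstream cell counting (not the slideToolkit code itself).
        import numpy as np

        def make_tiles(image, tile=2048):
            """Yield (row, col, tile_array) for non-overlapping tiles covering the image."""
            h, w = image.shape[:2]
            for r in range(0, h, tile):
                for c in range(0, w, tile):
                    yield r, c, image[r:r + tile, c:c + tile]

        slide = np.zeros((10000, 8000, 3), dtype=np.uint8)   # placeholder for a decoded slide region
        print(sum(1 for _ in make_tiles(slide)), 'tiles generated')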

  14. Graph algorithms in the titan toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  15. Applications toolkit for accelerator control and analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1997-01-01

    The Advanced Photon Source (APS) has taken a unique approach to creating high-level software applications for accelerator operation and analysis. The approach is based on self-describing data, modular program toolkits, and scripts. Self-describing data provide a communication standard that aids the creation of modular program toolkits by allowing compliant programs to be used in essentially arbitrary combinations. These modular programs can be used as part of an arbitrary number of high-level applications. At APS, a group of about 70 data analysis, manipulation, and display tools is used in concert with about 20 control-system-specific tools to implement applications for commissioning and operations. High-level applications are created using scripts, which are relatively simple interpreted programs. The Tcl/Tk script language is used, allowing creating of graphical user interfaces (GUIs) and a library of algorithms that are separate from the interface. This last factor allows greater automation of control by making it easy to take the human out of the loop. Applications of this methodology to operational tasks such as orbit correction, configuration management, and data review will be discussed

  16. ISRNA: an integrative online toolkit for short reads from high-throughput sequencing data.

    Science.gov (United States)

    Luo, Guan-Zheng; Yang, Wei; Ma, Ying-Ke; Wang, Xiu-Jie

    2014-02-01

    Integrative Short Reads NAvigator (ISRNA) is an online toolkit for analyzing high-throughput small RNA sequencing data. Besides the high-speed genome mapping function, ISRNA provides statistics for genomic location, length distribution and nucleotide composition bias analysis of sequence reads. Number of reads mapped to known microRNAs and other classes of short non-coding RNAs, coverage of short reads on genes, expression abundance of sequence reads as well as some other analysis functions are also supported. The versatile search functions enable users to select sequence reads according to their sub-sequences, expression abundance, genomic location, relationship to genes, etc. A specialized genome browser is integrated to visualize the genomic distribution of short reads. ISRNA also supports management and comparison among multiple datasets. ISRNA is implemented in Java/C++/Perl/MySQL and can be freely accessed at http://omicslab.genetics.ac.cn/ISRNA/.

  17. The infancy of particle accelerators life and work of Rolf Widerøe

    CERN Document Server

    1994-01-01

    The following autobiographical account of Rolf Wideröe's life and work is based on manuscripts and letters written by himself, most of them especially for this report. Data from audio and video recordings with his illustrations and from my notes taken during a series of meetings between the two of us were also included. Rolf Wideröe gave me access to many of his publications and to other documents from which I have extracted further information. I have compiled, edited and, where necessary, put the texts in chronological order. These were then corrected and supplemented by Rolf Wideröe during the course of several readings. The English translation was also checked by Wideröe and we were able to add some improvements and corrections. This account therefore stands as an authorised biography and is written in the first person. Mrs. Wideröe's accurate memory was of great assistance. The emphasis has been on Rolf Wideröe's life story and the first developments which led to modern particle accelerators. Techni...

  18. EasyInterface: A toolkit for rapid development of GUIs for research prototype tools

    OpenAIRE

    Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf

    2017-01-01

    In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and to integrate them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command line and its output goes to the standard output, then in a few minutes one can m...

  19. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit

  20. A flexible open-source toolkit for lava flow simulations

    Science.gov (United States)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the models included in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001) which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows the inclusion of a corrective factor so that the lava can overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The
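
    A toy numpy sketch of the probabilistic steepest-slope idea with the corrective factor (here called hc), in the spirit of the VORIS approach; the DEM and parameter values are invented and this is not the toolkit's code:

        # Toy probabilistic steepest-slope step on a DEM (illustrative only).
        import numpy as np

        rng = np.random.default_rng(1)
        dem = rng.random((100, 100)) * 50.0        # invented elevation model (m)
        hc = 2.0                                   # corrective factor to pass small obstacles (m)

        def next_cell(dem, r, c, hc):
            """Pick the next cell with probability proportional to the (corrected) height drop."""
            drops, cells = [], []
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == 0 and dc == 0:
                        continue
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                        drop = dem[r, c] + hc - dem[rr, cc]
                        if drop > 0:
                            drops.append(drop)
                            cells.append((rr, cc))
            if not cells:                          # local pit: the flow stops
                return None
            p = np.array(drops) / np.sum(drops)
            return cells[rng.choice(len(cells), p=p)]

        print(next_cell(dem, 50, 50, hc))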

  1. Toolkit for healthcare facility design evaluation - some case studies

    CSIR Research Space (South Africa)

    De Jager, Peta

    2007-10-01

    Full Text Available themes in approach. Further study is indicated, but preliminary research shows that, whilst these toolkits can be applied to the South African context, there are compelling reasons for them to be adapted. This paper briefly outlines these three case...

  3. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    Science.gov (United States)

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Full Text Available Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.

  5. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    Science.gov (United States)

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  6. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accura

  7. Google Web Toolkit for Ajax

    CERN Document Server

    Perry, Bruce

    2007-01-01

    The Google Web Toolkit (GWT) is a nifty framework that Java programmers can use to create Ajax applications. The GWT allows you to create an Ajax application in your favorite IDE, such as IntelliJ IDEA or Eclipse, using paradigms and mechanisms similar to programming a Java Swing application. After you code the application in Java, the GWT's tools generate the JavaScript code the application needs. You can also use typical Java project tools such as JUnit and Ant when creating GWT applications. The GWT is a free download, and you can freely distribute the client- and server-side code you c

  8. Baffles Promote Wider, Thinner Silicon Ribbons

    Science.gov (United States)

    Seidensticker, Raymond G.; Mchugh, James P.; Hundal, Rolv; Sprecace, Richard P.

    1989-01-01

    Set of baffles just below exit duct of silicon-ribbon-growing furnace reduces thermal stresses in ribbons so that wider ribbons can be grown. Productivity of furnace increased. Baffles divert plume of hot gas from ribbon and allow cooler gas from top of furnace to flow around it. They also shield ribbon from thermal radiation from hot growth assembly. Ribbon cooled to lower temperature before reaching cooler exit duct, avoiding abrupt drop in temperature on entering duct.

  9. Developing Climate Resilience Toolkit Decision Support Training Sectio

    Science.gov (United States)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. With the development of this CRT

  10. A survey exploring National Health Service ePrescribing Toolkit use and perceived usefulness amongst English hospitals

    Directory of Open Access Journals (Sweden)

    Kathrin Cresswell

    2017-06-01

    Conclusions: Interactive elements, and lessons learned from early adopter sites that had accumulated experience of implementing systems, were viewed as the most helpful aspects of the ePrescribing Toolkit. The Toolkit now needs to be further developed to facilitate the continuing implementation/optimisation of ePrescribing and other health information technology across the NHS.

  11. Implementation of the Good School Toolkit in Uganda: a quantitative process evaluation of a successful violence prevention program.

    Science.gov (United States)

    Knight, Louise; Allen, Elizabeth; Mirembe, Angel; Nakuti, Janet; Namy, Sophie; Child, Jennifer C; Sturgess, Joanna; Kyegombe, Nambusi; Walakira, Eddy J; Elbourne, Diana; Naker, Dipak; Devries, Karen M

    2018-05-09

    The Good School Toolkit, a complex behavioural intervention designed by Raising Voices, a Ugandan NGO, reduced past week physical violence from school staff to primary students by an average of 42% in a recent randomised controlled trial. This process evaluation quantitatively examines what was implemented across the twenty-one intervention schools, variations in school prevalence of violence after the intervention, factors that influence exposure to the intervention and factors associated with students' experience of physical violence from staff at study endline. Implementation measures were captured prospectively in the twenty-one intervention schools over four school terms from 2012 to 2014, and Toolkit exposure was captured in the student (n = 1921) and staff (n = 286) endline cross-sectional surveys in 2014. Implementation measures and the prevalence of violence are summarised across schools and are assessed for correlation using Spearman's Rank Correlation Coefficient. Regression models are used to explore individual factors associated with Toolkit exposure and with physical violence at endline. School prevalence of past week physical violence from staff against students ranged from 7% to 65% across schools at endline. Schools with higher mean levels of teacher Toolkit exposure had larger decreases in violence during the study. Students in schools categorised as implementing a 'low' number of program school-led activities reported less exposure to the Toolkit. Higher student Toolkit exposure was associated with decreased odds of experiencing physical violence from staff (OR: 0.76, 95% CI: 0.67-0.86, p-value ...). Effectiveness of the Toolkit may be increased by further targeting and supporting teachers' engagement with girls and students with mental health difficulties. The trial is registered at clinicaltrials.gov, NCT01678846, August 24th 2012.

  12. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    Science.gov (United States)

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t(56) = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current

  13. Using features of local densities, statistics and HMM toolkit (HTK) for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    Full Text Available This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model Toolkit (HTK) without explicit segmentation. The first phase is preprocessing, where the data are introduced into the system after quality enhancements. Then, a set of characteristics (features of local densities and feature statistics) is extracted using the technique of sliding windows. Subsequently, the resulting feature vectors are injected into the Hidden Markov Model Toolkit (HTK). The simple database "Arabic-Numbers" and IFN/ENIT are used to evaluate the performance of this system. Keywords: Hidden Markov Model Toolkit (HTK), Sliding windows
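
    HTK itself is driven by its own tools and configuration files and is not shown here; the numpy sketch below (window sizes invented) only illustrates the sliding-window density features computed from a binarised text-line image before they are handed to the HMMs:

        # Sliding-window local-density features from a binarised text line
        # (illustrative pre-processing only; HMM training with HTK is not shown).
        import numpy as np

        def window_features(line_img, width=8, shift=2, cells=4):
            """line_img: 2-D array of 0/1 ink pixels; returns one feature vector per window."""
            h, w = line_img.shape
            feats = []
            for x in range(0, w - width + 1, shift):
                win = line_img[:, x:x + width]
                # Split the window vertically into `cells` zones; use each zone's ink density.
                zones = np.array_split(win, cells, axis=0)
                feats.append([z.mean() for z in zones])
            return np.asarray(feats)

        line = (np.random.default_rng(0).random((32, 200)) > 0.8).astype(int)  # fake text line
        print(window_features(line).shape)   # (number of windows, features per window)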

  14. Computational Chemistry Toolkit for Energetic Materials Design

    Science.gov (United States)

    2006-11-01

    industry are aggressively engaged in efforts to develop multiscale modeling and simulation methodologies to model and analyze complex phenomena across... energetic materials design. It is hoped that this toolkit will evolve into a collection of well-integrated multiscale modeling methodologies... [Recoverable table fragment; columns: Experimental / Theoretical / This Work. 1,5-Diamino-4-methyl-tetrazolium nitrate: 8.4 / 41.7 / 47.5; 1,5-Diamino-4-methyl-tetrazolium azide: 138.1 / 161.6 / (third value missing)]

  15. The Insight ToolKit Image Registration Framework

    Directory of Open Access Journals (Sweden)

    Brian eAvants

    2014-04-01

    Full Text Available Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations versus translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to more easily focus on design/comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, will provide a more extensive foundation against which to evaluate new work in image registration and also enable application level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling.
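
    ITKv4 itself is a C++ library; the sketch below uses SimpleITK, its simplified wrapping, to set up one metric/optimizer/transform combination of the kind the framework unifies (file names are placeholders and the settings are not taken from the paper):

        # Basic rigid registration set-up through SimpleITK (illustrative only).
        import SimpleITK as sitk

        fixed = sitk.ReadImage('fixed.nii.gz', sitk.sitkFloat32)     # placeholder files
        moving = sitk.ReadImage('moving.nii.gz', sitk.sitkFloat32)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(learningRate=2.0,
                                                     minStep=1e-4,
                                                     numberOfIterations=200)
        reg.SetOptimizerScalesFromPhysicalShift()          # automatic parameter scaling
        reg.SetInitialTransform(sitk.CenteredTransformInitializer(
            fixed, moving, sitk.Euler3DTransform()))       # low-dimensional (rigid) transform
        reg.SetInterpolator(sitk.sitkLinear)

        transform = reg.Execute(fixed, moving)
        resampled = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)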

  16. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services

  17. Risk assessment of chemicals in foundries: The International Chemical Toolkit pilot-project

    International Nuclear Information System (INIS)

    Ribeiro, Marcela G.; Filho, Walter R.P.

    2006-01-01

    In Brazil, problems regarding protection from hazardous substances in small-sized enterprises are similar to those observed in many other countries. Looking for a simple tool to assess and control such exposures, FUNDACENTRO started a pilot-project in 2005 to implement the International Chemical Control Toolkit. During the series of visits to foundries, it was observed that although many changes have occurred in foundry technology, occupational exposures to silica dust and metal fumes continue to occur, due to a lack of perception of occupational exposure in the work environment. After introducing the Chemical Toolkit concept to the foundry work group, it was possible to show that the activities undertaken to improve the management of chemicals, according to its concept, will support companies in fulfilling government legislation related to chemical management, occupational health and safety, and environmental impact. In the following meetings, the foundry work group and the FUNDACENTRO research team will identify 'inadequate work situations'. Based on the Chemical Toolkit, improvement measures will be proposed. Afterwards, a survey will verify the efficiency of those measures in controlling hazards and, consequently, in the management of chemicals. This step is now in progress.

  18. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    Science.gov (United States)

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

    A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior-generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Making the most of cloud storage - a toolkit for exploitation by WLCG experiments

    Science.gov (United States)

    Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea

    2017-10-01

    Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.
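
    The toolkit itself works through FTS3, Davix, gfal2, and Dynafed, but the underlying idea of treating a cloud allocation as just another storage endpoint can be illustrated with a few lines of standalone Python. The sketch below uses boto3 against a hypothetical bucket; the endpoint, bucket name, and object keys are placeholders, and credentials are assumed to be configured in the environment.

    ```python
    import boto3

    # assumed: credentials provided via environment variables or ~/.aws/credentials
    s3 = boto3.client("s3", endpoint_url="https://s3.example.org")  # placeholder endpoint

    bucket = "experiment-output"   # hypothetical allocation
    prefix = "run2017/calib/"

    # list objects produced by a grid workflow and stage one of them locally
    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])

    s3.download_file(bucket, prefix + "histograms.root", "/tmp/histograms.root")
    ```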

  20. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation.

    Directory of Open Access Journals (Sweden)

    Wei Shen

    FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q files include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools implement only some of these manipulations, not always efficiently, and some are available only for certain operating systems. Furthermore, the complicated installation process of required packages and running environments can render these programs less user friendly. This paper describes a cross-platform, ultrafast, and comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OS X, and can be used directly without any dependencies or pre-configuration. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on GitHub at https://github.com/shenwei356/seqkit.
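
    SeqKit is a command-line tool rather than a library; to keep the examples in this document in one language, the sketch below drives a few of its documented subcommands (stats, sample, rmdup) from Python via subprocess. The input file name is a placeholder, and the seqkit binary is assumed to be on PATH.

    ```python
    import subprocess

    fastq = "reads.fq.gz"  # placeholder input file

    # summary statistics for the file
    subprocess.run(["seqkit", "stats", fastq], check=True)

    # draw a random subsample of 10,000 reads
    with open("reads.sub.fq", "w") as out:
        subprocess.run(["seqkit", "sample", "-n", "10000", fastq], stdout=out, check=True)

    # remove duplicated records by sequence content
    with open("reads.dedup.fq", "w") as out:
        subprocess.run(["seqkit", "rmdup", "-s", fastq], stdout=out, check=True)
    ```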

  1. Toolkit for local decision makers aims to strengthen environmental sustainability

    CSIR Research Space (South Africa)

    Murambadoro, M

    2011-11-01

    Members of the South African Risk and Vulnerability Atlas were involved in a meeting aimed at the development of a toolkit towards improved integration of climate change into local government's integrated development planning (IDP) process...

  2. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  3. The IGUANA interactive graphics toolkit with examples from CMS and D0

    International Nuclear Information System (INIS)

    Alverson, G.; Osborne, I.; Taylor, L.; Tuura, L.

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. The authors describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. The authors demonstrate the use of IGUANA with several applications built for CMS and D0

  4. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, Roelf J.

    1997-01-01

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  5. Improving safety on rural local and tribal roads safety toolkit.

    Science.gov (United States)

    2014-08-01

    Rural roadway safety is an important issue for communities throughout the country and presents a challenge for state, local, and Tribal agencies. The Improving Safety on Rural Local and Tribal Roads Safety Toolkit was created to help rural local ...

  6. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    Science.gov (United States)

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.
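
    PEA's Positive Samples Only Learning is implemented in R; the snippet below is only a generic sketch of the positive-unlabeled idea it builds on, using the classic Elkan-Noto calibration with scikit-learn. The feature matrices are synthetic stand-ins, and nothing here reproduces PEA's actual model or parameters.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X_pos = rng.normal(1.0, 1.0, size=(200, 10))    # known modified sites (positives)
    X_unl = rng.normal(0.0, 1.0, size=(2000, 10))   # unlabeled transcriptome sites

    # step 1: train a classifier to separate "labeled" from "unlabeled" examples
    X = np.vstack([X_pos, X_unl])
    s = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_unl))])
    clf = LogisticRegression(max_iter=1000).fit(X, s)

    # step 2: Elkan-Noto calibration constant c = P(labeled | true positive),
    # estimated here as the mean score on the positive set (ideally a held-out subset)
    c = clf.predict_proba(X_pos)[:, 1].mean()

    # step 3: corrected probability that an unlabeled site is a true positive
    p_true = np.clip(clf.predict_proba(X_unl)[:, 1] / c, 0.0, 1.0)
    print("predicted modification candidates:", int((p_true > 0.5).sum()))
    ```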

  7. Business plans--tips from the toolkit 6.

    Science.gov (United States)

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  8. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30-second format is well suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website, the duration of their session on the website, and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than on Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access to and use of the U.S. Climate Resilience Toolkit.

  9. Transition Toolkit 3.0: Meeting the Educational Needs of Youth Exposed to the Juvenile Justice System. Third Edition

    Science.gov (United States)

    Clark, Heather Griller; Mathur, Sarup; Brock, Leslie; O'Cummings, Mindee; Milligan, DeAngela

    2016-01-01

    The third edition of the National Technical Assistance Center for the Education of Neglected or Delinquent Children and Youth's (NDTAC's) "Transition Toolkit" provides updated information on existing policies, practices, strategies, and resources for transition that build on field experience and research. The "Toolkit" offers…

  10. A GIS Software Toolkit for Monitoring Areal Snow Cover and Producing Daily Hydrologic Forecasts using NASA Satellite Imagery, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts. This toolkit will be...

  11. GENFIT - a generic track-fitting toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Johannes [Technische Universitaet Muenchen (Germany); Schlueter, Tobias [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2014-07-01

    GENFIT is an experiment-independent track-fitting toolkit, which combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation and alignment.
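
    GENFIT is a C++ framework and its fitters are considerably more general than this, but the core Kalman-filter idea it modularizes can be sketched in a few lines of Python. Below, a straight 1-D track (position and slope) is fitted from noisy position measurements at known detector planes; the geometry and noise values are made up for illustration and are not tied to GENFIT's API.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # true track: x(z) = x0 + slope * z, measured at several detector planes
    z_planes = np.linspace(0.0, 100.0, 8)
    x0_true, slope_true, sigma = 2.0, 0.05, 0.3
    meas = x0_true + slope_true * z_planes + rng.normal(0.0, sigma, z_planes.size)

    # Kalman filter over the state (x, slope); no process noise for an ideal straight line
    state = np.array([0.0, 0.0])     # initial guess
    P = np.diag([100.0, 1.0])        # large initial uncertainty
    R = sigma ** 2                   # measurement variance
    z_prev = 0.0
    for z, m in zip(z_planes, meas):
        F = np.array([[1.0, z - z_prev], [0.0, 1.0]])   # propagate state to this plane
        state = F @ state
        P = F @ P @ F.T
        H = np.array([1.0, 0.0])                        # we measure position only
        K = P @ H / (H @ P @ H + R)                     # Kalman gain
        state = state + K * (m - H @ state)             # measurement update
        P = P - np.outer(K, H @ P)
        z_prev = z

    print("fitted (x, slope) at last plane:", state)
    print("true   (x, slope) at last plane:", (x0_true + slope_true * z_planes[-1], slope_true))
    ```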

  12. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  13. The Identification of Potential Resilient Estuary-based Enterprises to Encourage Economic Empowerment in South Africa: a Toolkit Approach

    Directory of Open Access Journals (Sweden)

    Rebecca Bowd

    2012-09-01

    It has been argued that ecosystem services can be used as the foundation to provide economic opportunities to empower the disadvantaged. The Ecosystem Services Framework (ESF) approach for poverty alleviation, which balances resource conservation and human resource use, has received much attention in the literature. However, few projects have successfully achieved both conservation and economic objectives. This is partly due to there being a hiatus between theory and practice, due to the absence of tools that help make the transition from conceptual frameworks and theory to practical integration of ecosystem services into decision making. To address this hiatus, an existing conceptual framework for analyzing the robustness of social-ecological systems was translated into a practical toolkit to help understand the complexity of social-ecological systems (SES). The toolkit can be used by a diversity of stakeholders as a decision-making aid for assessing ecosystem services supply and demand and associated enterprise opportunities. The toolkit is participatory and combines a generic "top-down" scientific approach with a case-specific "bottom-up" approach. It promotes a shared understanding of the utilization of ecosystem services, which is the foundation of identifying resilient enterprises. The toolkit comprises four steps: (i) ecosystem services supply and demand assessment; (ii) roles identification; (iii) enterprise opportunity identification; and (iv) enterprise risk assessment, and was tested at two estuary study sites. Implementation of the toolkit requires the populating of preprogrammed Excel worksheets through the holding of workshops that are attended by stakeholders associated with the ecosystems. It was concluded that for an enterprise to be resilient, it must be resilient at an external SES level, which the toolkit addresses, and at an internal business functioning level, e.g., social dynamics among personnel, skills, and literacy

  14. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    Science.gov (United States)

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding

  15. Evaluating Teaching Development Activities in Higher Education: A Toolkit

    Science.gov (United States)

    Kneale, Pauline; Winter, Jennie; Turner, Rebecca; Spowart, Lucy; Hughes, Jane; McKenna, Colleen; Muneer, Reema

    2016-01-01

    This toolkit is developed as a resource for providers of teaching-related continuing professional development (CPD) in higher education (HE). It focuses on capturing the longer-term value and impact of CPD for teachers and learners, and moving away from immediate satisfaction measures. It is informed by the literature on evaluating higher…

  16. Development of an evidence-informed leisure time physical activity resource for adults with spinal cord injury: the SCI Get Fit Toolkit.

    Science.gov (United States)

    Arbour-Nicitopoulos, K P; Martin Ginis, K A; Latimer-Cheung, A E; Bourne, C; Campbell, D; Cappe, S; Ginis, S; Hicks, A L; Pomerleau, P; Smith, K

    2013-06-01

    To systematically develop an evidence-informed leisure time physical activity (LTPA) resource for adults with spinal cord injury (SCI). Canada. The Appraisal of Guidelines, Research and Evaluation (AGREE) II protocol was used to develop a toolkit to teach and encourage adults with SCI how to make smart and informed choices about being physically active. A multidisciplinary expert panel appraised the evidence and generated specific recommendations for the content of the toolkit. Pilot testing was conducted to refine the toolkit's presentation. Recommendations emanating from the consultation process were that the toolkit be a brief, evidence-based resource that contains images of adults with tetraplegia and paraplegia, and links to more detailed online information. The content of the toolkit should include the physical activity guidelines (PAGs) for adults with SCI, activities tailored to manual and power chair users, the benefits of LTPA, and strategies to overcome common LTPA barriers for adults with SCI. The inclusion of action plans and safety tips was also recommended. These recommendations have resulted in the development of an evidence-informed LTPA resource to assist adults with SCI in meeting the PAGs. This toolkit will have important implications for consumers, health care professionals and policy makers for encouraging LTPA in the SCI community.

  17. Measuring acceptance of an assistive social robot: a suggested toolkit

    NARCIS (Netherlands)

    Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B.

    2009-01-01

    The human robot interaction community is multidisciplinary by nature and has members from social science to engineering backgrounds. In this paper we aim to provide human robot developers with a straightforward toolkit to evaluate users' acceptance of assistive social robots they are designing or

  18. Pydpiper: A Flexible Toolkit for Constructing Novel Registration Pipelines

    Directory of Open Access Journals (Sweden)

    Miriam eFriedel

    2014-07-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available pipeline framework that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  19. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    Science.gov (United States)

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  20. Communicating space weather to policymakers and the wider public

    Science.gov (United States)

    Ferreira, Bárbara

    2014-05-01

    As a natural hazard, space weather has the potential to affect space- and ground-based technological systems and cause harm to human health. As such, it is important to properly communicate this topic to policymakers and the general public alike, informing them (without being unnecessarily alarmist) about the potential impact of space-weather phenomena and how these can be monitored and mitigated. On the other hand, space weather is related to interesting phenomena on the Sun such as coronal-mass ejections, and incorporates one of the most beautiful displays in the Earth and its nearby space environment: aurora. These exciting and fascinating aspects of space weather should be cultivated when communicating this topic to the wider public, particularly to younger audiences. Researchers have a key role to play in communicating space weather to both policymakers and the wider public. Space scientists should have an active role in informing policy decisions on space-weather monitoring and forecasting, for example. And they can exercise their communication skills by talking about space weather to school children and the public in general. This presentation will focus on ways to communicate space weather to wider audiences, particularly policymakers. It will also address the role researchers can play in this activity to help bridge the gap between the space science community and the public.

  1. An Ethical Toolkit for Food Companies: Reflection on its Use

    NARCIS (Netherlands)

    Deblonde, M.K.; Graaff, R.; Brom, F.W.A.

    2007-01-01

    Nowadays many debates are going on that relate to the agricultural and food sector. It looks as if present technological and organizational developments within the agricultural and food sector are badly geared to societal needs and expectations. In this article we briefly present a toolkit for moral

  2. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia.

    Science.gov (United States)

    Haines, Seena L; Summa, Maria A; Peeters, Michael J; Dy-Boarman, Eliza A; Boyle, Jaclyn A; Clifford, Kalin M; Willson, Megan N

    2017-09-01

    The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. Eighteen institutions provided materials; five provided materials describing didactic coursework; over fifteen provided materials for an academia-focused Advanced Pharmacy Practice Experience (APPE), while one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. Pharmacy faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents in the academic pillars of teaching, scholarship and service is critical for the future success of the academy. Published by Elsevier Inc.

  3. A Toolkit For Storage Qos Provisioning For Data-Intensive Applications

    Directory of Open Access Journals (Sweden)

    Renata Słota

    2012-01-01

    This paper describes a programming toolkit developed in the PL-Grid project, named QStorMan, which supports storage QoS provisioning for data-intensive applications in distributed environments. QStorMan exploits knowledge-oriented methods for matching storage resources to non-functional requirements, which are defined for a data-intensive application. In order to support various usage scenarios, QStorMan provides two interfaces: programming libraries and a web portal. The interfaces allow users to define the requirements either directly in an application source code or by using an intuitive graphical interface. The first way provides finer granularity, e.g., each portion of data processed by an application can define a different set of requirements. The second method is aimed at supporting legacy applications, whose source code cannot be modified. The toolkit has been evaluated using synthetic benchmarks and the production infrastructure of PL-Grid, in particular its storage infrastructure, which utilizes the Lustre file system.

  4. Mocapy++ - a toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  5. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the "beamline" and "MXYZPTLK" (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice which contains octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.
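
    The beamline and MXYZPTLK libraries themselves are C++ class libraries, but the kind of computation such a tracker performs can be illustrated with a toy one-turn map in Python: a linear rotation in phase space plus a nonlinear (sextupole-like) kick, tracked turn by turn, with a periodic (period-1) orbit found numerically as a fixed point of the map. All parameter values below are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    MU = 2.0 * np.pi * 0.331     # betatron phase advance per turn (toy value)
    K2 = 0.8                     # sextupole-like kick strength (toy value)

    def one_turn(v):
        """One-turn map: thin nonlinear kick followed by a linear rotation."""
        x, p = v
        p = p + K2 * x**2
        c, s = np.cos(MU), np.sin(MU)
        return np.array([c * x + s * p, -s * x + c * p])

    def track(v0, n_turns=500):
        """Return the phase-space trajectory over n_turns turns."""
        traj = np.empty((n_turns, 2))
        v = np.asarray(v0, dtype=float)
        for i in range(n_turns):
            v = one_turn(v)
            traj[i] = v
        return traj

    # a period-1 orbit is a fixed point of the one-turn map
    fixed_point = fsolve(lambda v: one_turn(v) - v, x0=[0.1, 0.0])
    print("fixed point:", fixed_point)
    print("last tracked point:", track([0.05, 0.0])[-1])
    ```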

  6. Improving the fundamentals of care for older people in the acute hospital setting: facilitating practice improvement using a Knowledge Translation Toolkit.

    Science.gov (United States)

    Wiechula, Rick; Kitson, Alison; Marcoionni, Danni; Page, Tammy; Zeitz, Kathryn; Silverston, Heidi

    2009-12-01

    with sufficient flexibility to meet the individual needs of the teams. Conclusions  The range of tools in the KT Toolkit were found to be helpful, but not all tools needed to be used to achieve successful results. Facilitation of the teams was a central feature of the KT Toolkit and allowed clinicians to retain control of their projects; however, finding the balance between structuring the process and enabling teams to maintain ownership and control was an ongoing challenge. Clinicians may not have the requisite skills and experience in basic standard setting, audit and evaluation and it was therefore important to address this throughout the project. In time this builds capacity throughout the organisation. Identifying evidence to support practice is a challenge to clinicians. Evidence-based guidelines often lack specificity and were found to be difficult to assimilate easily into everyday practice. Evidence to inform practice needs to be provided in a variety of forms and formats that allow clinicians to easily identify the source of the evidence and then develop local standards specific to their needs. The work that began with this project will continue - all teams felt that the work was only starting rather than concluding. This created momentum, motivation and greater ownership of improvements at local level. © 2009 The Authors. Journal Compilation © Blackwell Publishing Asia Pty Ltd.

  7. A patient and public involvement (PPI) toolkit for meaningful and flexible involvement in clinical trials - a work in progress.

    Science.gov (United States)

    Bagley, Heather J; Short, Hannah; Harman, Nicola L; Hickey, Helen R; Gamble, Carrol L; Woolfall, Kerry; Young, Bridget; Williamson, Paula R

    2016-01-01

    Funders of research are increasingly requiring researchers to involve patients and the public in their research. Patient and public involvement (PPI) in research can potentially help researchers make sure that the design of their research is relevant, that it is participant friendly and ethically sound. Using and sharing PPI resources can benefit those involved in undertaking PPI, but existing PPI resources are not used consistently and this can lead to duplication of effort. This paper describes how we are developing a toolkit to support clinical trials teams in a clinical trials unit. The toolkit will provide a key 'off the shelf' resource to support trial teams with limited resources, in undertaking PPI. Key activities in further developing and maintaining the toolkit are to: ● listen to the views and experience of both research teams and patient and public contributors who use the tools; ● modify the tools based on our experience of using them; ● identify the need for future tools; ● update the toolkit based on any newly identified resources that come to light; ● raise awareness of the toolkit and ● work in collaboration with others to either develop or test out PPI resources in order to reduce duplication of work in PPI. Background Patient and public involvement (PPI) in research is increasingly a funder requirement due to the potential benefits in the design of relevant, participant friendly, ethically sound research. The use and sharing of resources can benefit PPI, but available resources are not consistently used leading to duplication of effort. This paper describes a developing toolkit to support clinical trials teams to undertake effective and meaningful PPI. Methods The first phase in developing the toolkit was to describe which PPI activities should be considered in the pathway of a clinical trial and at what stage these activities should take place. This pathway was informed through review of the type and timing of PPI activities within

  8. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    Science.gov (United States)

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
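
    For context, the standard linear-quadratic BED for a phase delivered in n equal fractions is BED = n·d·(1 + d/(α/β)). A minimal voxel-wise sketch of a multi-phase calculation (primary plus boost, summed voxel by voxel) might look like the following; the dose arrays, fraction numbers, and α/β value are placeholders rather than values from the toolkit described above.

    ```python
    import numpy as np

    def bed_phase(total_dose, n_fractions, alpha_beta):
        """Voxel-wise BED for one phase: BED = n*d*(1 + d/(alpha/beta)), with d = dose per fraction."""
        total_dose = np.asarray(total_dose, dtype=float)
        d = total_dose / n_fractions
        return total_dose * (1.0 + d / alpha_beta)

    # placeholder 3-D dose matrices (Gy) for a primary and a boost phase
    rng = np.random.default_rng(0)
    dose_primary = rng.uniform(0.0, 60.0, size=(50, 50, 30))
    dose_boost = rng.uniform(0.0, 20.0, size=(50, 50, 30))

    # multi-phase BED: sum of per-phase BEDs (assumes full repair between fractions)
    bed_total = bed_phase(dose_primary, 30, alpha_beta=3.0) + bed_phase(dose_boost, 5, alpha_beta=3.0)
    print("max BED (Gy):", bed_total.max())
    ```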

  9. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  10. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  11. DUL Radio: A light-weight, wireless toolkit for sketching in hardware

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2011-01-01

    -mobile prototyping where fast reaction is needed (e.g. in controlling sound). The target audiences include designers, students, artists etc. with minimal programming and hardware skills. This presentation covers our motivations for creating the toolkit, specifications, test results, comparison to related products...

  12. Status of the petroleum pollution in the Wider Caribbean Sea

    Energy Technology Data Exchange (ETDEWEB)

    Botello, Alfonso V; Villanueva F, Susana [Universidad Nacional Autonoma de Mexico, Mexico City (Mexico). Inst. de Ciencias del Mar y Limnologia

    1996-07-01

    In 1976, the IOC-UNESCO and UNEP convened a meeting in Port of Spain to analyze the marine pollution problems in the region; the meeting noted that petroleum pollution was of region-wide concern and recommended initiating a research and monitoring program to determine the severity of the problem and monitor its effects. The Wider Caribbean is potentially one of the largest oil producing areas in the world. Major production sites include Louisiana and Texas, USA; the Bay of Campeche, Mexico; Lake Maracaibo, Venezuela; and the Gulf of Paria, Trinidad; all of which are classified as production accident high-risk zones. Main sources of petroleum pollution in the Wider Caribbean are: production, exploitation, transportation, urban and municipal discharges, refining and chemical wastes, normal loading operations and accidental spills. About 5 million barrels of oil are transported daily in the Caribbean, generating intense tanker traffic. It has been estimated that oil discharges from tank washings within the Wider Caribbean could be as high as 7 million barrels/year. The results of the CARIPOL Regional Programme conducted between 1980 and 1987 showed that significant levels of petroleum pollution exist throughout the Wider Caribbean, including serious tar contamination of windward exposed beaches, high levels of floating tar within the major current systems, and very high levels of dissolved/dispersed hydrocarbons in surface waters. Major effects of this petroleum pollution include: high tar levels on many beaches that either prevent recreational use or require very expensive clean-up operations, distress and death to marine life, and responses in the enzyme systems of marine organisms that have been correlated with declines in reproductive success. Finally, the presence of polycyclic aromatic hydrocarbons in tissues of economically important species has been reported, along with their potential carcinogenic effects. (author)

  13. Status of the petroleum pollution in the Wider Caribbean Sea

    International Nuclear Information System (INIS)

    Botello, Alfonso V.; Villanueva F, Susana

    1996-01-01

    In 1976, the IOC-UNESCO and UNEP convened a meeting in Port of Spain to analyze the marine pollution problems in the region; the meeting noted that petroleum pollution was of region-wide concern and recommended initiating a research and monitoring program to determine the severity of the problem and monitor its effects. The Wider Caribbean is potentially one of the largest oil producing areas in the world. Major production sites include Louisiana and Texas, USA; the Bay of Campeche, Mexico; Lake Maracaibo, Venezuela; and the Gulf of Paria, Trinidad; all of which are classified as production accident high-risk zones. Main sources of petroleum pollution in the Wider Caribbean are: production, exploitation, transportation, urban and municipal discharges, refining and chemical wastes, normal loading operations and accidental spills. About 5 million barrels of oil are transported daily in the Caribbean, generating intense tanker traffic. It has been estimated that oil discharges from tank washings within the Wider Caribbean could be as high as 7 million barrels/year. The results of the CARIPOL Regional Programme conducted between 1980 and 1987 showed that significant levels of petroleum pollution exist throughout the Wider Caribbean, including serious tar contamination of windward exposed beaches, high levels of floating tar within the major current systems, and very high levels of dissolved/dispersed hydrocarbons in surface waters. Major effects of this petroleum pollution include: high tar levels on many beaches that either prevent recreational use or require very expensive clean-up operations, distress and death to marine life, and responses in the enzyme systems of marine organisms that have been correlated with declines in reproductive success. Finally, the presence of polycyclic aromatic hydrocarbons in tissues of economically important species has been reported, along with their potential carcinogenic effects. (author)

  14. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  15. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.
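
    As a generic illustration of the uncertainty-propagation and sensitivity modules described above (not KAERI's implementation or its GUI), the sketch below samples uncertain input parameters, runs a stand-in simulation function, and reports output statistics together with simple correlation-based sensitivities. All parameter names and distributions are placeholders.

    ```python
    import numpy as np

    def simulation(params):
        """Stand-in for an engineering simulation code; returns one scalar figure of merit."""
        k, q, t = params
        return q * np.exp(-k * t) + 0.1 * t

    rng = np.random.default_rng(42)
    n_samples = 5000
    # uncertain inputs: mean and standard deviation of each parameter (placeholders)
    means = np.array([0.5, 100.0, 3.0])
    stds = np.array([0.05, 5.0, 0.3])
    samples = rng.normal(means, stds, size=(n_samples, 3))

    outputs = np.array([simulation(p) for p in samples])

    print("output mean  :", outputs.mean())
    print("output 95% CI:", np.percentile(outputs, [2.5, 97.5]))
    # crude sensitivity measure: correlation of each input with the output
    for name, col in zip(["k", "q", "t"], samples.T):
        print(f"corr({name}, output) = {np.corrcoef(col, outputs)[0, 1]:+.2f}")
    ```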

  16. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    Science.gov (United States)

    Jones, Rachael M; Nicas, Mark

    2006-03-01

    COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labor Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.
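
    Written out, the minimal margin defined above is simply a concentration ratio. In the notation below (symbols introduced here for clarity, not taken from the paper), the numerator is the lowest airborne concentration producing the toxicological endpoint in experimental animals and the denominator is the highest workplace-air concentration permitted by the corresponding exposure band:

    \[ \text{minimal margin} = \frac{C^{\mathrm{tox}}_{\min}}{C^{\mathrm{band}}_{\max}} \]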

  17. Designing a Portable and Low Cost Home Energy Management Toolkit

    NARCIS (Netherlands)

    Keyson, D.V.; Al Mahmud, A.; De Hoogh, M.; Luxen, R.

    2013-01-01

    In this paper we describe the design of a home energy and comfort management system. The system has three components such as a smart plug with a wireless module, a residential gateway and a mobile app. The combined system is called a home energy management and comfort toolkit. The design is inspired

  18. Report of the Los Alamos accelerator automation application toolkit workshop

    International Nuclear Information System (INIS)

    Clout, P.; Daneels, A.

    1990-01-01

    A 5 day workshop was held in November 1988 at Los Alamos National Laboratory to address the viability of providing a toolkit optimized for building accelerator control systems. The workshop arose from work started independently at Los Alamos and CERN. This paper presents the discussion and the results of the meeting. (orig.)

  19. Livermore Big Artificial Neural Network Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-01

    LBANN is a toolkit designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key high performance computing features to accelerate neural network training, specifically low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open-source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library, which is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  20. Sierra Toolkit Manual Version 4.48.

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Toolkit Team

    2018-03-01

    This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.

  1. RAVE-a Detector-independent vertex reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, Wolfgang [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at; Mitaroff, Winfried; Moser, Fabian [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)

    2007-10-21

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  2. Tips from the toolkit: 1 - know yourself.

    Science.gov (United States)

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  3. RAVE-a Detector-independent vertex reconstruction toolkit

    International Nuclear Information System (INIS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-01-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available

  4. FATES: a flexible analysis toolkit for the exploration of single-particle mass spectrometer data

    Science.gov (United States)

    Sultana, Camille M.; Cornwell, Gavin C.; Rodriguez, Paul; Prather, Kimberly A.

    2017-04-01

    Single-particle mass spectrometer (SPMS) analysis of aerosols has become increasingly popular since its invention in the 1990s. Today many iterations of commercial and lab-built SPMSs are in use worldwide. However, supporting analysis toolkits for these powerful instruments are outdated, have limited functionality, or are versions that are not available to the scientific community at large. In an effort to advance this field and allow better communication and collaboration between scientists, we have developed FATES (Flexible Analysis Toolkit for the Exploration of SPMS data), a MATLAB toolkit easily extensible to an array of SPMS designs and data formats. FATES was developed to minimize the computational demands of working with large data sets while still allowing easy maintenance, modification, and utilization by novice programmers. FATES permits scientists to explore, without constraint, complex SPMS data with simple scripts in a language popular for scientific numerical analysis. In addition, FATES contains an array of data visualization graphical user interfaces (GUIs) which can aid both novice and expert users in calibration of raw data; exploration of the dependence of mass spectral characteristics on size, time, and peak intensity; and investigations of clustered data sets.
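
    Exploration of clustered data sets is one of the tasks mentioned above. The snippet below is only a conceptual Python sketch of grouping particles by the similarity of their mass spectra (FATES itself is MATLAB-based and may use different clustering methods): each row is a normalized spectrum over integer m/z bins, clustered with k-means.

```python
# Conceptual sketch of clustering single-particle mass spectra
# (FATES is MATLAB-based; this Python example only illustrates the idea).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_particles, n_mz_bins = 500, 300

# Two synthetic particle "types" with peaks at different m/z positions
type_a = np.zeros(n_mz_bins); type_a[[23, 39, 56]] = 1.0   # Na+, K+, Fe+ -like peaks
type_b = np.zeros(n_mz_bins); type_b[[12, 36, 60]] = 1.0   # carbon-cluster-like peaks
labels_true = rng.integers(0, 2, n_particles)
spectra = np.where(labels_true[:, None] == 0, type_a, type_b)
spectra += 0.05 * rng.random((n_particles, n_mz_bins))      # noise

# Normalize each spectrum to unit total intensity, then cluster
spectra /= spectra.sum(axis=1, keepdims=True)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spectra)

print("cluster sizes:", np.bincount(km.labels_))
```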

  5. Determination of Equine Cytochrome c Backbone Amide Hydrogen/Deuterium Exchange Rates by Mass Spectrometry Using a Wider Time Window and Isotope Envelope.

    Science.gov (United States)

    Hamuro, Yoshitomo

    2017-03-01

    A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in the EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of the isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of the centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized the exchange rate and deuterium retention of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.
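
    The underlying kinetics can be summarized compactly. The expressions below are the standard first-order exchange model consistent with the EX2 mechanism discussed in the abstract; the notation is introduced here for illustration and is not taken from the paper.

```latex
% D_i(t): deuterium uptake of amide i after labeling time t, with observed exchange rate k_i
\[
  D_i(t) \;=\; 1 - e^{-k_i t},
  \qquad
  k_i \;\approx\; \frac{k_{\mathrm{op}}}{k_{\mathrm{cl}}}\,k_{\mathrm{ch}}
  \;=\; K_{\mathrm{op}}\,k_{\mathrm{ch}}
  \quad\text{(EX2 limit, } k_{\mathrm{cl}} \gg k_{\mathrm{ch}}\text{)},
\]
% so the open/close equilibrium K_op follows from the measured k_i and the
% sequence-dependent intrinsic rate k_ch. The centroid deuteration of a peptide
% covering amides i = 1..n, with deuterium retention (back-exchange) factor r, is
\[
  D_{\mathrm{pep}}(t) \;=\; r \sum_{i=1}^{n} \bigl(1 - e^{-k_i t}\bigr),
\]
% while the observed isotope envelope is the natural-abundance envelope convolved
% with the distribution of deuterium counts implied by the per-amide probabilities
% p_i(t) = 1 - exp(-k_i t); fitting envelopes rather than centroids therefore
% constrains the individual k_i rather than only their sum.
```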

  6. Assessing the wider environmental value of remediating land contamination

    NARCIS (Netherlands)

    Bardos, R.P.; Kearney, T.E.; Nathanail, C.P.; Weenk, A.; Martin, I.D.

    2000-01-01

    The aim of this paper is to consider qualitative and quantitative approaches for assessing the wider environmental value of remediating land contamination. In terms of the environmental element of sustainable development, a remediation project's overall environmental performance is the sum of the

  7. Strengthening Coastal Pollution Management in the Wider Caribbean Region

    NARCIS (Netherlands)

    Lavieren, van H.; Metcalfe, C.D.; Drouillard, K.; Sale, P.; Gold-Bouchot, G.; Reid, R.; Vermeulen, L.C.

    2011-01-01

    Control of aquatic pollution is critical for improving coastal zone management and for the conservation of fisheries resources. Countries in the Wider Caribbean Region (WCR) generally lack monitoring capacity and do not have reliable information on the levels and distribution of pollutants,

  8. BioWarehouse: a bioinformatics database warehouse toolkit.

    Science.gov (United States)

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.
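
    The kind of multi-database SQL query BioWarehouse enables can be illustrated with a toy warehouse. The schema and table names below are hypothetical stand-ins (BioWarehouse's actual schema is richer and runs on MySQL or Oracle); the point is simply that once several source databases share one relational schema, a single join answers a cross-database question such as the enzyme-coverage example above.

```python
# Toy illustration of a warehouse-style cross-dataset SQL query.
# Table and column names are hypothetical, not BioWarehouse's actual schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY, name TEXT);          -- e.g. from ENZYME
    CREATE TABLE protein (id INTEGER PRIMARY KEY, ec_number TEXT, sequence TEXT);  -- e.g. from UniProt
    INSERT INTO enzyme_activity VALUES ('1.1.1.1', 'alcohol dehydrogenase'),
                                       ('4.2.1.17', 'enoyl-CoA hydratase'),
                                       ('9.9.9.9', 'hypothetical orphan activity');
    INSERT INTO protein VALUES (1, '1.1.1.1', 'MSTA...'), (2, '4.2.1.17', 'MKVL...');
""")

# Which enzyme activities have no sequence in the warehouse? (cf. the 36% result)
orphans = con.execute("""
    SELECT a.ec_number, a.name
    FROM enzyme_activity AS a
    LEFT JOIN protein AS p ON p.ec_number = a.ec_number
    WHERE p.id IS NULL
""").fetchall()
print(orphans)   # [('9.9.9.9', 'hypothetical orphan activity')]
```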

  9. The PRIDE (Partnership to Improve Diabetes Education) Toolkit: Development and Evaluation of Novel Literacy and Culturally Sensitive Diabetes Education Materials.

    Science.gov (United States)

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L

    2016-02-01

    Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a "superior" score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. © 2015 The Author(s).
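
    Readability formulas like those used to evaluate the PRIDE modules can be computed in a few lines of code. The sketch below implements the standard Flesch-Kincaid grade-level formula with a simple vowel-group syllable heuristic; it is a generic illustration, not the authors' evaluation script, and the example sentence is made up.

```python
# Generic Flesch-Kincaid grade level estimate (illustrative; not the authors' tooling).
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels, at least one per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

sample = "Check your blood sugar before meals. Write the number in your log book."
print(round(flesch_kincaid_grade(sample), 1))
```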

  10. A universal postprocessing toolkit for accelerator simulation and data analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1998-01-01

    The Self-Describing Data Sets (SDDS) toolkit comprises about 70 generally applicable programs sharing a common data protocol. At the Advanced Photon Source (APS), SDDS performs the vast majority of operational data collection and processing, most data display functions, and many control functions. In addition, a number of accelerator simulation codes use SDDS for all post-processing and data display. This has three principal advantages: first, simulation codes need not provide customized post-processing tools, thus simplifying development and maintenance. Second, users can enhance code capabilities without changing the code itself, by adding SDDS-based pre- and post-processing. Third, multiple codes can be used together more easily, by employing SDDS for data transfer and adaptation. Given its broad applicability, the SDDS file protocol is surprisingly simple, making it quite easy for simulations to generate SDDS-compliant data. This paper discusses the philosophy behind SDDS, contrasting it with some recent trends, and outlines the capabilities of the toolkit. The paper also gives examples of using SDDS for accelerator simulation.
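
    The value of a self-describing protocol is that any downstream tool can interpret a file from its header alone. The sketch below is a toy text format written and read in Python purely to illustrate the concept; it is not the actual SDDS protocol or its library bindings.

```python
# Toy self-describing column file (concept only; not the real SDDS format or API).
import io

def write_table(f, columns, rows):
    # Header lines declare each column's name and type, so readers need no prior knowledge.
    for name, typ in columns:
        f.write(f"&column name={name}, type={typ} &end\n")
    f.write("&data &end\n")
    for row in rows:
        f.write(" ".join(str(v) for v in row) + "\n")

def read_table(f):
    columns, rows = [], []
    for line in f:
        line = line.strip()
        if line.startswith("&column"):
            fields = dict(p.split("=") for p in line[len("&column"):].rstrip("&end").strip().split(", "))
            columns.append((fields["name"], fields["type"]))
        elif line.startswith("&data"):
            continue
        elif line:
            casts = [float if t == "double" else str for _, t in columns]
            rows.append([cast(v) for cast, v in zip(casts, line.split())])
    return columns, rows

buf = io.StringIO()
write_table(buf, [("s", "double"), ("betax", "double")], [(0.0, 10.2), (1.5, 12.7)])
buf.seek(0)
print(read_table(buf))
```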

  11. Making Schools the Model for Healthier Environments Toolkit: What It Is

    Science.gov (United States)

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  12. Audit: Automated Disk Investigation Toolkit

    Directory of Open Access Journals (Sweden)

    Umit Karabiyik

    2014-09-01

    Full Text Available Software tools designed for disk analysis play a critical role today in forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this paper, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology) and expert investigators. Our proof of concept design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image.
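
    One building block of automated disk analysis is recognizing file content by signature rather than by name or extension. The sketch below is a generic, self-contained illustration of scanning a raw image for a few well-known magic bytes; it is not AUDIT's interface, the path "disk.img" is hypothetical, and the signature list is deliberately tiny.

```python
# Generic magic-byte scan over a raw image (illustrative; not AUDIT's interface).
SIGNATURES = {
    b"\xff\xd8\xff": "JPEG image",
    b"%PDF-": "PDF document",
    b"PK\x03\x04": "ZIP/Office container",
}

def scan_image(path):
    # For brevity the whole image is read into memory; a real tool would stream it in chunks.
    with open(path, "rb") as f:
        data = f.read()
    for sig, desc in SIGNATURES.items():
        pos = data.find(sig)
        while pos != -1:
            yield pos, desc
            pos = data.find(sig, pos + 1)

if __name__ == "__main__":
    # "disk.img" is a hypothetical raw image path used only for illustration.
    for offset, desc in sorted(scan_image("disk.img")):
        print(f"{desc:20s} at byte offset {offset}")
```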

  13. Falling Less in Kansas: Development of a Fall Risk Reduction Toolkit

    Directory of Open Access Journals (Sweden)

    Teresa S. Radebaugh

    2011-01-01

    Full Text Available Falls are a serious health risk for older adults. But for those living in rural and frontier areas of the USA, the risks are higher because of limited access to health care providers and resources. This study employed a community-based participatory research approach to develop a fall prevention toolkit to be used by residents of rural and frontier areas without the assistance of health care providers. Qualitative data were gathered from both key informant interviews and focus groups with a broad range of participants. Data analysis revealed that to be effective and accepted, the toolkit should be not only evidence based but also practical, low-cost, self-explanatory, and usable without the assistance of a health care provider. Materials must be engaging, visually interesting, empowering, sensitive to reading level, and appropriate for low-vision users. These findings should be useful to other researchers developing education and awareness materials for older adults in rural areas.

  14. Evidence-based Metrics Toolkit for Measuring Safety and Efficiency in Human-Automation Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — APRIL 2016 NOTE: Principal Investigator moved to Rice University in mid-2015. Project continues at Rice with the same title (Evidence-based Metrics Toolkit for...

  15. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numeric and/or character data, stored in raw data files, databases, or streams of bytes, or compressed into bits in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  16. An Extended Design of the "Grid-Enabled SEE++ System" Based on Globus Toolkit 4 and gLite Conference

    CERN Document Server

    Schreiner, W.; Buchberger, M.; Kaltofen, T.

    2006-01-01

    "Grid-Enabled SEE++" based on the SEE++ software system for the biomechanical 3D simulation of the human eye and its muscles. SEE++ simulates the common eye muscle surgery techniques in a graphic interactive way that is familiar to an experienced surgeon. The goal of "Grid-Enabled SEE++" is to adapt and to extend SEE++ in several steps and to develop an efficient grid-based tool for "Evidence Based Medicine", which supports the surgeons in choosing optimal surgery techniques for the treatments of different syndromes of strabismus. In our previous work, we combined the SEE++ software with the Globus (pre-Web Service) middleware and developed a parallel version of the simulation of the "Hess-Lancaster test" (typical medical examination). By this, we demonstrated how a noticeable speedup can be achieved in SEE++ by the exploitation of the computational power of the Grid. Furthermore, we reported the prototype implementation of a medical database component for "Grid-Enabled SEE++". Finally, we designed a so calle...

  17. Methodology for the development of a taxonomy and toolkit to evaluate health-related habits and lifestyle (eVITAL)

    Directory of Open Access Journals (Sweden)

    Walsh Carolyn O

    2010-03-01

    Full Text Available Abstract Background Chronic diseases cause an ever-increasing percentage of morbidity and mortality, but many have modifiable risk factors. Many behaviors that predispose or protect an individual to chronic disease are interrelated, and therefore are best approached using an integrated model of health and the longevity paradigm, using years lived without disability as the endpoint. Findings This study used a 4-phase mixed qualitative design to create a taxonomy and related online toolkit for the evaluation of health-related habits. Core members of a working group conducted a literature review and created a framing document that defined relevant constructs. This document was revised, first by a working group and then by a series of multidisciplinary expert groups. The working group and expert panels also designed a systematic evaluation of health behaviors and risks, which was computerized and evaluated for feasibility. A demonstration study of the toolkit was performed in 11 healthy volunteers. Discussion In this protocol, we used forms of the community intelligence approach, including frame analysis, feasibility, and demonstration, to develop a clinical taxonomy and an online toolkit with standardized procedures for screening and evaluation of multiple domains of health, with a focus on longevity and the goal of integrating the toolkit into routine clinical practice. Trial Registration IMSERSO registry 200700012672

  18. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    Science.gov (United States)

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering(1) Research Experiences for Undergraduates(2) (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community with the purpose being to provide targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. Copyright © 2016. Published by Elsevier Ltd.

  19. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    Directory of Open Access Journals (Sweden)

    Jonathan J Helmus

    2016-07-01

    Full Text Available The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. The source code for the toolkit is available on GitHub and is distributed under a BSD license.
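
    A minimal Py-ART session follows the pattern read, display, save. The sketch below assumes a CF/Radial NetCDF file at a hypothetical path; the calls shown (pyart.io.read and pyart.graph.RadarDisplay) are part of the toolkit's documented interface, but plotting options should be checked against the Py-ART documentation.

```python
# Minimal Py-ART sketch: read a radar volume and plot one field.
# The file path is hypothetical; any format supported by pyart.io.read works.
import matplotlib.pyplot as plt
import pyart

radar = pyart.io.read("example_cfradial.nc")         # returns a pyart.core.Radar object
print(radar.fields.keys())                           # e.g. dict_keys(['reflectivity', ...])

display = pyart.graph.RadarDisplay(radar)
fig, ax = plt.subplots(figsize=(6, 5))
display.plot("reflectivity", sweep=0, ax=ax, vmin=-8, vmax=64)   # lowest elevation sweep
plt.savefig("reflectivity_sweep0.png", dpi=150)
```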

  20. Innovations in oral health: A toolkit for interprofessional education.

    Science.gov (United States)

    Dolce, Maria C; Parker, Jessica L; Werrlein, Debra T

    2017-05-01

    The integration of oral health competencies into non-dental health professions curricula can serve as an effective driver for interprofessional education (IPE). The purpose of this report is to describe a replicable oral-health-driven IPE model and corresponding online toolkit, both of which were developed as part of the Innovations in Oral Health (IOH): Technology, Instruction, Practice, and Service programme at Bouvé College of Health Sciences, Northeastern University, USA. Tooth decay is a largely preventable disease that is connected to overall health and wellness, and it affects the majority of adults and a fifth of children in the United States. To prepare all health professionals to address this problem, the IOH model couples programming from the online resource Smiles for Life: A National Oral Health Curriculum with experiential learning opportunities designed for undergraduate and graduate students that include simulation-learning (technology), hands-on workshops and didactic sessions (instruction), and opportunities for both cooperative education (practice) and community-based learning (service). The IOH Toolkit provides the means for others to replicate portions of the IOH model or to establish a large-scale IPE initiative that will support the creation of an interprofessional workforce-one equipped with oral health competencies and ready for collaborative practice.

  1. An adaptive toolkit for image quality evaluation in system performance test of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, Guozhi; Petrov, Dimitar; Marshall, Nicholas; Bosmans, Hilde

    2017-03-01

    Digital breast tomosynthesis (DBT) is a relatively new diagnostic imaging modality for women. Currently, various models of DBT systems are available on the market and the number of installations is rapidly increasing. EUREF, the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services, has proposed a preliminary Guideline - protocol for the quality control of the physical and technical aspects of digital breast tomosynthesis systems, with an ultimate aim of providing limiting values guaranteeing proper performance for different applications of DBT. In this work, we introduce an adaptive toolkit developed in accordance with this guideline to facilitate the process of image quality evaluation in DBT performance test. This toolkit implements robust algorithms to quantify various technical parameters of DBT images and provides a convenient user interface in practice. Each test is built into a separate module with configurations set corresponding to the European guideline, which can be easily adapted to different settings and extended with additional tests. This toolkit largely improves the efficiency for image quality evaluation of DBT. It is also going to evolve with the development of protocols in quality control of DBT systems.

  2. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    Science.gov (United States)

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
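
    As a flavor of how ODDT is used from Python, the snippet below reads ligands from an SDF file and a receptor from a PDB file using the toolkit's pybel-style readfile interface. The file names are placeholders, and the exact call signatures and the protein flag should be verified against the ODDT documentation; this is a hedged sketch, not a definitive usage pattern.

```python
# Minimal ODDT sketch: load molecules for a CADD pipeline (file names are placeholders).
# readfile mirrors ODDT's pybel-style interface; verify signatures against the ODDT docs.
import oddt

ligands = list(oddt.toolkit.readfile("sdf", "ligands.sdf"))
print(f"loaded {len(ligands)} ligands")

receptor = next(oddt.toolkit.readfile("pdb", "receptor.pdb"))
receptor.protein = True   # mark the molecule as a protein so downstream scoring treats it as such
```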

  3. SwingStates: adding state machines to the swing toolkit

    OpenAIRE

    Appert , Caroline; Beaudouin-Lafon , Michel

    2006-01-01

    International audience; This article describes SwingStates, a library that adds state machines to the Java Swing user interface toolkit. Unlike traditional approaches, which use callbacks or listeners to define interaction, state machines provide a powerful control structure and localize all of the interaction code in one place. SwingStates takes advantage of Java's inner classes, providing programmers with a natural syntax and making it easier to follow and debug the resulting code. SwingSta...
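
    The contrast with listener-based code is easiest to see in a tiny state machine. The sketch below is a language-neutral illustration written in Python rather than Java, so it does not use the SwingStates API: a press-drag-release interaction is expressed as explicit states and transitions in one place instead of being scattered across callbacks.

```python
# Conceptual press-drag-release state machine (illustrative; not the SwingStates API).
class DragStateMachine:
    def __init__(self):
        self.state = "idle"
        self.origin = None

    def handle(self, event, pos):
        # All interaction logic lives in one transition table instead of separate listeners.
        if self.state == "idle" and event == "press":
            self.state, self.origin = "pressed", pos
        elif self.state == "pressed" and event == "move":
            self.state = "dragging"
        elif self.state == "dragging" and event == "move":
            print(f"dragging from {self.origin} to {pos}")
        elif self.state in ("pressed", "dragging") and event == "release":
            print(f"drop at {pos}" if self.state == "dragging" else "click")
            self.state, self.origin = "idle", None

sm = DragStateMachine()
for ev, p in [("press", (10, 10)), ("move", (12, 15)), ("move", (30, 40)), ("release", (30, 40))]:
    sm.handle(ev, p)
```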

  4. Setting live coding performance in wider historical contexts

    OpenAIRE

    Norman, Sally Jane

    2016-01-01

    This paper sets live coding in the wider context of performing arts, construed as the poetic modelling and projection of liveness. Concepts of liveness are multiple, evolving, and scale-dependent: entities considered live from different cultural perspectives range from individual organisms and social groupings to entire ecosystems, and consequently reflect diverse temporal and spatial orders. Concepts of liveness moreover evolve with our tools, which generate and reveal new senses and places ...

  5. Wider Opportunities for Women Nontraditional Work Programs: A Guide.

    Science.gov (United States)

    Wider Opportunities for Women, Inc., Washington, DC.

    Since 1970, Wider Opportunities for Women (WOW), in Washington, D.C., has conducted programs to train and place disadvantaged women in nontraditional jobs. The results have been record-breaking: high placement rates, high job retention rates, good starting salaries, and upward mobility for women who seemed doomed to a life of poverty and…

  6. BioWarehouse: a bioinformatics database warehouse toolkit

    Directory of Open Access Journals (Sweden)

    Stringer-Calvert David WJ

    2006-03-01

    Full Text Available Abstract Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion BioWarehouse embodies significant progress on the

  7. Brownfields to green fields: Realising wider benefits from practical contaminant phytomanagement strategies.

    Science.gov (United States)

    Cundy, A B; Bardos, R P; Puschenreiter, M; Mench, M; Bert, V; Friesl-Hanl, W; Müller, I; Li, X N; Weyens, N; Witters, N; Vangronsveld, J

    2016-12-15

    Gentle remediation options (GROs) are risk management strategies or technologies involving plant (phyto-), fungi (myco-), and/or bacteria-based methods that result in a net gain (or at least no gross reduction) in soil function as well as effective risk management. GRO strategies can be customised along contaminant linkages, and can generate a range of wider economic, environmental and societal benefits in contaminated land management (and in brownfields management more widely). The application of GROs as practical on-site remedial solutions is still limited however, particularly in Europe and at trace element (typically metal and metalloid) contaminated sites. This paper discusses challenges to the practical adoption of GROs in contaminated land management, and outlines the decision support tools and best practice guidance developed in the European Commission FP7-funded GREENLAND project aimed at overcoming these challenges. The GREENLAND guidance promotes a refocus from phytoremediation to wider GROs- or phyto-management based approaches which place realisation of wider benefits at the core of site design, and where gentle remediation technologies can be applied as part of integrated, mixed, site risk management solutions or as part of "holding strategies" for vacant sites. The combination of GROs with renewables, both in terms of biomass generation but also with green technologies such as wind and solar power, can provide a range of economic and other benefits and can potentially support the return of low-level contaminated sites to productive usage, while combining GROs with urban design and landscape architecture, and integrating GRO strategies with sustainable urban drainage systems and community gardens/parkland (particularly for health and leisure benefits), has large potential for triggering GRO application and in realising wider benefits in urban and suburban systems. Quantifying these wider benefits and value (above standard economic returns) will be

  8. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Science.gov (United States)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  9. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Caroline Draxl: NREL

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as being time-synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.

  10. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  11. A framework for a teaching toolkit in entrepreneurship education.

    Science.gov (United States)

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.

  12. OGSA Globus Toolkits evaluation activity at CERN

    CERN Document Server

    Chen, D; Foster, D; Kalyaev, V; Kryukov, A; Lamanna, M; Pose, V; Rocha, R; Wang, C

    2004-01-01

    An Open Grid Service Architecture (OGSA) Globus Toolkit 3 (GT3) evaluation group has been active at CERN since GT3 became available in an early beta version (Spring 2003). This activity focuses on the evaluation of the technology promised by the OGSA/OGSI paradigm, and of GT3 in particular. The aim is to study this new technology and its implications in order to provide useful input for the large grid initiatives active in the LHC Computing Grid (LCG) project. A particular effort has been devoted to investigating performance and deployment issues, keeping in mind the LCG requirements, in particular scalability and robustness.

  13. REST: a toolkit for resting-state functional magnetic resonance imaging data processing.

    Directory of Open Access Journals (Sweden)

    Xiao-Wei Song

    Full Text Available Resting-state fMRI (RS-fMRI) has been drawing more and more attention in recent years. However, a publicly available, systematically integrated and easy-to-use tool for RS-fMRI data processing is still lacking. We developed a toolkit for the analysis of RS-fMRI data, namely the RESting-state fMRI data analysis Toolkit (REST). REST was developed in MATLAB with a graphical user interface (GUI). After data preprocessing with SPM or AFNI, a few analytic methods can be performed in REST, including functional connectivity analysis based on linear correlation, regional homogeneity, amplitude of low frequency fluctuation (ALFF), and fractional ALFF. A few additional functions were implemented in REST, including a DICOM sorter, linear trend removal, bandpass filtering, time course extraction, regression of covariates, image calculator, statistical analysis, and slice viewer (for result visualization, multiple comparison correction, etc.). REST is an open-source package and is freely available at http://www.restfmri.net.
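
    The core of correlation-based functional connectivity is simple to state in code. REST itself is a MATLAB toolbox, so the sketch below is only a conceptual Python illustration: given ROI time series (time points by regions), the connectivity matrix is the pairwise Pearson correlation of the columns.

```python
# Conceptual functional-connectivity sketch (REST is MATLAB-based; this is only illustrative).
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_rois = 200, 5

# Synthetic ROI time series: ROIs 0 and 1 share a common slow signal
common = rng.normal(size=n_timepoints)
ts = rng.normal(size=(n_timepoints, n_rois))
ts[:, 0] += 2 * common
ts[:, 1] += 2 * common

# Pairwise Pearson correlation between ROI time courses (columns)
fc_matrix = np.corrcoef(ts, rowvar=False)
print(np.round(fc_matrix, 2))     # entry [0, 1] is clearly elevated
```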

  14. Developing a proxy version of the Adult social care outcome toolkit (ASCOT).

    Science.gov (United States)

    Rand, Stacey; Caiels, James; Collins, Grace; Forder, Julien

    2017-05-19

    Social care-related quality of life is a key outcome indicator used in the evaluation of social care interventions and policy. It is not, however, always possible to collect quality of life data by self-report even with adaptations for people with cognitive or communication impairments. A new proxy-report version of the Adult Social Care Outcomes Toolkit (ASCOT) measure of social care-related quality of life was developed to address the issues of wider inclusion of people with cognitive or communication difficulties who may otherwise be systematically excluded. The development of the proxy-report ASCOT questionnaire was informed by literature review and earlier work that identified the key issues and challenges associated with proxy-reported outcomes. To evaluate the acceptability and content validity of the ASCOT-Proxy, qualitative cognitive interviews were conducted with unpaid carers or care workers of people with cognitive or communication impairments. The proxy respondents were invited to 'think aloud' while completing the questionnaire. Follow-up probes were asked to elicit further detail of the respondent's comprehension of the format, layout and content of each item and also how they weighed up the options to formulate a response. A total of 25 unpaid carers and care workers participated in three iterative rounds of cognitive interviews. The findings indicate that the items were well-understood and the concepts were consistent with the item definitions for the standard self-completion version of ASCOT with minor modifications to the draft ASCOT-Proxy. The ASCOT-Proxy allows respondents to rate the proxy-proxy and proxy-patient perspectives, which improved the acceptability of proxy report. A new proxy-report version of ASCOT was developed with evidence of its qualitative content validity and acceptability. The ASCOT-Proxy is ready for empirical testing of its suitability for data collection as a self-completion and/or interview questionnaire, and also

  15. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or ftp interfaces. Empirical models are mostly fortran based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.

  16. A Teacher Tablet Toolkit to meet the challenges posed by 21st ...

    African Journals Online (AJOL)

    The course, as outcome, is presented as a Teacher Tablet Toolkit, designed to meet the challenges inherent to the 21st century rural technology enhanced teaching and learning environment. The paper documents and motivates design decisions, derived from literature and adapted through three iterations of a Design ...

  17. A toolkit for computerized operating procedure of complex industrial systems with IVI-COM technology

    International Nuclear Information System (INIS)

    Zhou Yangping; Dong Yujie; Huang Xiaojing; Ye Jingliang; Yoshikawa, Hidekazu

    2013-01-01

    A human interface toolkit is proposed to help users develop computerized operating procedures for complex industrial systems such as Nuclear Power Plants (NPPs). Coupled with a friendly graphical interface, this integrated tool includes a database, a procedure editor and a procedure executor. A three-layer hierarchy is adopted to express the complexity of an operating procedure: mission, process and node. There are 10 kinds of node: entrance, exit, hint, manual input, detector, actuator, data treatment, branch, judgment and plug-in. The computerized operating procedure senses and actuates the actual industrial system through an interface based on IVI-COM (Interchangeable Virtual Instrumentation-Component Object Model) technology. A prototype system of this human interface toolkit has been applied to develop a simple computerized operating procedure for a simulated NPP. (author)

  18. PS1-29: Resources to Facilitate Multi-site Collaboration: the PRIMER Research Toolkit

    Science.gov (United States)

    Greene, Sarah; Thompson, Ella; Baldwin, Laura-Mae; Neale, Anne Victoria; Dolor, Rowena

    2010-01-01

    repository: www.ResearchToolkit.org, which is comprised of over 120 distinct resources. Conclusions: We are disseminating the ResearchToolkit website via academic and media channels, and identifying options for making it a sustainable resource. Given the dynamic nature of the research enterprise, maintenance and accuracy of a web-based resource is challenging. Still, the positive response to the toolkit suggests that there is high interest in sustaining it. We will demonstrate the Toolkit as part of this conference.

  19. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.

    Science.gov (United States)

    Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen

    2010-12-21

    There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases

  20. Object Toolkit Version 4.3 User’s Manual

    Science.gov (United States)

    2016-12-31

    [Excerpts from the manual] Object Toolkit is used with EPIC and with Nascap-2k; see the EPIC and Nascap-2k manuals for instructions. Most of the difficulties that users have encountered with Object Toolkit are [...]. Section 12.3, "Importing Components From a NX I-DEAS TMG ASCII VUFF File": users of the NX I-DEAS TMG thermal analysis program can import the ASCII [...] via the Nascap-2k user interface; the meaning of these properties is discussed in the Nascap-2k User's Manual (Figure 36, Detector Properties Dialog Box).

  1. The MOLGENIS toolkit : rapid prototyping of biosoftware at the push of a button

    NARCIS (Netherlands)

    Swertz, Morris A.; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K.; Kanterakis, Alexandros; Roos, Erik T.; Lops, Joris; Thorisson, Gudmundur A.; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J.; de Brock, Engbert O.; Jansen, Ritsert C.; Parkinson, Helen

    2010-01-01

    Background: There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly

  2. Peer support for families of children with complex needs: Development and dissemination of a best practice toolkit.

    Science.gov (United States)

    Schippke, J; Provvidenza, C; Kingsnorth, S

    2017-11-01

    Benefits of peer support interventions for families of children with disabilities and complex medical needs have been described in the literature. An opportunity to create an evidence-informed resource to synthesize best practices in peer support for program providers was identified. The objective of this paper is to describe the key activities used to develop and disseminate the Peer Support Best Practice Toolkit. This project was led by a team of knowledge translation experts at a large pediatric rehabilitation hospital using a knowledge exchange framework. An integrated knowledge translation approach was used to engage stakeholders in the development process through focus groups and a working group. To capture best practices in peer support, a rapid evidence review and review of related resources were completed. Case studies were also included to showcase practice-based evidence. The toolkit is freely available online for download and is structured into four sections: (a) background and models of peer support, (b) case studies of programs, (c) resources, and (d) rapid evidence review. A communications plan was developed to disseminate the resource and generate awareness through presentations, social media, and champion engagement. Eight months postlaunch, the peer support website received more than 2,400 webpage hits. Early indicators suggest high relevance of this resource among stakeholders. The toolkit format was valuable to synthesize and share best practices in peer support. Strengths of the work include the integrated approach used to develop the toolkit and the inclusion of both the published research literature and experiential evidence. © 2017 John Wiley & Sons Ltd.

  3. Proceedings of the Military Operational Medicine Research Program Return to Duty (RTD) Toolkit Expert Panel Workshop, 16-17 February 2017

    Science.gov (United States)

    2017-07-10

    [Excerpts from the proceedings] Contributors include the Oak Ridge Institute for Science and Education and the United States Army Aeromedical Research Laboratory, Aircrew Health and Performance Division. Workshop objectives were 1) [...] excluded from the RTD Toolkit Manual, 2) to identify any additional tasks and clinical assessments for inclusion in the RTD Toolkit Manual, and 3) to agree [...]. The work was administered through the Oak Ridge Institute for Science and Education under an interagency agreement between the U.S. Department of Energy and the U.S. Army Medical Research and Materiel [...]

  4. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with the opportunity for hands-on demonstrations. Brief descriptions of the tools are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of a beam transported to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  5. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, W; Mitaroff, W; Moser, F; Pflugfelder, B; Riedel, H V [Austrian Academy of Sciences, Institute of High Energy Physics, A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at

    2008-07-15

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  6. Needs assessment: blueprint for a nurse graduate orientation employer toolkit.

    Science.gov (United States)

    Cylke, Katherine

    2012-01-01

    Southern Nevada nurse employers are resistant to hiring new graduate nurses (NGNs) because of their difficulties in making the transition into the workplace. At the same time, employers consider nurse residencies cost-prohibitive. Therefore, an alternative strategy was developed to assist employers with increasing the effectiveness of existing NGN orientation programs. A needs assessment of NGNs, employers, and nursing educators was completed, and the results were used to develop a toolkit for employers.

  7. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of System Verilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  8. Evolving the US Climate Resilience Toolkit to Support a Climate-Smart Nation

    Science.gov (United States)

    Tilmes, C.; Niepold, F., III; Fox, J. F.; Herring, D.; Dahlman, L. E.; Hall, N.; Gardiner, N.

    2015-12-01

    Communities, businesses, resource managers, and decision-makers at all levels of government need information to understand and ameliorate climate-related risks. Likewise, climate information can expose latent opportunities. Moving from climate science to social and economic decisions raises complex questions about how to communicate the causes and impacts of climate variability and change; how to characterize and quantify vulnerabilities, risks, and opportunities faced by communities and businesses; and how to make and implement "win-win" adaptation plans at local, regional, and national scales. A broad coalition of federal agencies launched the U.S. Climate Resilience Toolkit (toolkit.climate.gov) in November 2014 to help our nation build resilience to climate-related extreme events. The site's primary audience is planners and decision makers in business, resource management, and government (at all levels) who seek science-based climate information and tools to help them in their near- and long-term planning. The Executive Office of the President assembled a task force of dozens of subject experts from across the 13 agencies of the U.S. Global Change Research Program to guide the site's development. The site's ongoing evolution is driven by feedback from the target audience. For example, based on feedback, climate projections will soon play a more prominent role in the site's "Climate Explorer" tool and case studies. The site's five-step adaptation planning process is being improved to better facilitate people getting started and to provide clear benchmarks for evaluating progress along the way. In this session, we will share lessons learned from a series of user engagements around the nation and evidence that the Toolkit couples climate information with actionable decision-making processes in ways that are helping Americans build resilience to climate-related stressors.

  9. Decision support toolkit for integrated analysis and design of reclaimed water infrastructure.

    Science.gov (United States)

    Lee, Eun Jung; Criddle, Craig S; Geza, Mengistu; Cath, Tzahi Y; Freyberg, David L

    2018-05-01

    Planning of water reuse systems is a complex endeavor. We have developed a software toolkit, IRIPT (Integrated Urban Reclaimed Water Infrastructure Planning Toolkit) that facilitates planning and design of reclaimed water infrastructure for both centralized and hybrid configurations that incorporate satellite treatment plants (STPs). The toolkit includes a Pipeline Designer (PRODOT) that optimizes routing and sizing of pipelines for wastewater capture and reclaimed water distribution, a Selector (SelWTP) that assembles and optimizes wastewater treatment trains, and a Calculator (CalcBenefit) that estimates fees, revenues, and subsidies of alternative designs. For hybrid configurations, a Locator (LocSTP) optimizes siting of STPs and associated wastewater diversions by identifying manhole locations where the flowrates are sufficient to ensure that wastewater extracted and treated at an adjacent STP can generate the revenue needed to pay for treatment and delivery to customers. Practical local constraints are also applied to screen and identify STP locations. Once suitable sites are selected, System Integrator (ToolIntegrator) identifies a set of centralized and hybrid configurations that: (1) maximize reclaimed water supply, (2) maximize reclaimed water supply while also ensuring a financial benefit for the system, and (3) maximize the net financial benefit for the system. The resulting configurations are then evaluated by an Analyst (SANNA) that uses monetary and non-monetary criteria, with weights assigned to appropriate metrics by a decision-maker, to identify a preferred configuration. To illustrate the structure, assumptions, and use of IRIPT, we apply it to a case study for the city of Golden, CO. The criteria weightings provided by a local decision-maker lead to a preference for a centralized configuration in this case. The Golden case study demonstrates that IRIPT can efficiently analyze centralized and hybrid water reuse configurations and rank them

  10. BIT: Biosignal Igniter Toolkit.

    Science.gov (United States)

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains with which they were traditionally associated. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics and electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily constrained by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real-time data acquisition and postprocessing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    International Nuclear Information System (INIS)

    Zhou, Y.; Zhu, X.; Wang, Y.; Mitra, S.

    2012-01-01

It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnances (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed, using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D-image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using the modeling, this article demonstrates the novelty of the tagged-neutron approach for extracting useful signals with high signal-to-background discrimination of an object-of-interest from that of its environment. Simulations indicated that a UXO filled with the RDX explosive, hexogen (C3H6O6N6), can be identified to a depth of 20 cm when buried in soil. (author)
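    The time-of-flight localization underlying the technique can be illustrated with a short calculation. The sketch below is a simplified illustration rather than the Geant4 model itself: it assumes the gamma-ray detector sits next to the neutron source, so the measured alpha-gamma time difference is the neutron flight time to the interaction point plus the gamma return time; the neutron speed is that of a 14 MeV neutron and the time values are hypothetical.

    ```python
    import numpy as np

    C_CM_PER_NS = 29.98      # speed of light
    V_N_CM_PER_NS = 5.2      # approximate speed of a 14 MeV neutron

    def interaction_depth(tof_ns):
        """Depth (cm) of the neutron interaction along the tagged direction.

        Assumes the gamma detector is co-located with the neutron source, so
        tof = d / v_n + d / c  =>  d = tof / (1/v_n + 1/c).
        """
        return tof_ns / (1.0 / V_N_CM_PER_NS + 1.0 / C_CM_PER_NS)

    # Hypothetical alpha-gamma time differences (ns) for a buried object
    for tof in (4.0, 6.0, 8.0):
        print(f"TOF {tof:4.1f} ns -> depth {interaction_depth(tof):5.1f} cm")
    ```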

  12. Does the Good Schools Toolkit Reduce Physical, Sexual and Emotional Violence, and Injuries, in Girls and Boys equally? A Cluster-Randomised Controlled Trial.

    Science.gov (United States)

    Devries, Karen M; Knight, Louise; Allen, Elizabeth; Parkes, Jenny; Kyegombe, Nambusi; Naker, Dipak

    2017-10-01

    We aimed to investigate whether the Good School Toolkit reduced emotional violence, severe physical violence, sexual violence and injuries from school staff to students, as well as emotional, physical and sexual violence between peers, in Ugandan primary schools. We performed a two-arm cluster randomised controlled trial with parallel assignment. Forty-two schools in one district were allocated to intervention (n = 21) or wait-list control (n = 21) arms in 2012. We did cross-sectional baseline and endline surveys in 2012 and 2014, and the Good School Toolkit intervention was implemented for 18 months between surveys. Analyses were by intention to treat and are adjusted for clustering within schools and for baseline school-level proportions of outcomes. The Toolkit was associated with an overall reduction in any form of violence from staff and/or peers in the past week towards both male (aOR = 0.34, 95%CI 0.22-0.53) and female students (aOR = 0.55, 95%CI 0.36-0.84). Injuries as a result of violence from school staff were also lower in male (aOR = 0.36, 95%CI 0.20-0.65) and female students (aOR = 0.51, 95%CI 0.29-0.90). Although the Toolkit seems to be effective at reducing violence in both sexes, there is some suggestion that the Toolkit may have stronger effects in boys than girls. The Toolkit is a promising intervention to reduce a wide range of different forms of violence from school staff and between peers in schools, and should be urgently considered for scale-up. Further research is needed to investigate how the intervention could engage more successfully with girls.

  13. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    OpenAIRE

    Helmus, Jonathan J; Collis, Scott M

    2016-01-01

    The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cy...
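    A minimal usage sketch of the toolkit's read-and-plot workflow is shown below; the input file name is hypothetical, and the fields available depend on the radar format.

    ```python
    import matplotlib.pyplot as plt
    import pyart

    # Read a radar volume (file name is hypothetical; Py-ART auto-detects the format)
    radar = pyart.io.read("example_radar_volume.nc")

    # Plot the reflectivity field for the lowest sweep
    display = pyart.graph.RadarDisplay(radar)
    fig = plt.figure(figsize=(6, 5))
    display.plot("reflectivity", sweep=0, vmin=-32, vmax=64)
    plt.show()
    ```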

  14. TMVA (Toolkit for Multivariate Analysis) new architectures design and implementation.

    CERN Document Server

    Zapata Mesa, Omar Andres

    2016-01-01

The Toolkit for Multivariate Analysis (TMVA) is a ROOT package of machine learning algorithms for classification and regression of events in detectors. In TMVA, we are developing new high-level algorithms for multivariate analysis, such as cross validation, hyper-parameter optimization, and variable importance. Almost all of these algorithms are computationally expensive and designed to process huge amounts of data, so it is important to exploit parallel computing technologies to reduce processing times.
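    The cross validation and hyper-parameter optimization described here can be illustrated with an analogous example in Python using scikit-learn rather than the TMVA/ROOT API itself: GridSearchCV performs the same k-fold search over a parameter grid and parallelizes it across cores.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import GridSearchCV

    # Toy "signal vs background" events
    X, y = make_classification(n_samples=5000, n_features=10, random_state=0)

    # k-fold cross validation + hyper-parameter grid, evaluated in parallel
    search = GridSearchCV(
        GradientBoostingClassifier(random_state=0),
        param_grid={"n_estimators": [100, 200], "max_depth": [2, 3]},
        cv=5,
        n_jobs=-1,            # use all available cores
    )
    search.fit(X, y)
    print(search.best_params_, search.best_score_)
    ```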

  15. Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms

    Science.gov (United States)

    1998-01-01

devices are exoskeletal in nature. They could be flexible, such as a glove or a suit worn by the user, or they could be rigid, such as jointed linkages ... the effectiveness of using force control needs to be investigated. The MAGIC Toolkit can be used to develop sensory tasks for rehabilitative medicine ...

  16. Mini-grid Policy Tool-kit. Policy and business frameworks for successful mini-grid roll-outs

    International Nuclear Information System (INIS)

    Franz, Michael; Hayek, Niklas; Peterschmidt, Nico; Rohrer, Michael; Kondev, Bozhil; Adib, Rana; Cader, Catherina; Carter, Andrew; George, Peter; Gichungi, Henry; Hankins, Mark; Kappiah, Mahama; Mangwengwende, Simbarashe E.

    2014-01-01

The Mini-grid Policy Tool-kit is for policy makers to navigate the mini-grid policy design process. It contains information on mini-grid operator models, the economics of mini-grids, and the policy and regulation that must be considered for successful implementation. The publication specifically focuses on Africa. Progress on extending the electricity grid in many countries has remained slow because of the high costs of grid extension and limited utility/state budgets for electrification. Mini-grids provide an affordable and cost-effective option to extend needed electricity services. Putting in place the right policy for mini-grid deployment requires considerable effort but can yield significant improvements in electricity access rates, as examples from Kenya, Senegal and Tanzania illustrate. The tool-kit is available in English, French and Portuguese.

  17. Supporting safe driving with arthritis: developing a driving toolkit for clinical practice and consumer use.

    Science.gov (United States)

    Vrkljan, Brenda H; Cranney, Ann; Worswick, Julia; O'Donnell, Siobhan; Li, Linda C; Gélinas, Isabelle; Byszewski, Anna; Man-Son-Hing, Malcolm; Marshall, Shawn

    2010-01-01

    We conducted a series of focus groups to explore the information needs of clinicians and consumers related to arthritis and driving. An open coding analysis identified common themes across both consumer and clinician-based focus groups that underscored the importance of addressing driving-related concerns and the challenges associated with assessing safety. The results revealed that although driving is critical for maintaining independence and community mobility, drivers with arthritis experience several problems that can affect safe operation of a motor vehicle. Findings from this study are part of a broader research initiative that will inform the development of the Arthritis and Driving toolkit. This toolkit outlines strategies to support safe mobility for people with arthritis and will be an important resource in the coming years given the aging population.

  18. Application of the SHARP Toolkit to Sodium-Cooled Fast Reactor Challenge Problems

    Energy Technology Data Exchange (ETDEWEB)

    Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Yu, Y. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Kim, T. K. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division

    2017-09-30

The Simulation-based High-efficiency Advanced Reactor Prototyping (SHARP) toolkit is under development by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign of the U.S. Department of Energy, Office of Nuclear Energy. To better understand and exploit the benefits of advanced modeling simulations, the NEAMS Campaign initiated the “Sodium-Cooled Fast Reactor (SFR) Challenge Problems” task, which includes the assessment of hot channel factors (HCFs) and the demonstration of zooming capability using the SHARP toolkit. If both challenge problems are resolved through advanced modeling and simulation using the SHARP toolkit, the economic competitiveness of an SFR can be significantly improved. The efforts in the first year of this project focused on the development of computational models, meshes, and coupling procedures for multi-physics calculations using the neutronics (PROTEUS) and thermal-hydraulic (Nek5000) components of the SHARP toolkit, as well as demonstration of the HCF calculation capability for the 100 MWe Advanced Fast Reactor (AFR-100) design. Testing the feasibility of the SHARP zooming capability is planned in FY 2018. The HCFs developed for the earlier SFRs (FFTF, CRBR, and EBR-II) were reviewed, and a subset of these were identified as potential candidates for reduction or elimination through high-fidelity simulations. A one-way offline coupling method was used to evaluate the HCFs, where the neutronics solver PROTEUS computes the power profile based on an assumed temperature, and the computational fluid dynamics solver Nek5000 evaluates the peak temperatures using the neutronics power profile. If the initial temperature profile used in the neutronics calculation is reasonably accurate, the one-way offline method is valid because the neutronics power profile has weak dependence on small temperature variation. In order to get more precise results, the proper temperature profile for initial neutronics calculations was obtained from the

  19. Between structures and norms : Assessing tax increment financing for the Dutch spatial planning toolkit

    NARCIS (Netherlands)

    Root, Liz; Van Der Krabben, Erwin; Spit, Tejo

    2015-01-01

    The aim of the paper is to assess the institutional (mis)fit of tax increment financing for the Dutch spatial planning financial toolkit. By applying an institutionally oriented assessment framework, we analyse the interconnectivity of Dutch municipal finance and spatial planning structures and

  20. Prevention literacy: community-based advocacy for access and ownership of the HIV prevention toolkit.

    Science.gov (United States)

    Parker, Richard G; Perez-Brumer, Amaya; Garcia, Jonathan; Gavigan, Kelly; Ramirez, Ana; Milnor, Jack; Terto, Veriano

    2016-01-01

    Critical technological advances have yielded a toolkit of HIV prevention strategies. This literature review sought to provide contextual and historical reflection needed to bridge the conceptual gap between clinical efficacy and community effectiveness (i.e. knowledge and usage) of existing HIV prevention options, especially in resource-poor settings. Between January 2015 and October 2015, we reviewed scholarly and grey literatures to define treatment literacy and health literacy and assess the current need for literacy related to HIV prevention. The review included searches in electronic databases including MEDLINE, PsycINFO, PubMed, and Google Scholar. Permutations of the following search terms were used: "treatment literacy," "treatment education," "health literacy," and "prevention literacy." Through an iterative process of analyses and searches, titles and/or abstracts and reference lists of retrieved articles were reviewed for additional articles, and historical content analyses of grey literature and websites were additionally conducted. Treatment literacy was a well-established concept developed in the global South, which was later partially adopted by international agencies such as the World Health Organization. Treatment literacy emerged as more effective antiretroviral therapies became available. Developed from popular pedagogy and grassroots efforts during an intense struggle for treatment access, treatment literacy addressed the need to extend access to underserved communities and low-income settings that might otherwise be excluded from access. In contrast, prevention literacy is absent in the recent surge of new biomedical prevention strategies; prevention literacy was scarcely referenced and undertheorized in the available literature. Prevention efforts today include multimodal techniques, which jointly comprise a toolkit of biomedical, behavioural, and structural/environmental approaches. However, linkages to community advocacy and mobilization

  1. A cost effective and high fidelity fluoroscopy simulator using the Image-Guided Surgery Toolkit (IGSTK)

    Science.gov (United States)

    Gong, Ren Hui; Jenkins, Brad; Sze, Raymond W.; Yaniv, Ziv

    2014-03-01

    The skills required for obtaining informative x-ray fluoroscopy images are currently acquired while trainees provide clinical care. As a consequence, trainees and patients are exposed to higher doses of radiation. Use of simulation has the potential to reduce this radiation exposure by enabling trainees to improve their skills in a safe environment prior to treating patients. We describe a low cost, high fidelity, fluoroscopy simulation system. Our system enables operators to practice their skills using the clinical device and simulated x-rays of a virtual patient. The patient is represented using a set of temporal Computed Tomography (CT) images, corresponding to the underlying dynamic processes. Simulated x-ray images, digitally reconstructed radiographs (DRRs), are generated from the CTs using ray-casting with customizable machine specific imaging parameters. To establish the spatial relationship between the CT and the fluoroscopy device, the CT is virtually attached to a patient phantom and a web camera is used to track the phantom's pose. The camera is mounted on the fluoroscope's intensifier and the relationship between it and the x-ray source is obtained via calibration. To control image acquisition the operator moves the fluoroscope as in normal operation mode. Control of zoom, collimation and image save is done using a keypad mounted alongside the device's control panel. Implementation is based on the Image-Guided Surgery Toolkit (IGSTK), and the use of the graphics processing unit (GPU) for accelerated image generation. Our system was evaluated by 11 clinicians and was found to be sufficiently realistic for training purposes.
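    At its core, a digitally reconstructed radiograph is a line integral of attenuation through the CT volume. The sketch below illustrates the idea with a parallel projection along one axis of a toy volume; the real simulator casts rays for the tracked fluoroscope pose with GPU acceleration, and the Hounsfield-to-attenuation conversion used here is a placeholder assumption.

    ```python
    import numpy as np

    def simple_drr(ct_hu, mu_water=0.02):
        """Parallel-projection DRR along the z axis of a CT volume in Hounsfield units.

        Converts HU to a rough linear attenuation coefficient (placeholder model),
        integrates along z, and maps to detector intensity with Beer-Lambert.
        """
        mu = mu_water * (1.0 + ct_hu / 1000.0)       # crude HU -> mu (per voxel)
        mu = np.clip(mu, 0.0, None)
        path_integral = mu.sum(axis=2)               # line integral along z
        return np.exp(-path_integral)                # transmitted intensity

    # Toy CT: air background with a dense cube inside
    ct = np.full((64, 64, 64), -1000.0)
    ct[20:40, 20:40, 20:40] = 300.0
    drr = simple_drr(ct)
    print(drr.shape, drr.min(), drr.max())
    ```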

  2. The Student Writing Toolkit: Enhancing Undergraduate Teaching of Scientific Writing in the Biological Sciences

    Science.gov (United States)

    Dirrigl, Frank J., Jr.; Noe, Mark

    2014-01-01

    Teaching scientific writing in biology classes is challenging for both students and instructors. This article offers and reviews several useful "toolkit" items that improve student writing. These include sentence and paper-length templates, funnelling and compartmentalisation, and preparing compendiums of corrections. In addition,…

  3. A Toolkit of Systems Gaming Techniques

    Science.gov (United States)

    Finnigan, David; McCaughey, Jamie W.

    2017-04-01

Decision-makers facing natural hazard crises need a broad set of cognitive tools to help them grapple with complexity. Systems gaming can act as a kind of 'flight simulator for decision making', enabling us to step through real-life complex scenarios of the kind that beset us in natural disaster situations. Australian science-theatre ensemble Boho Interactive is collaborating with the Earth Observatory Singapore to develop an in-person systems game modelling an unfolding natural hazard crisis (volcanic unrest or an approaching typhoon) impacting an Asian city. Through a combination of interactive mechanisms drawn from boardgaming and participatory theatre, players will make decisions and assign resources in response to the unfolding crisis. In this performance, David Finnigan from Boho will demonstrate some of the participatory techniques that Boho use to illustrate key concepts from complex systems science. These activities are part of a toolkit which can be adapted to fit a range of different contexts and scenarios. In this session, David will present short activities that demonstrate a range of systems principles including common-pool resource challenges (the Tragedy of the Commons), interconnectivity, unintended consequences, tipping points and phase transitions, and resilience. The interactive mechanisms for these games are all deliberately lo-fi rather than digital, for three reasons. First, the experience of a tactile, hands-on game is more immediate and engaging. It brings the focus of the participants into the room and facilitates engagement with the concepts and with each other, rather than with individual devices. Second, the mechanics of the game are laid bare. This is a valuable way to illustrate that complex systems are all around us, and are not merely the domain of hi-tech systems. Finally, these games can be used in a wide variety of contexts by removing computer hardware requirements and instead using materials and resources that are easily found in

  4. Clinical Trial of a Home Safety Toolkit for Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Kathy J. Horvath

    2013-01-01

Full Text Available This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n=60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles, and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n=48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control. Home safety was significant at P≤0.001, caregiver strain at P≤0.001, and caregiver self-efficacy at P=0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P≤0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care.

  5. A Toolkit to Study Sensitivity of the Geant4 Predictions to the Variations of the Physics Model Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fields, Laura [Fermilab; Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Kelsey, Michael [SLAC; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Wright, Dennis H. [SLAC; Yarba, Julia [Fermilab

    2017-08-21

    Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to the variations of the model parameters, or what uncertainties are associated with a particular tune of a Geant4 physics model, or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.
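    The workflow the toolkit automates (vary a model parameter, regenerate a physics observable, and compare each variant against experimental data) can be sketched generically. The sketch below is not the toolkit's API; the stand-in model, its parameter, and the data are placeholders, with a chi-square used as the comparison metric.

    ```python
    import numpy as np

    def model_observable(x, scale):
        """Stand-in for a simulated observable under one parameter setting."""
        return scale * np.exp(-x / 3.0)

    # Hypothetical measured spectrum with uncertainties
    x = np.linspace(0.5, 10.0, 20)
    data = 1.0 * np.exp(-x / 3.0) + np.random.default_rng(1).normal(0, 0.02, x.size)
    sigma = np.full_like(x, 0.02)

    # Scan the model parameter and score each variant against the data
    for scale in (0.8, 0.9, 1.0, 1.1, 1.2):
        pred = model_observable(x, scale)
        chi2 = np.sum(((pred - data) / sigma) ** 2)
        print(f"scale={scale:.1f}  chi2/ndf={chi2 / x.size:6.2f}")
    ```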

  6. The Nuclear Energy Advanced Modeling and Simulation Safeguards and Separations Reprocessing Plant Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    McCaskey, Alex [ORNL; Billings, Jay Jay [ORNL; de Almeida, Valmor F [ORNL

    2011-08-01

This report details the progress made in the development of the Reprocessing Plant Toolkit (RPTk) for the DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. RPTk is an ongoing development effort intended to provide users with an extensible, integrated, and scalable software framework for the modeling and simulation of spent nuclear fuel reprocessing plants by enabling the insertion and coupling of user-developed physicochemical modules of variable fidelity. The NEAMS Safeguards and Separations IPSC (SafeSeps) and the Enabling Computational Technologies (ECT) supporting program element have partnered to release an initial version of the RPTk with a focus on software usability and utility. RPTk implements a data flow architecture that is the source of the system's extensibility and scalability. Data flows through physicochemical modules sequentially, with each module importing data, evolving it, and exporting the updated data to the next downstream module. This is accomplished through various architectural abstractions designed to give RPTk true plug-and-play capabilities. A simple application of this architecture, as well as RPTk data flow and evolution, is demonstrated in Section 6 with an application consisting of two coupled physicochemical modules. The remaining sections describe this ongoing work in full, from system vision and design inception to full implementation. Section 3 describes the relevant software development processes used by the RPTk development team. These processes allow the team to manage system complexity and ensure stakeholder satisfaction. This section also details the work done on the RPTk "black box" and "white box" models, with a special focus on the separation of concerns between the RPTk user interface and application runtime. Sections 4 and 5 discuss the application runtime component in more detail, and describe the dependencies, behavior, and rigorous testing of its constituent components.
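    The data flow architecture described here, in which each module imports data, evolves it, and exports it to the next downstream module, can be sketched conceptually in a few lines. This is an illustration of the pattern, not RPTk's own interfaces; the module names and data payload are hypothetical.

    ```python
    from typing import Callable, Dict, List

    Data = Dict[str, float]
    Module = Callable[[Data], Data]

    def dissolver(data: Data) -> Data:
        """Hypothetical physicochemical module: move fuel mass into solution."""
        data["dissolved_kg"] = data.get("dissolved_kg", 0.0) + 0.9 * data["feed_kg"]
        return data

    def solvent_extraction(data: Data) -> Data:
        """Hypothetical downstream module: separate a fraction of the solution."""
        data["product_kg"] = 0.95 * data["dissolved_kg"]
        return data

    def run_pipeline(modules: List[Module], data: Data) -> Data:
        # Data flows sequentially: each module imports, evolves, and exports it.
        for module in modules:
            data = module(data)
        return data

    print(run_pipeline([dissolver, solvent_extraction], {"feed_kg": 100.0}))
    ```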

  7. Economic evaluation of the Good School Toolkit: an intervention for reducing violence in primary schools in Uganda.

    Science.gov (United States)

    Greco, Giulia; Knight, Louise; Ssekadde, Willington; Namy, Sophie; Naker, Dipak; Devries, Karen

    2018-01-01

    This paper presents the cost and cost-effectiveness of the Good School Toolkit (GST), a programme aimed at reducing physical violence perpetrated by school staff to students in Uganda. The effectiveness of the Toolkit was tested with a cluster randomised controlled trial in 42 primary schools in Luwero District, Uganda. A full economic costing evaluation and cost-effectiveness analysis were conducted alongside the trial. Both financial and economic costs were collected retrospectively from the provider's perspective to estimate total and unit costs. The total cost of setting up and running the Toolkit over the 18-month trial period is estimated at US$397 233, excluding process monitor (M&E) activities. The cost to run the intervention is US$7429 per school annually, or US$15 per primary school pupil annually, in the trial intervention schools. It is estimated that the intervention has averted 1620 cases of past-week physical violence during the 18-month implementation period. The total cost per case of violence averted is US$244, and the annual implementation cost is US$96 per case averted during the trial. The GST is a cost-effective intervention for reducing violence against pupils in primary schools in Uganda. It compares favourably against other violence reduction interventions in the region.

  8. Development of an Online Toolkit for Measuring Commercial Building Energy Efficiency Performance -- Scoping Study

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Na

    2013-03-13

    This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.

  9. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Directory of Open Access Journals (Sweden)

    K Anderson

Full Text Available This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  10. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Science.gov (United States)

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).
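    The final step of the workflow, turning an aerial photograph and its logged position into a GIS-ready GeoTIFF, can be sketched with standard open-source Python tooling. The sketch assumes a simple north-up image whose corner coordinates are already known; the file name and coordinates are hypothetical, and this is not necessarily the exact processing chain used by the authors.

    ```python
    import numpy as np
    import rasterio
    from rasterio.transform import from_bounds

    # Hypothetical aerial photo (RGB) and its ground footprint in WGS84
    image = np.zeros((3, 480, 640), dtype=np.uint8)          # bands, rows, cols
    west, south, east, north = -3.535, 50.736, -3.530, 50.739
    transform = from_bounds(west, south, east, north, width=640, height=480)

    with rasterio.open(
        "aerial_photo_georef.tif", "w",
        driver="GTiff", height=480, width=640, count=3,
        dtype=image.dtype, crs="EPSG:4326", transform=transform,
    ) as dst:
        dst.write(image)    # write all three bands
    print("wrote aerial_photo_georef.tif")
    ```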

  11. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    Science.gov (United States)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

This paper describes the processes and results of Verification and Validation (VV) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The VV effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  12. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

Quantitative techniques have been successfully employed in verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless sensor networks. The analysis aims to find the probability of a network key being compromised at a specific time point, a probability that fluctuates over time for the specific key update method called Leave-based key update. For such a problem, the use of current tools is limited in many ways...
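    The kind of transient analysis described, computing the probability that the network key is compromised at a given time point, can be illustrated with a minimal two-state continuous-time Markov chain (key safe versus key compromised) solved with a matrix exponential. The rates and the model itself are placeholder assumptions, far simpler than the leave-based update model analysed by the toolkit.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # States: 0 = key safe, 1 = key compromised
    compromise_rate = 0.05   # per hour (hypothetical)
    update_rate = 0.5        # key refresh restores safety (hypothetical)

    # CTMC generator matrix Q (rows sum to zero)
    Q = np.array([[-compromise_rate, compromise_rate],
                  [update_rate, -update_rate]])

    p0 = np.array([1.0, 0.0])    # start with a safe key
    for t in (1.0, 6.0, 24.0):
        p_t = p0 @ expm(Q * t)   # transient state distribution at time t
        print(f"t={t:5.1f} h  P(compromised)={p_t[1]:.3f}")
    ```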

  13. The Multiple-Patient Simulation Toolkit: Purpose, Process, and Pilot.

    Science.gov (United States)

    Beroz, Sabrina; Sullivan, Nancy; Kramasz, Vanessa; Morgan, Patricia

Educating nursing students to safely care for multiple patients has become an important but challenging focus for nurse educators. New graduate nurses are expected to manage care for multiple patients in a complex and multifaceted health care system. With patient safety as a priority, multiple-patient assignments are necessary in order for nursing students to learn how to effectively prioritize and delegate care. The purpose of this project was the construction of an adaptable and flexible template for the development of multiple-patient simulations. Through use, the template evolved into a toolkit with the addition of an operational guide, a sample populated template, and a bibliography.

  14. The Populist Toolkit : Finnish Populism in Action 2007–2016

    OpenAIRE

    Ylä-Anttila, Tuukka

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  15. The interactive learning toolkit: technology and the classroom

    Science.gov (United States)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  16. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    Science.gov (United States)

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  17. An integrative data analysis platform for gene set analysis and knowledge discovery in a data warehouse framework.

    Science.gov (United States)

    Chen, Yi-An; Tripathi, Lokesh P; Mizuguchi, Kenji

    2016-01-01

    Data analysis is one of the most critical and challenging steps in drug discovery and disease biology. A user-friendly resource to visualize and analyse high-throughput data provides a powerful medium for both experimental and computational biologists to understand vastly different biological data types and obtain a concise, simplified and meaningful output for better knowledge discovery. We have previously developed TargetMine, an integrated data warehouse optimized for target prioritization. Here we describe how upgraded and newly modelled data types in TargetMine can now survey the wider biological and chemical data space, relevant to drug discovery and development. To enhance the scope of TargetMine from target prioritization to broad-based knowledge discovery, we have also developed a new auxiliary toolkit to assist with data analysis and visualization in TargetMine. This toolkit features interactive data analysis tools to query and analyse the biological data compiled within the TargetMine data warehouse. The enhanced system enables users to discover new hypotheses interactively by performing complicated searches with no programming and obtaining the results in an easy to comprehend output format. Database URL: http://targetmine.mizuguchilab.org. © The Author(s) 2016. Published by Oxford University Press.
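    Gene set analysis of the kind supported by the toolkit typically rests on an over-representation test; a minimal, generic version using the hypergeometric distribution is sketched below. This illustrates only the statistic, not TargetMine's query API, and all counts are hypothetical.

    ```python
    from scipy.stats import hypergeom

    # Hypothetical counts for one pathway
    N = 20000   # genes in the background (universe)
    K = 150     # genes annotated to the pathway
    n = 300     # genes in the user's input list
    k = 12      # input genes that fall in the pathway

    # P(X >= k): chance of seeing at least k pathway genes in the input list
    # scipy parametrization: hypergeom.sf(k-1, population, successes, draws)
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(f"over-representation p-value = {p_value:.3g}")
    ```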

  18. Effect of an educational toolkit on quality of care: a pragmatic cluster randomized trial.

    Science.gov (United States)

    Shah, Baiju R; Bhattacharyya, Onil; Yu, Catherine H Y; Mamdani, Muhammad M; Parsons, Janet A; Straus, Sharon E; Zwarenstein, Merrick

    2014-02-01

    Printed educational materials for clinician education are one of the most commonly used approaches for quality improvement. The objective of this pragmatic cluster randomized trial was to evaluate the effectiveness of an educational toolkit focusing on cardiovascular disease screening and risk reduction in people with diabetes. All 933,789 people aged ≥40 years with diagnosed diabetes in Ontario, Canada were studied using population-level administrative databases, with additional clinical outcome data collected from a random sample of 1,592 high risk patients. Family practices were randomly assigned to receive the educational toolkit in June 2009 (intervention group) or May 2010 (control group). The primary outcome in the administrative data study, death or non-fatal myocardial infarction, occurred in 11,736 (2.5%) patients in the intervention group and 11,536 (2.5%) in the control group (p = 0.77). The primary outcome in the clinical data study, use of a statin, occurred in 700 (88.1%) patients in the intervention group and 725 (90.1%) in the control group (p = 0.26). Pre-specified secondary outcomes, including other clinical events, processes of care, and measures of risk factor control, were also not improved by the intervention. A limitation is the high baseline rate of statin prescribing in this population. The educational toolkit did not improve quality of care or cardiovascular outcomes in a population with diabetes. Despite being relatively easy and inexpensive to implement, printed educational materials were not effective. The study highlights the need for a rigorous and scientifically based approach to the development, dissemination, and evaluation of quality improvement interventions. http://www.ClinicalTrials.gov NCT01411865 and NCT01026688.

  19. The Wider Implications of Business-model Research

    DEFF Research Database (Denmark)

    Ritter, Thomas; Lettl, Christopher

    2018-01-01

Business-model research has struggled to develop a clear footprint in the strategic management field. This introduction to the special issue on the wider implications of business-model research argues that part of this struggle relates to the application of five different perspectives on the term "business model," which creates ambiguity about the conceptual boundaries of business models, the applied terminology, and the potential contributions of business-model research to strategic management literature. By explicitly distinguishing among these five perspectives and by aligning them into one overarching, comprehensive framework, this paper offers a foundation for consolidating business-model research. Furthermore, we explore the connections between business-model research and prominent theories in strategic management. We conclude that business-model research is not necessarily a "theory on its...

  20. Development of a toolkit in Silverlight to detect physiognomies in real time - doi: 10.4025/actascitechnol.v35i1.13713

    Directory of Open Access Journals (Sweden)

    Marcelo Cabral Ghilardi

    2013-01-01

Full Text Available Security demands, forensic practices and the identification of criminals require the detection and recognition of irises, fingerprints and faces in videos and photographs. Moreover, there is an increasing need for multiple forms of human-machine interaction. Controlling devices through body stimuli is both a need and a trend. For example, some computers, laptops, phones and video games currently provide interaction through their cameras, not only for face detection but also for body movements and the detection of objects. Most devices are Internet accessed, which creates an even greater range of possibilities. These technological trends have prompted the development of a toolkit for detecting faces in real time. The choice of the Silverlight framework for the development of this toolkit allows the resulting applications to run in a web browser. This toolkit may be used for other purposes, such as face and iris recognition, body movement and the monitoring of premises. An application was developed as an example and proof of concept.
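    Real-time face detection of the kind the toolkit performs is commonly built on a cascade classifier. A minimal Python/OpenCV analogue is sketched below; the toolkit itself is Silverlight-based, so this illustrates the technique rather than its API, and the input image path is hypothetical.

    ```python
    import cv2

    # Haar cascade shipped with OpenCV; the input image path is hypothetical
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    image = cv2.imread("group_photo.jpg")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print(f"detected {len(faces)} face(s)")
    ```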

  1. 76 FR 19380 - Notice of Entry Into Effect of MARPOL Annex V Wider Caribbean Region Special Area

    Science.gov (United States)

    2011-04-07

    ... Effect of MARPOL Annex V Wider Caribbean Region Special Area AGENCY: Coast Guard, DHS. ACTION: Notice. SUMMARY: The Coast Guard announces the date for the entry into effect of discharge requirements from ships in the Wider Caribbean Region (WCR) special area (SA) as specified in the International Convention...

  2. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    Science.gov (United States)

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
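    The core operation in phasic heart rate analysis, converting a series of R-peak times into second-by-second heart rate around an event marker, can be sketched in a few lines. This is a generic illustration of the analysis rather than the LabVIEW toolkit itself; the beat times and event time are hypothetical.

    ```python
    import numpy as np

    def second_by_second_hr(r_peak_times_s, event_time_s, pre_s=3, post_s=6):
        """Mean heart rate (bpm) for each 1-s bin around an event marker."""
        ibis = np.diff(r_peak_times_s)                 # inter-beat intervals (s)
        inst_hr = 60.0 / ibis                          # instantaneous HR per interval
        ibi_mid = r_peak_times_s[:-1] + ibis / 2.0     # time assigned to each interval
        bins = np.arange(-pre_s, post_s + 1) + event_time_s
        hr_per_bin = []
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (ibi_mid >= lo) & (ibi_mid < hi)
            hr_per_bin.append(inst_hr[mask].mean() if mask.any() else np.nan)
        return np.array(hr_per_bin)

    # Hypothetical R-peak times (s) and an event marker at t = 10 s
    r_peaks = np.cumsum(np.full(30, 0.8))              # steady 75 bpm
    print(second_by_second_hr(r_peaks, event_time_s=10.0))
    ```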

  3. SwingStates: Adding state machines to Java and the Swing toolkit

    OpenAIRE

    Appert , Caroline; Beaudouin-Lafon , Michel

    2008-01-01

    International audience; This article describes SwingStates, a Java toolkit designed to facilitate the development of graphical user interfaces and bring advanced interaction techniques to the Java platform. SwingStates is based on the use of finite-state machines specified directly in Java to describe the behavior of interactive systems. State machines can be used to redefine the behavior of existing Swing widgets or, in combination with a new canvas widget that features a rich graphical mode...
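    The state-machine idea behind SwingStates can be illustrated with a tiny, generic example (SwingStates itself is a Java/Swing toolkit, so the Python sketch below shows the concept, not its API): a press-drag-release interaction modelled as explicit states and transitions.

    ```python
    class DragStateMachine:
        """Minimal finite-state machine for a press-drag-release interaction."""

        def __init__(self):
            self.state = "idle"
            self.position = None

        def handle(self, event, pos=None):
            if self.state == "idle" and event == "press":
                self.state, self.position = "dragging", pos
            elif self.state == "dragging" and event == "move":
                self.position = pos                      # update the dragged object
            elif self.state == "dragging" and event == "release":
                self.state = "idle"
            return self.state, self.position

    sm = DragStateMachine()
    for ev, p in [("press", (0, 0)), ("move", (5, 3)), ("move", (9, 7)), ("release", None)]:
        print(sm.handle(ev, p))
    ```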

  4. A simulation toolkit for electroluminescence assessment in rare event experiments

    CERN Document Server

    Oliveira, C A B; Veenhof, R; Biagi, S; Monteiro, C M B; Santos, J M F dos; Ferreira, A L; Veloso, J F C A

    2011-01-01

    A good understanding of electroluminescence is a prerequisite when optimising double-phase noble gas detectors for Dark Matter searches and high-pressure xenon TPCs for neutrinoless double beta decay detection. A simulation toolkit for calculating the emission of light through electron impact on neon, argon, krypton and xenon has been developed using the Magboltz and Garfield programs. Calculated excitation and electroluminescence efficiencies, electroluminescence yield and associated statistical fluctuations are presented as a function of electric field. Good agreement with experiment and with Monte Carlo simulations has been obtained.

  5. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    International Nuclear Information System (INIS)

    Webb, S. M.

    2011-01-01

The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principal component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program, with more formats available by request. An overview of the most common features will be presented.
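    One of the listed tools, principal component analysis of multi-element maps, can be illustrated generically: stack the per-element maps into a pixels-by-elements matrix and decompose it. The sketch below uses random placeholder data and scikit-learn, not the toolkit's own implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical XRF maps: 64 x 64 pixels for 5 elements (e.g. Fe, Cu, Zn, Ca, K)
    rng = np.random.default_rng(0)
    maps = rng.random((5, 64, 64))

    # Reshape to (pixels, elements) and decompose
    X = maps.reshape(5, -1).T
    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)                  # per-pixel component scores
    component_maps = scores.T.reshape(3, 64, 64)   # back to image form
    print(pca.explained_variance_ratio_)
    ```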

  6. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    Science.gov (United States)

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children…

  7. An evaluation of a toolkit for the early detection, management, and control of carbapenemase-producing Enterobacteriaceae: a survey of acute hospital trusts in England.

    Science.gov (United States)

    Coope, C M; Verlander, N Q; Schneider, A; Hopkins, S; Welfare, W; Johnson, A P; Patel, B; Oliver, I

    2018-03-09

    Following hospital outbreaks of carbapenemase-producing Enterobacteriaceae (CPE), Public Health England published a toolkit in December 2013 to promote the early detection, management, and control of CPE colonization and infection in acute hospital settings. To examine awareness, uptake, implementation and usefulness of the CPE toolkit and identify potential barriers and facilitators to its adoption in order to inform future guidance. A cross-sectional survey of National Health Service (NHS) acute trusts was conducted in May 2016. Descriptive analysis and multivariable regression models were conducted, and narrative responses were analysed thematically and informed using behaviour change theory. Most (92%) acute trusts had a written CPE plan. Fewer (75%) reported consistent compliance with screening and isolation of CPE risk patients. Lower prioritization and weaker senior management support for CPE prevention were associated with poorer compliance. Awareness of the CPE toolkit was high and all trusts with patients infected or colonized with CPE had used the toolkit either as provided (32%), or to inform (65%) their own local CPE plan. Despite this, many respondents (80%) did not believe that the CPE toolkit guidance offered an effective means to prevent CPE or was practical to follow. CPE prevention and control requires robust IPC measures. Successful implementation can be hindered by a complex set of factors related to their practical execution, insufficient resources and a lack of confidence in the effectiveness of the guidance. Future CPE guidance would benefit from substantive user involvement, processes for ongoing feedback, and regular guidance updates. Copyright © 2018 The Healthcare Infection Society. All rights reserved.

  8. Midwives in medical student and resident education and the development of the medical education caucus toolkit.

    Science.gov (United States)

    Radoff, Kari; Nacht, Amy; Natch, Amy; McConaughey, Edie; Salstrom, Jan; Schelling, Karen; Seger, Suzanne

    2015-01-01

    Midwives have been involved formally and informally in the training of medical students and residents for many years. Recent reductions in resident work hours, emphasis on collaborative practice, and a focus on midwives as key members of the maternity care model have increased the involvement of midwives in medical education. Midwives work in academic settings as educators to teach the midwifery model of care, collaboration, teamwork, and professionalism to medical students and residents. In 2009, members of the American College of Nurse-Midwives formed the Medical Education Caucus (MECA) to discuss the needs of midwives teaching medical students and residents; the group has held a workshop annually over the last 4 years. In 2014, MECA workshop facilitators developed a toolkit to support and formalize the role of midwives involved in medical student and resident education. The MECA toolkit provides a roadmap for midwives beginning involvement and continuing or expanding the role of midwives in medical education. This article describes the history of midwives in medical education, the development and growth of MECA, and the resulting toolkit created to support and formalize the role of midwives as educators in medical student and resident education, as well as common challenges for the midwife in academic medicine. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health. © 2015 by the American College of Nurse-Midwives.

  9. Improving primary care for persons with spinal cord injury: Development of a toolkit to guide care.

    Science.gov (United States)

    Milligan, James; Lee, Joseph; Hillier, Loretta M; Slonim, Karen; Craven, Catharine

    2018-05-07

    To identify a set of essential components for primary care for patients with spinal cord injury (SCI) for inclusion in a point-of-practice toolkit for primary care practitioners (PCP) and identification of the essential elements of SCI care that are required in primary care and those that should be the focus of specialist care. Modified Delphi consensus process; survey methodology. Primary care. Three family physicians, six specialist physicians, and five inter-disciplinary health professionals completed surveys. Importance of care elements for inclusion in the toolkit (9-point scale: 1 = lowest level of importance, 9 = greatest level of importance) and identification of most responsible physician (family physician, specialist) for completing key categories of care. Open-ended comments were solicited. There was consensus between the respondent groups on the level of importance of various care elements. Mean importance scores were highest for autonomic dysreflexia, pain, and skin care and lowest for preventive care, social issues, and vital signs. Although, there was agreement across all respondents that family physicians should assume responsibility for assessing mental health, there was variability in who should be responsible for other care categories. Comments were related to the need for shared care approaches and capacity building and lack of knowledge and specialized equipment as barriers to optimal care. This study identified important components of SCI care to be included in a point-of-practice toolkit to facilitate primary care for persons with SCI.

  10. The Wider Impacts of Universities: Habermas on Learning Processes and Universities

    Directory of Open Access Journals (Sweden)

    Jesper Eckhardt Larsen

    2013-06-01

Full Text Available The discourse of reform in higher education tends to focus narrowly on employability and the relationship between higher education and the labor market. Universities as research institutions are now considered solely in the dominant discourse of innovation. This way of conceiving universities is inspired by functionalist theory that focuses on the imperatives of a knowledge economy. Taking its departure from the theory of society developed by Jürgen Habermas, this paper seeks to provide a theoretical framework for an empirical comparative analysis of the wider societal impact of universities. The argument is that the wider impacts of higher education and research at universities must be seen within a more complex vision of modern societies. The paper is thus primarily a re-reading of Habermas's critique of functionalist views of the university and an application of that critique to current issues in the debates on higher education. A special discussion is devoted to issues of the self, in view of the current tendency to regard all education from the standpoint of economic outputs.

  11. The Special Educator's Toolkit: Everything You Need to Organize, Manage, and Monitor Your Classroom

    Science.gov (United States)

    Golden, Cindy

    2012-01-01

    Overwhelmed special educators: Reduce your stress and support student success with this practical toolkit for whole-classroom organization. A lifesaver for special educators in any K-12 setting, this book-and-CD set will help teachers expertly manage everything, from schedules and paperwork to student supports and behavior plans. Cindy Golden, a…

  12. UniSchooLabs Toolkit: Tools and Methodologies to Support the Adoption of Universities’ Remote and Virtual Labs in Schools

    Directory of Open Access Journals (Sweden)

    Augusto Chioccariello

    2012-11-01

Full Text Available The UniSchooLabs project aims at creating an infrastructure supporting web access to remote/virtual labs and associated educational resources to engage learners with hands-on and minds-on activities in science, technology and math in schools. The UniSchooLabs tool-kit supports the teacher in selecting a remote or virtual lab and developing a lab activity based on an inquiry model template. While working with the toolkit the teacher has access to three main features: (a) a catalogue of available online laboratories; (b) an archive of activities created by other users; and (c) a tool for creating new activities or reusing existing ones.

  13. Sustainability rating tools for buildings and its wider application

    Directory of Open Access Journals (Sweden)

    Siew Renard

    2017-01-01

    Full Text Available This paper provides a commentary on the latest research in measuring the sustainability of buildings and its wider application. The emergence of sustainability rating tools (SRTs) has faced critique from scholars due to deficiencies such as the overemphasis on environmental criteria, the neglect of uncertainty in scoring, and the existence of non-scientific criteria benchmarks, among many others. This may have contributed to the mixed evidence in the literature on the benefits of SRTs. Future research directions are proposed to advance the state of the art in this field.

  14. College Access and Success for Students Experiencing Homelessness: A Toolkit for Educators and Service Providers

    Science.gov (United States)

    Dukes, Christina

    2013-01-01

    This toolkit serves as a comprehensive resource on the issue of higher education access and success for homeless students, including information on understanding homeless students, assisting homeless students in choosing a school, helping homeless students pay for application-related expenses, assisting homeless students in finding financial aid…

  15. Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications

    NARCIS (Netherlands)

    Gonsalves, Atish; Ternier, Stefaan; De Vries, Fred; Specht, Marcus

    2013-01-01

    Gonsalves, A., Ternier, S., De Vries, F., & Specht, M. (2012, 16-18 October). Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications. Presentation given at the 11th World Conference on Mobile and Contextual Learning (mLearn 2012), Helsinki, Finland.

  16. Cyber security awareness toolkit for national security: an approach to South Africa's cyber security policy implementation

    CSIR Research Space (South Africa)

    Phahlamohlaka, LJ

    2011-05-01

    Full Text Available The aim of this paper is to propose an approach that South Africa could follow in implementing its proposed cyber security policy. The paper proposes a Cyber Security Awareness Toolkit that is underpinned by key National Security imperatives...

  17. Cyber security awareness toolkit for national security: An approach to South Africa’s cybersecurity policy implementation

    CSIR Research Space (South Africa)

    Phahlamohlaka, LJ

    2011-05-01

    Full Text Available The aim of this paper is to propose an approach that South Africa could follow in implementing its proposed Cyber security policy. The paper proposes a Cyber Security Awareness Toolkit that is underpinned by key National Security imperatives as well...

  18. Urban Teacher Academy Project Toolkit: A Guide to Developing High School Teaching Career Academies.

    Science.gov (United States)

    Berrigan, Anne; Schwartz, Shirley

    There is an urgent need not only to attract more people into the teaching profession but also to build a more diverse, highly qualified, and culturally sensitive teaching force that can meet the needs of a rapidly changing school-age population. This Toolkit takes best practices from high school teacher academies around the United States and…

  19. Software Toolkits: Practical Aspects of the Internet of Things—A Survey

    OpenAIRE

    Wang, Feng; Hu, Liang; Zhou, Jin; Wu, Yang; Hu, Jiejun; Zhao, Kuo

    2015-01-01

    The Internet of Things (IoT) is neither science fiction nor industry hype; rather it is based on solid technological advances and visions of network ubiquity that are zealously being realized. The paper serves to provide guidance regarding the practical aspects of the IoT. Such guidance is largely missing in the current literature in which the focus has been more on research problems and less on issues describing how to set up an IoT system and what software toolkits are required. This paper ...

  20. An interactive toolkit to extract phenological time series data from digital repeat photography

    Science.gov (United States)

    Seyednasrollah, B.; Milliman, T. E.; Hufkens, K.; Kosmala, M.; Richardson, A. D.

    2017-12-01

    Near-surface remote sensing and in situ photography are powerful tools to study how climate change and climate variability influence vegetation phenology and the associated seasonal rhythms of green-up and senescence. The rapidly-growing PhenoCam network has been using in situ digital repeat photography to study phenology in almost 500 locations around the world, with an emphasis on North America. However, extracting time series data from multiple years of half-hourly imagery - while each set of images may contain several regions of interest (ROIs), corresponding to different species or vegetation types - is not always straightforward. Large volumes of data require substantial processing time, and changes (either intentional or accidental) in camera field of view require adjustment of ROI masks. Here, we introduce and present "DrawROI" as an interactive web-based application for imagery from PhenoCam. DrawROI can also be used offline, as a fully independent toolkit that significantly facilitates extraction of phenological data from any stack of digital repeat photography images. DrawROI provides a responsive environment for phenological scientists to interactively a) delineate ROIs, b) handle field of view (FOV) shifts, and c) extract and export time series data characterizing image color (i.e. red, green and blue channel digital numbers for the defined ROI). The application utilizes artificial intelligence and advanced machine learning techniques and gives users the opportunity to redraw new ROIs every time an FOV shift occurs. DrawROI also offers a quality control flag to indicate noisy data and images with low quality due to the presence of fog or snow. The web-based application significantly accelerates the process of creating new ROIs and modifying pre-existing ROIs in the PhenoCam database. The offline toolkit is presented as an open source R-package that can be used with similar datasets with time-lapse photography to obtain more data for
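
    As a rough illustration of the kind of computation such a toolkit performs, the sketch below averages the red, green, and blue digital numbers inside a fixed ROI mask for a stack of repeat photographs and derives the green chromatic coordinate. This is not DrawROI code; the file pattern, image size, and mask placement are assumptions.

        # Hedged sketch of ROI-based colour extraction (not the DrawROI implementation).
        import glob
        import numpy as np
        from PIL import Image

        def roi_time_series(image_pattern, mask):
            """mask: boolean 2D array, True inside the region of interest."""
            records = []
            for path in sorted(glob.glob(image_pattern)):
                rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
                r, g, b = (rgb[..., band][mask].mean() for band in range(3))
                gcc = g / (r + g + b)  # green chromatic coordinate, a common greenness index
                records.append((path, r, g, b, gcc))
            return records

        # Assumed example: a rectangular ROI on 1296 x 960 pixel images
        mask = np.zeros((960, 1296), dtype=bool)
        mask[300:600, 200:900] = True
        series = roi_time_series("site_*.jpg", mask)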

  1. OpenDBDDAS Toolkit: Secure MapReduce and Hadoop-like Systems

    KAUST Repository

    Fabiano, Enrico

    2015-06-01

    The OpenDBDDAS Toolkit is a software framework to provide support for more easily creating and expanding dynamic big data-driven application systems (DBDDAS) that are common in environmental systems, many engineering applications, disaster management, traffic management, and manufacturing. In this paper, we describe key features needed to implement a secure MapReduce and Hadoop-like system for high performance clusters that guarantees a certain level of privacy of data from other concurrent users of the system. We also provide examples of a secure MapReduce prototype and compare it to another high performance MapReduce, MR-MPI.

  2. Agent-based models in economics a toolkit

    CERN Document Server

    Fagiolo, Giorgio; Gallegati, Mauro; Richiardi, Matteo; Russo, Alberto

    2018-01-01

    In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, lab...
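
    For readers unfamiliar with the approach, the toy model below illustrates the basic ingredients of an agent-based simulation: heterogeneous outcomes emerging from many simple local interactions. It is purely illustrative and is not taken from the book; all parameter values are arbitrary.

        # Toy agent-based simulation: agents repeatedly exchange one unit of wealth,
        # and an unequal wealth distribution emerges from identical starting endowments.
        import random

        def simulate(n_agents=1000, steps=50000, seed=0):
            random.seed(seed)
            wealth = [10] * n_agents              # identical endowments at t = 0
            for _ in range(steps):
                i = random.randrange(n_agents)    # randomly paired agents
                j = random.randrange(n_agents)
                if i != j and wealth[i] > 0:      # the giver must be solvent
                    wealth[i] -= 1
                    wealth[j] += 1
            return wealth

        wealth = simulate()
        top_decile_share = sum(sorted(wealth)[-100:]) / sum(wealth)
        print(f"Share of total wealth held by the richest 10%: {top_decile_share:.2f}")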

  3. The Exoplanet Characterization ToolKit (ExoCTK)

    Science.gov (United States)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exist within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well maintained exoplanet characterization toolkit.

  4. Use of Remote Sensing Data to Enhance the National Weather Service (NWS) Storm Damage Toolkit

    Science.gov (United States)

    Jedlovec, Gary; Molthan, Andrew; White, Kris; Burks, Jason; Stellman, Keith; Smith, Matthew

    2012-01-01

    SPoRT is improving the use of near real-time satellite data in response to severe weather events and other disasters. Supported through NASA's Applied Sciences Program. Planned interagency collaboration to support NOAA's Damage Assessment Toolkit, with spinoff opportunities to support other entities such as USGS and FEMA.

  5. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    International Nuclear Information System (INIS)

    Wei, J; Yuan, A; Li, G

    2014-01-01

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
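
    The voxel-counting step described above amounts to multiplying the number of voxels inside a binary segmentation mask by the physical volume of one voxel. The sketch below illustrates this idea with synthetic masks; it is not the authors' MATLAB toolkit, and the voxel spacing and mask shapes are assumptions.

        # Hedged illustration of voxel counting for per-phase volume estimation.
        import numpy as np

        def segmented_volume_cm3(mask, spacing_mm):
            """mask: 3D boolean array for one 4DCT phase; spacing_mm: (dz, dy, dx)."""
            voxel_mm3 = float(np.prod(spacing_mm))
            return mask.sum() * voxel_mm3 / 1000.0      # mm^3 -> cm^3

        # Synthetic 10-phase example (placeholder masks, assumed 2.5 x 0.98 x 0.98 mm voxels)
        phases = [np.random.rand(120, 256, 256) > 0.7 for _ in range(10)]
        volumes = [segmented_volume_cm3(m, spacing_mm=(2.5, 0.98, 0.98)) for m in phases]
        tidal_variation_cm3 = max(volumes) - min(volumes)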

  6. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    Energy Technology Data Exchange (ETDEWEB)

    Wei, J [City College of New York, New York, NY (United States); Yuan, A; Li, G [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational

  7. Information contained within the large scale gas injection test (Lasgit) dataset exposed using a bespoke data analysis tool-kit

    International Nuclear Information System (INIS)

    Bennett, D.P.; Thomas, H.R.; Cuss, R.J.; Harrington, J.F.; Vardon, P.J.

    2012-01-01

    Document available in extended abstract form only. The Large Scale Gas Injection Test (Lasgit) is a field scale experiment run by the British Geological Survey (BGS) and is located approximately 420 m underground at SKB's Aespoe Hard Rock Laboratory (HRL) in Sweden. It has been designed to study the impact on safety of gas build-up within a KBS-3V concept high level radioactive waste repository. Lasgit has been in almost continuous operation for approximately seven years and is still underway. An analysis of the dataset arising from the Lasgit experiment with particular attention to the smaller-scale features and phenomena recorded has been undertaken in parallel with the macro-scale analysis performed by the BGS. Lasgit is a highly instrumented, frequently sampled and long-lived experiment leading to a substantial dataset containing in excess of 14.7 million data points. The data is anticipated to include a wealth of information, including information on overall processes as well as smaller-scale or 'second order' features. Due to the size of the dataset coupled with the detailed analysis of the dataset required and the reduction in subjectivity associated with measurement compared to observation, computational analysis is essential. Moreover, due to the length of operation and complexity of experimental activity, the Lasgit dataset is not typically suited to 'out of the box' time series analysis algorithms. In particular, the features that are not suited to standard algorithms include non-uniformities due to (deliberate) changes in sample rate at various points in the experimental history and missing data due to hardware malfunction/failure causing interruption of logging cycles. To address these features a computational tool-kit capable of performing an Exploratory Data Analysis (EDA) on long-term, large-scale datasets with non-uniformities has been developed. Particular tool-kit abilities include: the parameterization of signal variation in the dataset
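
    Although the Lasgit tool-kit itself is not described in code here, the sketch below shows one common way to begin an exploratory analysis of a long record with deliberate sample-rate changes and logging interruptions: flag long gaps, resample onto a regular grid, and bridge only short gaps before computing summary statistics. The file and column names are assumptions.

        # Illustrative handling of a non-uniformly sampled sensor record (not the BGS tool-kit).
        import pandas as pd

        raw = pd.read_csv("lasgit_sensor.csv", parse_dates=["timestamp"])   # assumed file
        raw = raw.set_index("timestamp").sort_index()

        # Flag interruptions of the logging cycle (gaps much longer than usual)
        gaps = raw.index.to_series().diff()
        long_gaps = gaps[gaps > pd.Timedelta(hours=6)]

        # Resample onto an hourly grid; bridge at most six missing hours by interpolation
        hourly = raw["pressure_kpa"].resample("1H").mean()
        hourly = hourly.interpolate(limit=6)

        # Smooth out the daily variation to expose slower, 'second order' behaviour
        daily_smoothed = hourly.rolling(window=24, center=True).mean()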

  8. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
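
    The accuracy-based measures mentioned above reduce, in their simplest form, to statistics such as bias, root-mean-square error, and correlation between modelled and observed series. The sketch below is only a schematic illustration of such measures and is not LVT source code; the example values are invented.

        # Minimal verification statistics for a modelled vs. observed series (illustrative only).
        import numpy as np

        def verification_stats(model, obs):
            model = np.asarray(model, dtype=float)
            obs = np.asarray(obs, dtype=float)
            valid = ~np.isnan(model) & ~np.isnan(obs)   # ignore missing observations
            m, o = model[valid], obs[valid]
            return {
                "bias": np.mean(m - o),
                "rmse": np.sqrt(np.mean((m - o) ** 2)),
                "r": np.corrcoef(m, o)[0, 1],
            }

        # Assumed example: soil moisture (m3/m3) at five times, one observation missing
        stats = verification_stats(model=[0.21, 0.24, 0.30, 0.28, 0.26],
                                   obs=[0.19, 0.26, 0.27, np.nan, 0.25])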

  9. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  10. Source Materials for the Healthy Communities Toolkit: A Resource Guide for Community and Faith-Based Organizations.

    Science.gov (United States)

    Acosta, Joie; Chandra, Anita; Williams, Malcolm; Davis, Lois M

    2011-01-01

    The Patient Protection and Affordable Care Act places significant emphasis on the role of community-based health promotion initiatives; within this focus, community and faith-based organizations (CFBOs) are seen as critical partners for improving community well-being. This article describes a report that provides the content for a toolkit that will prepare community and faith-based organizations to take advantage of opportunities presented in the Patient Protection and Affordable Care Act and engage faith and community leaders in promoting health in their communities. This includes key facts and figures about health topics, handouts for community groups, and web links for resources and other information in the following areas: healthcare reform; community health centers and development of the community health workforce; promotion of healthy families; mental health; violence and trauma; prevention of teen and unintended pregnancy and HIV/AIDS; and chronic disease prevention. The report also includes recommendations for testing the content of the toolkit with communities and considerations for its implementation.

  11. TENCompetence Learning Design Toolkit, Runtime component, ccsi_v3_2_10c_v1_4

    NARCIS (Netherlands)

    Sharples, Paul; Popat, Kris; Llobet, Lau; Santos, Patricia; Hernández-Leo, Davinia; Miao, Yongwu; Griffiths, David; Beauvoir, Phillip

    2010-01-01

    Sharples, P., Popat, K., Llobet, L., Santos, P., Hernandez-Leo, D., Miao, Y., Griffiths, D. & Beauvoir, P. (2009) TENCompetence Learning Design Toolkit, Runtime component, ccsi_v3_2_10c_v1_4 This release is composed of three files corresponding to CopperCore Service Integration (CCSI) v3.2-10cv1.4,

  12. How to create an interface between UrQMD and Geant4 toolkit

    CERN Document Server

    Abdel-Waged, Khaled; Uzhinskii, V.V.

    2012-01-01

    An interface between the UrQMD-1.3cr model (version 1.3 for cosmic air showers) and the Geant4 transport toolkit has been developed. Compared to the current Geant4 (hybrid) hadronic models, this provides the ability to simulate at the microscopic level hadron, nucleus, and anti-nucleus interactions with matter from 0 to 1 TeV with a single transport code. This document provides installation requirements and instructions, as well as class and member function descriptions of the software.

  13. Evaluating the parent-adolescent communication toolkit: Usability and preliminary content effectiveness of an online intervention.

    Science.gov (United States)

    Toombs, Elaine; Unruh, Anita; McGrath, Patrick

    2018-01-01

    This study aimed to assess the Parent-Adolescent Communication Toolkit (PACT), an online intervention designed to help parents improve communication with their adolescents. Participant preferences for two module delivery systems (sequential and unrestricted module access) were identified. Usability assessment of the PACT intervention was completed using pretest and posttest comparisons. Usability data, including participant completion and satisfaction ratings, were examined. Parents (N = 18) of adolescents were randomized to a sequential or unrestricted chapter access group. Parent participants completed pretest measures, the PACT intervention, and posttest measures. Participants provided feedback on the intervention to improve modules and provided usability ratings. Adolescent pretest and posttest ratings were evaluated. Usability ratings were high and parent feedback was positive. The sequential module access group rated the intervention content higher and completed more content than the unrestricted chapter access group, indicating support for the sequential access design. Parent mean posttest communication scores were significantly higher than at pretest. The Parent-Adolescent Communication Toolkit has potential to improve parent-adolescent communication, but further effectiveness assessment is required.

  14. Microgrid Design Toolkit (MDT) User Guide Software v1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM).

  15. Development of a Tailored Methodology and Forensic Toolkit for Industrial Control Systems Incident Response

    Science.gov (United States)

    2014-06-01

    Naval Postgraduate School thesis by Nicholas B. Carr, June 2014, presenting a tailored methodology and forensic toolkit for industrial control systems incident response. Only report-documentation fragments and reference excerpts (e.g., K. Stouffer et al., "Special Publication 800-82: Guide to Industrial..." and the Proceedings of the VDE Kongress, 2004) are recoverable from this record.

  16. University of Central Florida and the American Association of State Colleges and Universities: Blended Learning Toolkit

    Science.gov (United States)

    EDUCAUSE, 2014

    2014-01-01

    The Blended Learning Toolkit supports the course redesign approach, and interest in its openly available clearinghouse of online tools, strategies, curricula, and other materials to support the adoption of blended learning continues to grow. When the resource originally launched in July 2011, 20 AASCU [American Association of State Colleges and…

  17. An assessment toolkit to increase the resilience of NWE catchments to periods of drought

    Science.gov (United States)

    La Jeunesse, Isabelle; Larrue, Corinne

    2013-04-01

    In many North Western Europe (NWE) areas the balance between water demand and availability is under pressure, thus under water scarcity. In addition, NWE areas are adversely affected by changes in the hydrological cycle and precipitation patterns, thus droughts periods. Over the past thirty years, droughts have dramatically increased and NWE are not immune. The summer of 2003 caused 10 billion euro damage to agriculture. In April 2012 the South West of the UK has moved to environmental drought status. Water scarcity and drought problems in the EU are increasing: 11% of the European population and 17% of its territory have been affected to date. Climate change is likely to exacerbate these adverse impacts. 50% of the NWE area are planned to be affected in 2050. Although the problems caused by drought in NWE are currently not overwhelmingly visible early action should be taken to reduce costs and prevent damage. Adapting to drought in NWE is the transnational challenge of the DROP (governance in DROught adaPtation) project. The Commission's recent "Blue Print on European Waters" states that existing policies are good but the problem lays in implementation. So the future challenge for NWE regions is to improve the implementation, meaning both governance and measures. The problem of drought is relatively new in comparison with flooding for these Regions. This demands another approach with the interaction of different stakeholders. NWE countries have proven strategies for flood prevention; no such strategies exist for drought adaptation. To do this, DROP combines science, practitioners and decisions makers, realizing the science-policy window. Thus, the aim of the DROP project is to increase the resilience of NWE catchments to periods of drought. To tackle these issues DROP will develop a governance toolkit to be used by NWE regional water authorities and will test a few pilot measures on drought adaptation. The objectives of the project are 1) to promote the use of a

  18. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pshenichnov, Igor, E-mail: pshenich@fias.uni-frankfurt.d [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Institute for Nuclear Research, Russian Academy of Science, 117312 Moscow (Russian Federation); Botvina, Alexander [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Institute for Nuclear Research, Russian Academy of Science, 117312 Moscow (Russian Federation); Mishustin, Igor [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Kurchatov Institute, Russian Research Center, 123182 Moscow (Russian Federation); Greiner, Walter [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany)

    2010-03-15

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100AMeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  19. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    International Nuclear Information System (INIS)

    Pshenichnov, Igor; Botvina, Alexander; Mishustin, Igor; Greiner, Walter

    2010-01-01

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100AMeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  20. Toward a VPH/Physiome ToolKit.

    Science.gov (United States)

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative. Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive. In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow. Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  1. Adding Impacts and Mitigation Measures to OpenEI's RAPID Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, Erin

    2017-05-01

    The Open Energy Information platform hosts the Regulatory and Permitting Information Desktop (RAPID) Toolkit to provide renewable energy permitting information on federal and state regulatory processes. One of the RAPID Toolkit's functions is to help streamline the geothermal permitting processes outlined in the National Environmental Policy Act (NEPA). This is particularly important in the geothermal energy sector since each development phase requires separate land analysis to acquire exploration, well field drilling, and power plant construction permits. Using the Environmental Assessment documents included in RAPID's NEPA Database, the RAPID team identified 37 resource categories that a geothermal project may impact. Examples include impacts to geology and minerals, nearby endangered species, or water quality standards. To provide federal regulators, project developers, consultants, and the public with typical impacts and mitigation measures for geothermal projects, the RAPID team has provided overview webpages of each of these 37 resource categories with a sidebar query to reference related NEPA documents in the NEPA Database. This project is an expansion of a previous project that analyzed the time to complete NEPA environmental review for various geothermal activities. The NEPA review focused not only on geothermal projects within Bureau of Land Management and U.S. Forest Service managed lands, but also on projects funded by the Department of Energy. Timeline barriers found were extensive public comment and involvement; content overlap in NEPA documents; and discovery of impacted resources such as endangered species or cultural sites.

  2. Can Mobile-Enabled Payment Methods Reduce Petty Corruption in Urban Water Provision?

    Directory of Open Access Journals (Sweden)

    Aaron Krolikowski

    2014-02-01

    Full Text Available Corruption in the urban water sector constrains economic growth and human development in low-income countries. This paper empirically evaluates the ability of novel mobile-enabled payment methods to reduce information asymmetries and mitigate petty corruption in the urban water sector’s billing and payment processes. Overcoming these barriers may promote improved governance and water service delivery. The case of Dar es Salaam is used to explore the role of mobile-enabled payment instruments through the use of a stratified random sample of 1097 water utility customers and 42 interviews with representatives from the water sector, the telecommunications industry, civil society, and banking institutions. Results show that mobile-enabled payment methods can reduce information asymmetries and the incidence of petty corruption to promote improved financial management by making payment data more transparent and limiting the availability of economic rents in the billing and payment process. Implications for African urban water services include wider availability and more effective use of human and financial resources. These can be used to enhance water service delivery and citizen participation in the production of urban water supplies. The use of mobile-enabled payment methods in the urban water sector represents an application of mobile communication technologies in a low-income country with proven potential for scalability that simultaneously supports the achievement of development objectives.

  3. The Montage Image Mosaic Toolkit As A Visualization Engine.

    Science.gov (United States)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has been used since 2003 to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts of large images and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: (1) creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created; (2) integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to Javascript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed; (3) creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453. JS
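
    To make the idea of a dynamic-range-preserving stretch concrete, the sketch below clips a FITS image to robust percentiles and applies an asinh scaling before writing a PNG. This illustrates the general concept only; it is not Montage's own stretching algorithm, and the file names are assumptions.

        # Generic percentile + asinh stretch of a FITS image for PNG display (not Montage code).
        import numpy as np
        from astropy.io import fits
        import matplotlib.pyplot as plt

        data = fits.getdata("mosaic.fits").astype(float)        # assumed input mosaic
        lo, hi = np.nanpercentile(data, [0.5, 99.5])            # robust display range
        stretched = np.arcsinh((np.clip(data, lo, hi) - lo) / (hi - lo) * 10.0)
        stretched /= stretched.max()

        plt.imsave("mosaic.png", stretched, cmap="gray", origin="lower")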

  4. Integrating surgical robots into the next medical toolkit.

    Science.gov (United States)

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  5. canvasDesigner: A versatile interactive high-resolution scientific multi-panel visualization toolkit.

    Science.gov (United States)

    Zhang, Baohong; Zhao, Shanrong; Neuhaus, Isaac

    2018-05-03

    We present a bioinformatics and systems biology visualization toolkit that harmonizes real-time interactive exploration and analysis of big data, full-fledged customization of look-and-feel, and simultaneous production of multi-panel publication-ready figures in PDF format. Source code and detailed user guides are available at http://canvasxpress.org, https://baohongz.github.io/canvasDesigner, and https://baohongz.github.io/canvasDesigner/demo_video.html. isaac.neuhaus@bms.com, baohong.zhang@pfizer.com, shanrong.zhao@pfizer.com. Supplementary materials are available at https://goo.gl/1uQygs.

  6. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    Science.gov (United States)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
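
    As a generic illustration of the test-driven pattern described above (not taken from the Trick code base), a developer writes a small failing test first, implements just enough code to make it pass, and relies on the continuous-integration system to run the whole suite on every commit. All names below are invented and the example could be run with pytest or a similar test runner.

        # Hypothetical example of a unit-tested helper written test-first (illustrative only).
        def interpolate(t, t0, v0, t1, v1):
            """Linear interpolation between (t0, v0) and (t1, v1)."""
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

        def test_interpolate_midpoint():
            assert interpolate(0.5, 0.0, 10.0, 1.0, 20.0) == 15.0

        def test_interpolate_endpoints():
            assert interpolate(0.0, 0.0, 10.0, 1.0, 20.0) == 10.0
            assert interpolate(1.0, 0.0, 10.0, 1.0, 20.0) == 20.0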

  7. EMMA: An Extensible Mammalian Modular Assembly Toolkit for the Rapid Design and Production of Diverse Expression Vectors.

    Science.gov (United States)

    Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi

    2017-07-21

    Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step golden-gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon-based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.

  8. eVITAL: A Preliminary Taxonomy and Electronic Toolkit of Health-Related Habits and Lifestyle

    Directory of Open Access Journals (Sweden)

    Luis Salvador-Carulla

    2012-01-01

    Full Text Available Objectives. To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. Methods. From 2003–2009, a working group (n=6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n=29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. Results. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. Conclusions. We present the first taxonomy of HrH, which may aid the development of new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model.

  9. The GridSite Web/Grid security system

    International Nuclear Information System (INIS)

    McNab, Andrew; Li Yibiao

    2010-01-01

    We present an overview of the current status of the GridSite toolkit, describing the security model for interactive and programmatic uses introduced in the last year. We discuss our experiences of implementing these internal changes and how they and previous rounds of improvements have been prompted by requirements from users and wider security trends in Grids (such as CSRF). Finally, we explain how these have improved the user experience of GridSite-based websites, and wider implications for portals and similar web/grid sites.

  10. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    Science.gov (United States)

    Jedlove, Gary J.; Molthan, Andrew L.; White, Kris; Burks, Jason; Stellman, Keith; Smith, Mathew

    2012-01-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters

  11. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    Directory of Open Access Journals (Sweden)

    Cieślik Marcin

    2011-02-01

    Full Text Available Abstract Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing the trade-off between parallelism and lazy evaluation (memory consumption) to be tuned. An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain-specific data containers (e.g., for biomolecular sequences, alignments, and structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and
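
    The data-flow idea itself can be sketched in a few lines of plain Python: a source generator feeds cheap transformations, and the expensive stage is evaluated lazily on a pool of worker processes. The sketch below illustrates the concept only and does not use the PaPy API; the input file and record format are assumptions.

        # Conceptual flow-based pipeline using generators and a process pool (not PaPy code).
        from multiprocessing import Pool

        def read_records(path):                     # source component
            with open(path) as handle:
                for line in handle:
                    yield line.strip()

        def parse(record):                          # cheap transformation
            name, seq = record.split(",")
            return name, seq

        def gc_content(item):                       # expensive stage, run in parallel
            name, seq = item
            return name, sum(base in "GC" for base in seq) / max(len(seq), 1)

        if __name__ == "__main__":
            with Pool(processes=4) as pool:
                items = map(parse, read_records("sequences.csv"))     # assumed input file
                for name, gc in pool.imap(gc_content, items, chunksize=100):
                    print(name, gc)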

  12. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App1 has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  13. Security Assessment Simulation Toolkit (SAST) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, Office Of Naval Research (ONR) National Center For Advanced Secure Systems Research (NCASSR) and Office Of Secretary Of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  14. A cosmology forecast toolkit — CosmoLib

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhiqi, E-mail: zqhuang@cita.utoronto.ca [CEA, Institut de Physique Théorique, Orme des Merisiers, Saint-Aubin, 91191 Gif-sur-Yvette Cédex (France)

    2012-06-01

    The package CosmoLib is a combination of a cosmological Boltzmann code and a simulation toolkit to forecast the constraints on cosmological parameters from future observations. In this paper we describe the released linear-order part of the package. We discuss the stability and performance of the Boltzmann code, which is written in the Newtonian gauge and includes dark energy perturbations. In CosmoLib the integrator that computes the CMB angular power spectrum is optimized for an l-by-l brute-force integration, which is useful for studying inflationary models predicting sharp features in the primordial power spectrum of metric fluctuations. As an application, CosmoLib is used to study the axion monodromy inflation model, which predicts cosine oscillations in the primordial power spectrum. In contrast to the previous studies by Aich et al. and Meerburg et al., we find no detection or hint of the oscillations. We point out that the CAMB code modified by Aich et al. does not have sufficient numerical accuracy. CosmoLib and its documentation are available at http://www.cita.utoronto.ca/∼zqhuang/CosmoLib.
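
    For orientation, one commonly used phenomenological form for an oscillatory primordial spectrum of the axion monodromy type multiplies a smooth power law by a cosine that is periodic in the logarithm of the wavenumber. The sketch below encodes that generic parameterization; it is not CosmoLib source code, and the parameter names and default values are assumptions.

        # Generic oscillatory primordial power spectrum (illustrative parameterization only):
        #   P(k) = A_s * (k / k_star)**(n_s - 1) * [1 + delta_ns * cos(ln(k / k_star) / freq + phase)]
        import numpy as np

        def primordial_power(k, A_s=2.1e-9, n_s=0.96, k_star=0.05,
                             delta_ns=0.0, freq=0.01, phase=0.0):
            """k in 1/Mpc; delta_ns = 0 recovers a smooth power-law spectrum."""
            smooth = A_s * (k / k_star) ** (n_s - 1.0)
            oscillation = 1.0 + delta_ns * np.cos(np.log(k / k_star) / freq + phase)
            return smooth * oscillation

        k = np.logspace(-4, 0, 500)
        pk = primordial_power(k, delta_ns=0.05)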

  15. Web servers and services for electrostatics calculations with APBS and PDB2PQR

    Science.gov (United States)

    Unni, Samir; Huang, Yong; Hanson, Robert; Tobias, Malcolm; Krishnan, Sriram; Li, Wilfred W.; Nielsen, Jens E.; Baker, Nathan A.

    2011-01-01

    APBS and PDB2PQR are widely utilized free software packages for biomolecular electrostatics calculations. Using the Opal toolkit, we have developed a Web services framework for these software packages that enables the use of APBS and PDB2PQR by users who do not have local access to the necessary computational capabilities. This not only increases accessibility of the software to a wider range of scientists, educators, and students but also increases the availability of electrostatics calculations on portable computing platforms. Users can access this new functionality in two ways. First, an Opal-enabled version of APBS is provided in current distributions, available freely on the web. Second, we have extended the PDB2PQR web server to provide an interface for the setup, execution, and visualization of electrostatic potentials as calculated by APBS. This web interface also uses the Opal framework, which ensures the scalability needed to support the large APBS user community. Both of these resources are available from the APBS/PDB2PQR website: http://www.poissonboltzmann.org/. PMID:21425296

  16. Plug-and-play paper-based toolkit for rapid prototyping of microfluidics and electronics towards point-of-care diagnostic solutions

    CSIR Research Space (South Africa)

    Smith, S

    2015-11-01

    Full Text Available We present a plug-and-play toolkit for the rapid assembly of paper-based microfluidic and electronic components for quick prototyping of paper-based components towards point-of-care diagnostic solutions. Individual modules, each with a specific...

  17. Context in a wider context

    Directory of Open Access Journals (Sweden)

    John Traxler

    2011-07-01

    Full Text Available This paper attempts to review and reconsider the role of context in mobile learning and starts by outlining definitions of context-aware mobile learning as the technologies have become more mature, more robust and more widely available and as the notion of context has become progressively richer. The future role of context-aware mobile learning is considered within the context of the future of mobile learning as it moves from the challenges and opportunities of pedagogy and technology to the challenges and opportunities of policy, scale, sustainability, equity and engagement with augmented reality, «blended learning», «learner devices», «user-generated contexts» and the «internet of things». This is essentially a perspective on mobile learning, and other forms of technology-enhanced learning (TEL), where educators and their institutions set the agenda and manage change. There are, however, other perspectives on context. The increasing availability and use of smart-phones and other personal mobile devices with similar powerful functionality means that the experience of context for many people, in the form of personalized or location-based services, is an increasingly social and informal experience, rather than a specialist or educational experience. This is part of the transformative impact of mobility and connectedness on our societies brought about by these universal, ubiquitous and pervasive technologies. This paper contributes a revised understanding of context in the wider context (sic) of the transformations taking place in our societies. These are subtle but pervasive transformations of jobs, work and the economy, of our sense of time, space and place, of knowing and learning, and of community and identity. This leads to a radical reconsideration of context as the notions of ‹self› and ‹other› are transformed.

  18. Study protocol for "Study of Practices Enabling Implementation and Adaptation in the Safety Net (SPREAD-NET)": a pragmatic trial comparing implementation strategies.

    Science.gov (United States)

    Gold, Rachel; Hollombe, Celine; Bunce, Arwen; Nelson, Christine; Davis, James V; Cowburn, Stuart; Perrin, Nancy; DeVoe, Jennifer; Mossman, Ned; Boles, Bruce; Horberg, Michael; Dearing, James W; Jaworski, Victoria; Cohen, Deborah; Smith, David

    2015-10-16

    Little research has directly compared the effectiveness of implementation strategies in any setting, and we know of no prior trials directly comparing how effectively different combinations of strategies support implementation in community health centers. This paper outlines the protocol of the Study of Practices Enabling Implementation and Adaptation in the Safety Net (SPREAD-NET), a trial designed to compare the effectiveness of several common strategies for supporting implementation of an intervention and explore contextual factors that impact the strategies' effectiveness in the community health center setting. This cluster-randomized trial compares how three increasingly hands-on implementation strategies support adoption of an evidence-based diabetes quality improvement intervention in 29 community health centers, managed by 12 healthcare organizations. The strategies are as follows: (arm 1) a toolkit, presented in paper and electronic form, which includes a training webinar; (arm 2) toolkit plus in-person training with a focus on practice change and change management strategies; and (arm 3) toolkit, in-person training, plus practice facilitation with on-site visits. We use a mixed methods approach to data collection and analysis: (i) baseline surveys on study clinic characteristics, to explore how these characteristics impact the clinics' ability to implement the tools and the effectiveness of each implementation strategy; (ii) quantitative data on change in rates of guideline-concordant prescribing; and (iii) qualitative data on the "how" and "why" underlying the quantitative results. The outcomes of interest are clinic-level results, categorized using the Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM) framework, within an interrupted time-series design with segmented regression models. This pragmatic trial will compare how well each implementation strategy works in "real-world" practices. Having a better understanding of how different

  19. VaST: A variability search toolkit

    Science.gov (United States)

    Sokolovsky, K. V.; Lebedev, A. A.

    2018-01-01

    Variability Search Toolkit (VaST) is a software package designed to find variable objects in a series of sky images. It can be run from a script or interactively using its graphical interface. VaST relies on source list matching as opposed to image subtraction. SExtractor is used to generate source lists and perform aperture or PSF-fitting photometry (with PSFEx). Variability indices that characterize scatter and smoothness of a lightcurve are computed for all objects. Candidate variables are identified as objects having high variability index values compared to other objects of similar brightness. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and handle complex image distortions. The software has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5 m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates. About 1800 variable stars have been discovered with VaST. It is used as a transient detection engine in the New Milky Way (NMW) nova patrol. The code is written in C and can be easily compiled on the majority of UNIX-like systems. VaST is free software available at http://scan.sai.msu.ru/vast/.

  20. Upgrading the safety toolkit: Initiatives of the accident analysis subgroup

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Chung, D.Y.

    1999-01-01

    Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting the development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven subgroups chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental in the formal evaluation of computer models, improving the pedigree of high-use computer models, and developing the user-friendly Accident Analysis Guidebook (AAG). Together, these improvements have strengthened the analytical toolkit for complying with the DOE orders and standards that shape safety analysis reports (SARs) and related documentation. Major support for these objectives has been through DOE/DP-45

  1. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    Full Text Available This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and Toolbox processing capabilities of NLTK are introduced, showing ways in which it can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
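
    As a concrete illustration of the kind of task described above (not code from the paper), the short sketch below reads the sample Rotokas lexicon distributed with NLTK's toolbox corpus and writes its headwords, parts of speech and glosses out as a simple HTML list; the output filename is illustrative.

      # Minimal sketch: read a Toolbox lexicon shipped with NLTK and dump it as HTML.
      # Requires the NLTK toolbox sample data (nltk.download('toolbox')).
      from nltk.corpus import toolbox

      lexicon = toolbox.xml('rotokas.dic')            # ElementTree of the sample lexicon

      rows = []
      for record in lexicon.findall('record'):
          lexeme = record.findtext('lx', default='')  # headword field
          pos = record.findtext('ps', default='')     # part-of-speech field
          gloss = record.findtext('ge', default='')   # English gloss field
          rows.append(f"<li><b>{lexeme}</b> ({pos}): {gloss}</li>")

      html = "<html><body><ul>\n" + "\n".join(rows) + "\n</ul></body></html>"
      with open('lexicon.html', 'w', encoding='utf-8') as out:  # illustrative filename
          out.write(html)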

  2. A Wider Look at Visual Discomfort

    Directory of Open Access Journals (Sweden)

    L O'Hare

    2012-07-01

    Full Text Available Visual discomfort refers to the adverse effects reported by some observers when viewing certain stimuli, such as stripes and certain filtered noise patterns. Stimuli that deviate from natural image statistics might be encoded inefficiently, which could cause discomfort (Juricevic, Land, Wilkins and Webster, 2010, Perception, 39(7), 884–899), possibly through excessive cortical responses (Wilkins, 1995, Visual Stress, Oxford, Oxford University Press). A less efficient visual system might exacerbate the effects of difficult stimuli. Extreme examples are seen in epilepsy and migraine (Wilkins, Bonnanni, Porciatti, Guerrini, 2004, Epilepsia, 45, 1–7; Aurora and Wilkinson, 2007, Cephalalgia, 27(12), 1422–1435). However, similar stimuli, e.g., striped patterns, are also judged uncomfortable by non-clinical populations (Wilkins et al, 1984, Brain, 107(4)). We propose that the oversensitivity of clinical populations may represent an extreme of the visual discomfort found in the general population. To study the prevalence and impact of visual discomfort in a wider context than typically studied, an Internet-based survey was conducted, including standardised questionnaires measuring visual discomfort susceptibility (Conlon, Lovegrove, Chekaluk and Pattison, 1999, Visual Cognition, 6(6), 637–663; Evans and Stevenson, 2008, Ophthal Physiol Opt, 28(4), 295–309) and judgments of visual stimuli, such as striped patterns (Wilkins et al, 1984) and filtered noise patterns (Fernandez and Wilkins, 2008, Perception, 37(7), 1098–1113). Results show few individuals reporting high visual discomfort, contrary to other researchers (e.g., Conlon et al, 1999).

  3. Systemic Planning: Dealing with Complexity by a Wider Approach to Planning

    DEFF Research Database (Denmark)

    Leleur, Steen

    2005-01-01

    and methodology that can be helpful for planning under circumstances characterised by complexity and uncertainty. It is argued that, compared to conventional planning – referred to as systematic planning – there is a need for a wider, more systemic approach to planning that is better suited to current real......On the basis of a new book Systemic Planning this paper addresses systems thinking and complexity in a context of planning. Specifically, renewal of planning thinking on this background is set out as so-called systemic planning (SP). The principal concern of SP is to provide principles

  4. A Web-Based Toolkit to Provide Evidence-Based Resources About Crystal Methamphetamine for the Australian Community: Collaborative Development of Cracks in the Ice.

    Science.gov (United States)

    Champion, Katrina Elizabeth; Chapman, Cath; Newton, Nicola Clare; Brierley, Mary-Ellen; Stapinski, Lexine; Kay-Lambkin, Frances; Nagle, Jack; Teesson, Maree

    2018-03-20

    The use of crystal methamphetamine (ice) and the associated harms for individuals, families, and communities across Australia has been the subject of growing concern in recent years. The provision of easily accessible, evidence-based, and up-to-date information and resources about crystal methamphetamine for the community is a critical component of an effective public health response. This paper aims to describe the codevelopment process of the Web-based Cracks in the Ice Community Toolkit, which was developed to improve access to evidence-based information and resources about crystal methamphetamine for the Australian community. Development of the Cracks in the Ice Community Toolkit was conducted in collaboration with community members across Australia and with experts working in the addiction field. The iterative process involved the following: (1) consultation with end users, including community members, crystal methamphetamine users, families and friends of someone using crystal methamphetamine, health professionals, and teachers (n=451) via a cross-sectional Web-based survey to understand information needs; (2) content and Web development; and (3) user testing of a beta version of the Web-based toolkit among end users (n=41) and experts (n=10) to evaluate the toolkit's acceptability, relevance, and appeal. Initial end user consultation indicated that the most commonly endorsed reasons for visiting a website about crystal methamphetamine were "to get information for myself" (185/451, 41.0%) and "to find out how to help a friend or a family member" (136/451, 30.2%). Community consultation also revealed the need for simple information about crystal methamphetamine, including what it is, its effects, and when and where to seek help or support. Feedback on a beta version of the toolkit was positive in terms of content, readability, layout, look, and feel. Commonly identified areas for improvement related to increasing the level of engagement and personal connection

  5. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied to localizing biomolecules within biological matrices. Although well suited to this, the application of MSI to comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. Here, the OpenMSI Arrayed Analysis Toolkit (OMAAT) is presented, a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. These results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.
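
    To make the arrayed geometry above concrete, the following hypothetical sketch (not OMAAT's API) computes the nominal spot-centre coordinates of a 384-position array (16 x 24) at 450 μm pitch, the kind of regular grid a spot-finding step can use as its initial guess.

      # Hypothetical sketch (not OMAAT's API): nominal spot centres for a 16 x 24 array
      # at 450 micrometre pitch, i.e. the 384-sample layout described above.
      import numpy as np

      rows, cols, pitch_um = 16, 24, 450.0
      yy, xx = np.meshgrid(np.arange(rows) * pitch_um,
                           np.arange(cols) * pitch_um, indexing="ij")
      spot_centres = np.column_stack([xx.ravel(), yy.ravel()])  # shape (384, 2), micrometres
      print(spot_centres.shape)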

  6. GRHydro: a new open-source general-relativistic magnetohydrodynamics code for the Einstein toolkit

    International Nuclear Information System (INIS)

    Mösta, Philipp; Haas, Roland; Ott, Christian D; Reisswig, Christian; Mundim, Bruno C; Faber, Joshua A; Noble, Scott C; Bode, Tanja; Löffler, Frank; Schnetter, Erik

    2014-01-01

    We present the new general-relativistic magnetohydrodynamics (GRMHD) capabilities of the Einstein toolkit, an open-source community-driven numerical relativity and computational relativistic astrophysics code. The GRMHD extension of the toolkit builds upon previous releases and implements the evolution of relativistic magnetized fluids in the ideal MHD limit in fully dynamical spacetimes using the same shock-capturing techniques previously applied to hydrodynamical evolution. In order to maintain the divergence-free character of the magnetic field, the code implements both constrained transport and hyperbolic divergence cleaning schemes. We present test results for a number of MHD tests in Minkowski and curved spacetimes. Minkowski tests include aligned and oblique planar shocks, cylindrical explosions, magnetic rotors, Alfvén waves and advected loops, as well as a set of tests designed to study the response of the divergence cleaning scheme to numerically generated monopoles. We study the code’s performance in curved spacetimes with spherical accretion onto a black hole on a fixed background spacetime and in fully dynamical spacetimes by evolutions of a magnetized polytropic neutron star and of the collapse of a magnetized stellar core. Our results agree well with exact solutions where these are available and we demonstrate convergence. All code and input files used to generate the results are available on http://einsteintoolkit.org. This makes our work fully reproducible and provides new users with an introduction to applications of the code. (paper)

  7. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    Science.gov (United States)

    Wagh, Aditi; Wilensky, Uri

    2018-04-01

    Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: Environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To make this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit with exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities including the time students took to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.

  8. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Directory of Open Access Journals (Sweden)

    Kota Kasahara

    Full Text Available Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML, which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
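
    Because the ion-binding state graph is written in standard GML, it can be consumed by generic graph libraries as well as by Cytoscape. The sketch below is a minimal illustration (not part of IBiSA_tools) that loads such a file with networkx; the filename and the per-node 'count' attribute are assumptions made for the example.

      # Minimal sketch (not part of IBiSA_tools): load a GML ion-binding state graph
      # and list the most prominent binding states. The filename and the optional
      # 'count' node attribute are assumptions for illustration only.
      import networkx as nx

      g = nx.read_gml('ion_binding_states.gml')   # hypothetical output file

      def weight(node):
          # Fall back to node degree if no occupancy-like attribute is present.
          return g.nodes[node].get('count', g.degree(node))

      for node in sorted(g.nodes, key=weight, reverse=True)[:10]:
          print(node, dict(g.nodes[node]))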

  9. Perspectives of healthcare providers and HIV-affected individuals and couples during the development of a Safer Conception Counseling Toolkit in Kenya: stigma, fears, and recommendations for the delivery of services.

    Science.gov (United States)

    Mmeje, Okeoma; Njoroge, Betty; Akama, Eliud; Leddy, Anna; Breitnauer, Brooke; Darbes, Lynae; Brown, Joelle

    2016-01-01

    Reproduction is important to many HIV-affected individuals and couples and healthcare providers (HCPs) are responsible for providing resources to help them safely conceive while minimizing the risk of sexual and perinatal HIV transmission. In order to fulfill their reproductive goals, HIV-affected individuals and their partners need access to information regarding safer methods of conception. The objective of this qualitative study was to develop a Safer Conception Counseling Toolkit that can be used to train HCPs and counsel HIV-affected individuals and couples in HIV care and treatment clinics in Kenya. We conducted a two-phased qualitative study among HCPs and HIV-affected individuals and couples from eight HIV care and treatment sites in Kisumu, Kenya. We conducted in-depth interviews (IDIs) and focus group discussions (FGDs) to assess the perspectives of HCPs and HIV-affected individuals and couples in order to develop and refine the content of the Toolkit. Subsequently, IDIs were conducted among HCPs who were trained using the Toolkit and FGDs among HIV-affected individuals and couples who were counseled with the Toolkit. HIV-related stigma, fears, and recommendations for delivery of safer conception counseling were assessed during the discussions. One hundred and six individuals participated in FGDs and IDIs; 29 HCPs, 49 HIV-affected women and men, and 14 HIV-serodiscordant couples. Participants indicated that a safer conception counseling and training program for HCPs is needed and that routine provision of safer conception counseling may promote maternal and child health by enhancing reproductive autonomy among HIV-affected couples. They also reported that the Toolkit may help dispel the stigma and fears associated with reproduction in HIV-affected couples, while supporting them in achieving their reproductive goals. Additional research is needed to evaluate the Safer Conception Toolkit in order to support its implementation and use in HIV care and

  10. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    Science.gov (United States)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers a promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; support uncertainty visualization and the exploration of data provenance; and support machine learning discoveries to render diverse types of geospatial data and facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, and other custom-built software modules.
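
    Of the analytics listed above, dynamic time warping is compact enough to illustrate directly. The sketch below is the standard textbook dynamic-programming formulation of the DTW distance, included only to show the kind of time-series comparison such a platform can expose; it is not code from the platform, and the example series are hypothetical.

      # Standard dynamic time warping distance between two 1-D series (textbook DP),
      # shown only as an illustration of one analytic named above.
      import numpy as np

      def dtw_distance(a, b):
          n, m = len(a), len(b)
          cost = np.full((n + 1, m + 1), np.inf)
          cost[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = abs(a[i - 1] - b[j - 1])
                  cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
          return cost[n, m]

      # Example: two hypothetical monthly water-demand series.
      print(dtw_distance([1.0, 2.0, 3.0, 2.0], [1.0, 1.5, 2.5, 3.0, 2.0]))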

  11. Barriers and enablers to physical activity participation in patients with COPD: a systematic review.

    Science.gov (United States)

    Thorpe, Olivia; Johnston, Kylie; Kumar, Saravana

    2012-01-01

    Physical activity (PA) has been shown to improve symptoms in people with chronic obstructive pulmonary disease (COPD). Despite the high health and financial costs, the uptake of management strategies, particularly participation in PA and pulmonary rehabilitation (PR), are low. The review objective here was to identify potential barriers and enablers, which people with COPD report being associated with their participation in PA programs, including PR. A systematic search was undertaken to identify studies (published Jan 2000 to Aug 2011) reporting any barriers and enablers experienced by people with COPD regarding participation in PA and PR. Methodological quality of the studies was appraised using McMaster critical appraisal tools. A narrative summary of findings was undertaken reporting on individual study characteristics, country of origin, participants, and potential barriers and enablers. Eleven studies (8 qualitative and 3 quantitative) met the inclusion criteria for this systematic review. Several methodological issues (small sampling, poor description of data collection and analysis, issues with generalizability of the research findings) were common among included studies. Barriers identified included changing health status, personal issues, lack of support, external factors, ongoing smoking, and program-specific barriers. Enablers identified included social support, professional support, personal drivers, personal benefit, control of condition, specific goals, and program-specific enablers. The findings from this review may assist health professionals, patients, care givers and the wider community to develop effective strategies to promote participation in PA and PR among people with COPD.

  12. A population-based randomized controlled trial of the effect of combining a pedometer with an intervention toolkit on physical activity among individuals with low levels of physical activity or fitness

    DEFF Research Database (Denmark)

    Petersen, Christina Bjørk; Severin, Maria; Hansen, Andreas Wolff

    2012-01-01

    To examine if receiving a pedometer along with an intervention toolkit is associated with increased physical activity, aerobic fitness and better self-rated health among individuals with low levels of physical activity or fitness....

  13. Water Security Toolkit User Manual: Version 1.3

    Science.gov (United States)

    User manual (Data Product/Software). The Water Security Toolkit (WST) is a suite of tools that helps provide the information necessary to make decisions that minimize further human exposure to contaminants and maximize the effectiveness of intervention strategies. WST assists in the evaluation of multiple response actions in order to select the most beneficial consequence management strategy. It includes hydraulic and water quality modeling software and optimization methodologies to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove or destroy contaminants, (5) locations in the network to take grab samples to confirm contamination or cleanup, and (6) valves to close in order to isolate contaminated areas of the network.

  14. TMVA - Toolkit for Multivariate Data Analysis with ROOT Users guide

    CERN Document Server

    Höcker, A; Tegenfeldt, F; Voss, H; Voss, K; Christov, A; Henrot-Versillé, S; Jachowski, M; Krasznahorkay, A; Mahalalel, Y; Prudent, X; Speckmayer, P

    2007-01-01

    Multivariate machine learning techniques for the classification of data from high-energy physics (HEP) experiments have become standard tools in most HEP analyses. The multivariate classifiers themselves have evolved significantly in recent years, driven in part by developments in other areas inside and outside science. TMVA is a toolkit integrated in ROOT which hosts a large variety of multivariate classification algorithms. They range from rectangular cut optimisation (using a genetic algorithm) and likelihood estimators, through linear and non-linear discriminants (neural networks), to sophisticated recent developments such as boosted decision trees and rule ensemble fitting. TMVA organises the simultaneous training, testing, and performance evaluation of all these classifiers with a user-friendly interface, and expedites the application of the trained classifiers to the analysis of data sets with unknown sample composition.
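
    A minimal sketch of the workflow described above, booking and training a single boosted decision tree through PyROOT, is given below. It assumes a recent ROOT build with the DataLoader-based TMVA interface; the input file, tree names and variable names are placeholders rather than examples taken from the users guide.

      # Minimal sketch: book and train one TMVA classifier (a BDT) via PyROOT.
      # Assumes a recent ROOT with the DataLoader-based API; file, tree and variable
      # names below are placeholders.
      import ROOT

      data = ROOT.TFile.Open("input.root")                     # hypothetical input
      sig, bkg = data.Get("SignalTree"), data.Get("BackgroundTree")

      out = ROOT.TFile("tmva_output.root", "RECREATE")
      factory = ROOT.TMVA.Factory("TMVAClassification", out,
                                  "!V:AnalysisType=Classification")
      loader = ROOT.TMVA.DataLoader("dataset")
      for var in ("var1", "var2", "var3"):
          loader.AddVariable(var, "F")
      loader.AddSignalTree(sig, 1.0)
      loader.AddBackgroundTree(bkg, 1.0)
      loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random:NormMode=NumEvents")

      factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT",
                         "NTrees=200:MaxDepth=3:BoostType=AdaBoost")
      factory.TrainAllMethods()
      factory.TestAllMethods()
      factory.EvaluateAllMethods()
      out.Close()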

  15. A tetO Toolkit To Alter Expression of Genes in Saccharomyces cerevisiae.

    Science.gov (United States)

    Cuperus, Josh T; Lo, Russell S; Shumaker, Lucia; Proctor, Julia; Fields, Stanley

    2015-07-17

    Strategies to optimize a metabolic pathway often involve building a large collection of strains, each containing different versions of sequences that regulate the expression of pathway genes. Here, we develop reagents and methods to carry out this process at high efficiency in the yeast Saccharomyces cerevisiae. We identify variants of the Escherichia coli tet operator (tetO) sequence that bind a TetR-VP16 activator with differential affinity and therefore result in different levels of TetR-VP16 activator-driven expression. By recombining these variants upstream of the genes of a pathway, we generate unique combinations of expression levels. Here, we built a tetO toolkit, which includes the I-OnuI homing endonuclease to create double-strand breaks, which increases homologous recombination by a factor of 10^5; a plasmid carrying six variant tetO sequences flanked by I-OnuI sites, uncoupling the transformation and recombination steps; an S. cerevisiae-optimized TetR-VP16 activator; and a vector to integrate constructs into the yeast genome. We introduce into the S. cerevisiae genome the three crt genes from Erwinia herbicola required for yeast to synthesize lycopene and carry out the recombination process to produce a population of cells with permutations of tetO variants regulating the three genes. We identify 0.7% of this population as making detectable lycopene, of which the vast majority have undergone recombination at all three crt genes. We estimate a rate of ∼20% recombination per targeted site, much higher than that obtained in other studies. Application of this toolkit to medically or industrially important end products could reduce the time and labor required to optimize the expression of a set of metabolic genes.

  16. Effect of the good school toolkit on school staff mental health, sense of job satisfaction and perceptions of school climate: Secondary analysis of a cluster randomised trial.

    Science.gov (United States)

    Kayiwa, Joshua; Clarke, Kelly; Knight, Louise; Allen, Elizabeth; Walakira, Eddy; Namy, Sophie; Merrill, Katherine G; Naker, Dipak; Devries, Karen

    2017-08-01

    The Good School Toolkit, a complex behavioural intervention delivered in Ugandan primary schools, has been shown to reduce school staff-perpetrated physical violence against students. We aimed to assess the effect of this intervention on staff members' mental health, sense of job satisfaction and perception of school climate. We analysed data from a cluster-randomised trial administered in 42 primary schools in Luwero district, Uganda. The trial comprised cross-sectional baseline (June/July 2012) and endline (June/July 2014) surveys among staff and students. Twenty-one schools were randomly selected to receive the Toolkit, whilst 21 schools constituted a wait-listed control group. We generated composite measures to assess staff members' perceptions of the school climate and job satisfaction. The trial is registered at clinicaltrials.gov (NCT01678846). No schools dropped out of the study and all 591 staff members who completed the endline survey were included in the analysis. Staff in schools receiving the Toolkit had more positive perspectives of their school climate compared to staff in control schools (difference in mean scores 2.19, 95% Confidence Interval 0.92, 3.39). We did not find any significant differences for job satisfaction and mental health. In conclusion, interventions like the Good School Toolkit that reduce physical violence by school staff against students can improve staff perceptions of the school climate, and could help to build more positive working and learning environments in Ugandan schools. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. H12: Examination of safety assessment aims, procedures and results from a wider perspective

    International Nuclear Information System (INIS)

    Neall, F.B; Smith, P.A.

    2004-04-01

    Safety assessments (SAs) are a familiar tool for the evaluation of disposal concepts for radioactive waste. There is, however, often confusion in the wider community about the aims, methods and results used in SA. This report aims to present the H12 SA in a way that makes the assessment process clearer and the implications of the results more meaningful, both to workers within the SA field and to a wider technical audience. The reasonableness of the assessment results, the quality of the models and databases, and redundancy within the natural and engineered barrier system have been considered. A number of recent and somewhat older SAs that address a range of different waste types, host rocks and disposal concepts have been considered, and comparisons made to H12. A further aim is to put both doses and timescales in a more meaningful context. It has been necessary to: consider ways of demonstrating the meaningfulness of calculations that give results for many thousands of years in the future; provide a framework timescale as a context for SA results over long times; and demonstrate the smallness of the risk associated with the doses by comparison with other radiological and non-radiological risks. The perception of risk, which is a critical issue for public acceptance of radioactive waste disposal and must be considered when seeking to present safety assessment results 'in perspective' to a wider audience, is also discussed. It is concluded that H12 is comparable in many ways to assessments carried out internationally. Some assumptions are somewhat arbitrary, reflecting the generic stage of the Japanese programme, and are likely to become better founded in future exercises. Nevertheless, H12 provides a clear and well-founded message that it is feasible to site and construct a safe repository for HLW in Japan. (author)

  18. An internet-based bioinformatics toolkit for plant biosecurity diagnosis and surveillance of viruses and viroids.

    Science.gov (United States)

    Barrero, Roberto A; Napier, Kathryn R; Cunnington, James; Liefting, Lia; Keenan, Sandi; Frampton, Rebekah A; Szabo, Tamas; Bulman, Simon; Hunter, Adam; Ward, Lisa; Whattam, Mark; Bellgard, Matthew I

    2017-01-11

    Detecting and preventing the entry of exotic viruses and viroids at the border is critical for protecting plant industries and trade worldwide. Existing post-entry quarantine screening protocols rely on time-consuming biological indicators and/or molecular assays that require knowledge of the infecting viral pathogens. Plants have developed the ability to recognise and respond to viral infections through Dicer-like enzymes that cleave viral sequences into specific small RNA products. Many studies have reported the use of a broad range of small RNAs encompassing the product sizes of several Dicer enzymes involved in distinct biological pathways. Here we optimise the assembly of viral sequences by using specific small RNA subsets. We sequenced the small RNA fractions of 21 plants held at quarantine glasshouse facilities in Australia and New Zealand. Benchmarking of several de novo assembly tools showed that SPAdes with a k-mer size of 19 produced the best assembly outcomes. We also found that de novo assembly using 21-25 nt small RNAs can result in chimeric assemblies of viral sequences and plant host sequences. Such non-specific assemblies can be resolved by using 21-22 nt or 24 nt small RNA subsets. Among the 21 selected samples, we identified contigs with sequence similarity to 18 viruses and 3 viroids in 13 samples. Most of the viruses were assembled using only 21-22 nt long virus-derived siRNAs (viRNAs), except for one Citrus endogenous pararetrovirus that was more efficiently assembled using 24 nt long viRNAs. All three viroids found in this study were fully assembled using either 21-22 nt or 24 nt viRNAs. Optimised analysis workflows were customised within the Yabi web-based analytical environment. We present a fully automated viral surveillance and diagnosis web-based bioinformatics toolkit that provides a flexible, user-friendly, robust and scalable interface for the discovery and diagnosis of viral pathogens. We have implemented an automated viral surveillance and
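
    The size-subset step described above is straightforward to reproduce with standard tools. The sketch below (not the toolkit's own code) keeps only 21-22 nt reads from a small RNA FASTQ file before handing them to a de novo assembler; the filenames are placeholders.

      # Minimal sketch (not the toolkit's code): select the 21-22 nt read subset from a
      # small RNA FASTQ file prior to de novo assembly. Filenames are placeholders.
      from Bio import SeqIO

      kept = (rec for rec in SeqIO.parse("small_rna_reads.fastq", "fastq")
              if 21 <= len(rec.seq) <= 22)
      count = SeqIO.write(kept, "reads_21_22nt.fastq", "fastq")
      print(f"wrote {count} reads in the 21-22 nt subset")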

  19. Creating an enabling environment for WR&R implementation.

    Science.gov (United States)

    Stathatou, P-M; Kampragou, E; Grigoropoulou, H; Assimacopoulos, D; Karavitis, C; Gironás, J

    2017-09-01

    Reclaimed water is receiving growing attention worldwide as an effective solution for alleviating the growing water scarcity in many areas. Despite the various benefits associated with reclaimed water, water recycling and reuse (WR&R) practices are not widely applied around the world. This is mostly due to complex and inadequate local legal and institutional frameworks and socio-economic structures, which pose barriers to wider WR&R implementation. An integrated approach is therefore needed while planning the implementation of WR&R schemes, considering all the potential barriers, and aiming to develop favourable conditions for enhancing reclaimed water use. This paper proposes a comprehensive methodology supporting the development of an enabling environment for WR&R implementation. The political, economic, social, technical, legal and institutional factors that may influence positively (drivers) or negatively (barriers) WR&R implementation in the regional water systems are identified, through the mapping of local stakeholder perceptions. The identified barriers are further analysed, following a Cross-Impact/System analysis, to recognize the most significant barriers inhibiting system transition, and to prioritize the enabling instruments and arrangements that are needed to boost WR&R implementation. The proposed methodology was applied in the Copiapó River Basin in Chile, which faces severe water scarcity. Through the analysis, it was observed that barriers outweigh drivers for the implementation of WR&R schemes in the Copiapó River Basin, while the key barriers which could be useful for policy formulation towards an enabling environment in the area concern the unclear legal framework regarding the ownership of treated wastewater, the lack of environmental policies focusing on pollution control, the limited integration of reclaimed water use in current land use and development policies, the limited public awareness on WR&R, and the limited availability of

  20. VariVis: a visualisation toolkit for variation databases

    Directory of Open Access Journals (Sweden)

    Smith Timothy D

    2008-04-01

    Full Text Available Abstract Background With the completion of the Human Genome Project and recent advancements in mutation detection technologies, the volume of data available on genetic variations has risen considerably. These data are stored in online variation databases and provide important clues to the cause of diseases and potential side effects or resistance to drugs. However, the data presentation techniques employed by most of these databases make them difficult to use and understand. Results Here we present a visualisation toolkit that can be employed by online variation databases to generate graphical models of gene sequence with corresponding variations and their consequences. The VariVis software package can run on any web server capable of executing Perl CGI scripts and can interface with numerous Database Management Systems and "flat-file" data files. VariVis produces two easily understandable graphical depictions of any gene sequence and matches these with variant data. While developed with the goal of improving the utility of human variation databases, the VariVis package can be used in any variation database to enhance utilisation of, and access to, critical information.

  1. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high-fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on GitHub.

  2. The Wider Importance of Cadavers: Educational and Research Diversity from a Body Bequest Program

    Science.gov (United States)

    Cornwall, Jon; Stringer, Mark D.

    2009-01-01

    The debate surrounding the use of cadavers in teaching anatomy has focused almost exclusively on the pedagogic role of cadaver dissection in medical education. The aim of this study was to explore the wider aspects of a body bequest program for teaching and research into gross anatomy in a University setting. A retrospective audit was undertaken…

  3. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    ://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...... processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http

  4. Preparing for the Flu (Including 2009 H1N1 Flu): A Communication Toolkit for Schools (Grades K-12)

    Science.gov (United States)

    Centers for Disease Control and Prevention, 2010

    2010-01-01

    The purpose of "Preparing for the Flu: A Communication Toolkit for Schools" is to provide basic information and communication resources to help school administrators implement recommendations from CDC's (Centers for Disease Control and Prevention) Guidance for State and Local Public Health Officials and School Administrators for School (K-12)…

  5. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    Energy Technology Data Exchange (ETDEWEB)

    Arguello, Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.

  6. The ethics of drug development and promotion: the need for a wider view.

    Science.gov (United States)

    Brody, Howard

    2012-11-01

    Ethical issues at the interface between the medical profession and the pharmaceutical industry have generally been approached from the vantage point of medical professionalism, with a focus on conflict of interest as the key ethical concern. Although conflicts of interest remain important, other ethical issues may be obscured unless a wider perspective is adopted. Besides medical professionalism, the ethics of the clinical therapeutic relationship, ethics of public health, and business ethics all provide additional insights.

  7. Effects of toe-in and toe-in with wider step width on level walking knee biomechanics in varus, valgus, and neutral knee alignments.

    Science.gov (United States)

    Bennett, Hunter J; Shen, Guangping; Cates, Harold E; Zhang, Songning

    2017-12-01

    Increased peak external knee adduction moments exist for individuals with knee osteoarthritis and varus knee alignments, compared to healthy and neutrally aligned counterparts. Walking with increased toe-in or increased step width has been used individually to reduce the 1st and 2nd peak knee adduction moments, respectively, but the two modifications have not previously been combined or tested among all alignment groups. The purpose of this study was to compare toe-in only and toe-in with wider step width gait modifications in individuals with neutral, valgus, and varus alignments. Thirty-eight healthy participants with confirmed varus, neutral, or valgus frontal-plane knee alignment through anteroposterior radiographs performed level walking in normal, toe-in, and toe-in with wider step width gaits. A 3×3 (group×intervention) mixed model repeated measures ANOVA compared alignment groups and gait interventions (p < 0.05). The 1st peak knee adduction moment was reduced in toe-in and toe-in with wider step width compared to normal gait. The 2nd peak adduction moment was increased in toe-in compared to normal and toe-in with wider step width. The adduction impulse was also reduced in toe-in and toe-in with wider step width compared to normal gait. Peak knee flexion and external rotation moments were increased in toe-in and toe-in with wider step width compared to normal gait. Although the toe-in with wider step width gait seems to be a viable option to reduce peak adduction moments for varus alignments, sagittal and transverse knee loadings should be monitored when implementing this gait modification strategy. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Using the 4 Pillars™ Immunization Toolkit to Increase Pneumococcal Immunizations for Older Adults: A Cluster Randomized Trial

    Science.gov (United States)

    Zimmerman, Richard K.; Brown, Anthony E.; Pavlik, Valory N.; Moehling, Krissy K.; Raviotta, Jonathan M.; Lin, Chyongchiou J.; Zhang, Song; Hawk, Mary; Kyle, Shakala; Patel, Suchita; Ahmed, Faruque; Nowalk, Mary Patricia

    2016-01-01

    BACKGROUND Quality improvement in primary care has focused on improving adult immunization. OBJECTIVES Test the effectiveness of a step-by-step, evidence-based guide, the 4 Pillars™ Immunization Toolkit, to increase adult pneumococcal vaccination. DESIGN Randomized controlled cluster trial (RCCT) in Year 1 (6/1/2013–5/31/2014) and a pre-post study in Year 2 (6/1/2014–1/31/2015) with data analyzed in 2016. Baseline year was 6/1/2012–5/31/2013. Demographic and vaccination data were derived from de-identified EMR extractions. SETTING 25 primary care practices stratified by city (Houston, Pittsburgh), location (rural, urban, suburban) and type (family medicine, internal medicine), randomized to receive the intervention in Year 1 (n=13) or Year 2 (n=12). PARTICIPANTS A cohort of 18,107 patients ≥65 years at baseline with a mean age of 74.2 years; 60.7% were women, 16.5% were non-white and 15.7% were Hispanic. INTERVENTION The Toolkit, provider education, and one-on-one coaching of practice-based immunization champions. Outcome measures were 23-valent pneumococcal polysaccharide vaccine (PPSV) and pneumococcal conjugate vaccine (PCV) rates and percentage point (PP) changes. RESULTS In the RCCT, all intervention and control groups had significantly higher PPSV vaccination rates with average increases ranging from 6.5–8.7 PP (P<0.01). The intervention was not related to higher likelihood of PPSV vaccination. In the Year 2 pre-post study, the likelihood of PPSV and PCV vaccination was significantly higher in the active intervention sites than the maintenance sites in Pittsburgh, but not in Houston. CONCLUSION In a randomized controlled cluster trial, both intervention and control groups increased PPSV among adults ≥65 years. In a pre-post study, private primary care practices using the 4 Pillars™ Immunization Toolkit significantly improved PPSV and PCV uptake compared with practices that were in the maintenance phase of the study. PMID:27755655

  9. Multi-Stack Persistent Scatterer Interferometry Analysis in Wider Athens, Greece

    Directory of Open Access Journals (Sweden)

    Ioannis Papoutsis

    2017-03-01

    Full Text Available The wider Athens metropolitan area serves as an interesting setting for conducting geodetic studies. On the one hand, it has a complex regional geotectonic character with several active and blind faults, one of which produced the deadly Mw 5.9 Athens earthquake in September 1999. On the other hand, the Greek capital is heavily urbanized, and construction activities have been taking place in the last few decades to address the city’s needs for advanced infrastructures. This work focuses on estimating ground velocities for the wider Athens area over a period spanning two decades, with extended spatial coverage, increased spatial sampling of the measurements and high precision. The aim is to deliver to the community a reference geodetic database containing consistent and robust velocity estimates to support further studies for modeling and multi-hazard assessment. The analysis employs advanced persistent scatterer interferometry methods, covering Athens with both ascending and descending ERS-1, ERS-2 and Envisat Synthetic Aperture Radar data, forming six independent interferometric stacks. A methodology is developed and applied to exploit track diversity for decomposing the actual surface velocity field into its vertical and horizontal components and coping with the post-processing of the multi-track big data. Results of the time series analysis reveal that a large area containing the Kifisia municipality experienced non-linear motion; while it had been subsiding in the period 1992–1995 (−12 mm/year), the same area has been uplifting since 2005 (+4 mm/year). This behavior is speculated to have its origin in the regional water extraction activities, which, when halted, led to a physical restoration phase of the municipality. In addition, a zoom in on the area affected by the 1999 earthquake shows that there were zones of counter-force horizontal movement prior to the event. Further analysis is suggested to investigate the source and tectonic

  10. Saharan Rock Art: Local Dynamics and Wider Perspectives

    Directory of Open Access Journals (Sweden)

    Marina Gallinaro

    2013-12-01

    Full Text Available Rock art is the best known evidence of the Saharan fragile heritage. Thousands of engraved and painted artworks dot boulders and cliffs in open-air sites, as well as the rock walls of rockshelters and caves located in the main massifs. Since its pioneering discovery in the late 19th century, rock art captured the imagination of travellers and scholars, representing for a long time the main aim of research in the area. Chronology, meaning and connections between the different recognized artistic provinces are still to be fully understood. The central massifs, and in particular the "cultural province" encompassing Tadrart Acacus and Tassili n’Ajer, played and still play a key role in this scenario. Recent analytical and contextual analyses of rock art contexts seem to open new perspectives. Tadrart Acacus, for the richness and variability of artworks, for the huge archaeological data known, and for its proximity to other important areas with rock art (Tassili n’Ajjer, Algerian Tadrart and Messak massifs is an ideal context to analyze the artworks in their environmental and social-cultural context, and to define connections between cultural local dynamics and wider regional perspectives.

  11. From toolkit to framework: The past and future evolution of PhEDEx

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Hernandez, A. [CINVESTAV, IPN; Egeland, R. [Argosy U., Eagan; Huang, C. H. [Fermilab; Ratnikova, N. [Moscow, ITEP; Magini, N. [CERN; Wildish, T. [Princeton U.

    2012-01-01

    PhEDEx is the data-movement solution for CMS at the LHC. Created in 2004, it is now one of the longest-lived components of the CMS dataflow/workflow world. As such, it has undergone significant evolution over time, and continues to evolve today, despite being a fully mature system. Originally a toolkit of agents and utilities dedicated to specific tasks, it is becoming a more open framework that can be used in several ways, both within and beyond its original problem domain. In this talk we describe how a combination of refactoring and adoption of new technologies that have become available over the years has made PhEDEx more flexible, maintainable, and scalable.

  12. Monte Carlo application based on GEANT4 toolkit to simulate a laser–plasma electron beam line for radiobiological studies

    Energy Technology Data Exchange (ETDEWEB)

    Lamia, D., E-mail: debora.lamia@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Russo, G., E-mail: giorgio.russo@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Casarino, C.; Gagliano, L.; Candiano, G.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Labate, L. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); Gizzi, L.A. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Gilardi, M.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR, Segrate (Italy); University of Milano-Bicocca, Milano (Italy)

    2015-06-21

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications. - Highlights: • Development of a Monte Carlo application based on GEANT4 toolkit. • Experimental measurements carried out with a laser-driven acceleration system. • Validation of Geant4 application comparing experimental data with the simulated ones. • Dosimetric characterization of the acceleration system.

  13. SatelliteDL: a Toolkit for Analysis of Heterogeneous Satellite Datasets

    Science.gov (United States)

    Galloy, M. D.; Fillmore, D.

    2014-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real-world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 distributes with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and water vapor profiles. Emphasis will be on NPP Sensor, Environmental and

  14. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their tasks through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with calls to the hardware model's time_compute() function, passing tasklists that model each compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU-core-level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly include the L1, L2, L3 hit rates as inputs to the tasklists. Explicit hit rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
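    To make the tasklist-driven interface described above concrete, the following is a minimal Python sketch of the same pattern. It is not the actual PPT or Simian API: the class name, parameter values and tasklist layout are assumptions chosen for illustration, with only the time_compute() idea taken from the abstract.

      # Minimal sketch of a tasklist-driven core model in the spirit of PPT's
      # time_compute() interface; all names and constants are illustrative
      # assumptions, not the real PPT API.
      class SimpleCoreModel:
          def __init__(self, clock_hz=2.3e9, cpi_alu=1.0, mem_access_s=80e-9):
              self.clock_hz = clock_hz          # nominal clock speed
              self.cpi_alu = cpi_alu            # average cycles per ALU operation
              self.mem_access_s = mem_access_s  # assumed mean memory access time

          def time_compute(self, tasklist):
              """Predict execution time for an unordered list of (op, count) items."""
              total = 0.0
              for op, count in tasklist:
                  if op == "alu":
                      total += count * self.cpi_alu / self.clock_hz
                  elif op in ("load", "store"):
                      total += count * self.mem_access_s
              return total

      core = SimpleCoreModel()
      kernel = [("alu", 1_000_000), ("load", 250_000), ("store", 125_000)]
      print(f"predicted kernel time: {core.time_compute(kernel):.6f} s")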

  15. EUPAN enables pan-genome studies of a large number of eukaryotic genomes.

    Science.gov (United States)

    Hu, Zhiqiang; Sun, Chen; Lu, Kuang-Chen; Chu, Xixia; Zhao, Yue; Lu, Jinyuan; Shi, Jianxin; Wei, Chaochun

    2017-08-01

    Pan-genome analyses are routinely carried out for bacteria to interpret the within-species gene presence/absence variations (PAVs). However, pan-genome analyses are rare for eukaryotes due to the large sizes and higher complexities of their genomes. Here we proposed EUPAN, a eukaryotic pan-genome analysis toolkit, enabling automatic large-scale eukaryotic pan-genome analyses and detection of gene PAVs at a relatively low sequencing depth. In the previous studies, we demonstrated the effectiveness and high accuracy of EUPAN in the pan-genome analysis of 453 rice genomes, in which we also revealed widespread gene PAVs among individual rice genomes. Moreover, EUPAN can be directly applied to the current re-sequencing projects primarily focusing on single nucleotide polymorphisms. EUPAN is implemented in Perl, R and C ++. It is supported under Linux and preferred for a computer cluster with LSF and SLURM job scheduling system. EUPAN together with its standard operating procedure (SOP) is freely available for non-commercial use (CC BY-NC 4.0) at http://cgm.sjtu.edu.cn/eupan/index.html . ccwei@sjtu.edu.cn or jianxin.shi@sjtu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
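    As a toy illustration of the gene presence/absence (PAV) idea underlying such analyses, the sketch below classifies a gene as present in a sample when a large enough fraction of its body is covered by mapped reads. This is only a sketch of the concept, not EUPAN's pipeline; the 0.95 threshold and the input layout are assumptions.

      # Illustrative presence/absence (PAV) calls from per-gene coverage fractions.
      # Threshold and data layout are assumptions, not EUPAN's actual criteria.
      def call_pav(coverage_by_sample, threshold=0.95):
          """coverage_by_sample: {sample: {gene: fraction of gene body covered}}.
          Returns {sample: {gene: 1 if called present, else 0}}."""
          return {
              sample: {gene: int(frac >= threshold) for gene, frac in genes.items()}
              for sample, genes in coverage_by_sample.items()
          }

      coverage = {
          "rice_A": {"gene1": 0.99, "gene2": 0.10},
          "rice_B": {"gene1": 0.97, "gene2": 0.96},
      }
      print(call_pav(coverage))  # gene2 called absent in rice_A, present in rice_B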

  16. The WHO-ITU national eHealth strategy toolkit as an effective approach to national strategy development and implementation.

    Science.gov (United States)

    Hamilton, Clayton

    2013-01-01

    With few exceptions, national eHealth strategies are the pivotal tools upon which the launch or refocusing of national eHealth programmes is hinged. The process of their development obviates cross-sector ministerial commitment led by the Ministry of Health. Yet countries often grapple with the task of strategy development and best efforts frequently fail to address strategic components of eHealth key to ensure successful implementation and stakeholder engagement. This can result in strategies that are narrowly focused, with an overemphasis placed on achieving technical outcomes. Without a clear link to a broader vision of health system development and a firm commitment from partners, the ability of a strategy to shape development of a national eHealth framework will be undermined and crucial momentum for implementation will be lost. WHO and ITU have sought to address this issue through the development of the National eHealth Strategy Toolkit that provides a basis for the components and processes to be considered in a strategy development or refocusing exercise. We look at this toolkit and highlight those areas which the countries should consider in formulating their national eHealth strategy.

  17. Augmented Reality and Mobile Learning: The State of the Art

    Science.gov (United States)

    FitzGerald, Elizabeth; Ferguson, Rebecca; Adams, Anne; Gaved, Mark; Mor, Yishay; Thomas, Rhodri

    2013-01-01

    In this paper, the authors examine the state of the art in augmented reality (AR) for mobile learning. Previous work in the field of mobile learning has included AR as a component of a wider toolkit but little has been done to discuss the phenomenon in detail or to examine in a balanced fashion its potential for learning, identifying both positive…

  18. Opacplot2: Enabling tabulated EoS and opacity compatibility for HEDLP simulations with the FLASH code

    Science.gov (United States)

    Laune, Jordan; Tzeferacos, Petros; Feister, Scott; Fatenejad, Milad; Yurchak, Roman; Flocke, Norbert; Weide, Klaus; Lamb, Donald

    2017-10-01

    Thermodynamic and opacity properties of materials are necessary to accurately simulate laser-driven laboratory experiments. Such data are compiled in tabular format since the thermodynamic range that needs to be covered cannot be described with one single theoretical model. Moreover, tabulated data can be made available prior to runtime, reducing both compute cost and code complexity. This approach is employed by the FLASH code. Equation of state (EoS) and opacity data come in various formats, matrix-layouts, and file-structures. We discuss recent developments on opacplot2, an open-source Python module that manipulates tabulated EoS and opacity data. We present software that builds upon opacplot2 and enables easy-to-use conversion of different table formats into the IONMIX format, the native tabular input used by FLASH. Our work enables FLASH users to take advantage of a wider range of accurate EoS and opacity tables in simulating HEDLP experiments at the National Laser User Facilities.
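    One step that such format conversions typically involve is re-gridding a tabulated quantity onto the (density, temperature) grid expected by the destination format. The sketch below shows that step in Python with SciPy; the grids, units and the stand-in pressure table are assumptions for illustration, and this is not opacplot2 or IONMIX code.

      # Sketch of re-gridding a tabulated EoS quantity onto a target grid.
      # Grids, units and the stand-in data are illustrative assumptions.
      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      dens = np.logspace(-3, 1, 20)    # source density grid (g/cm^3, assumed)
      temp = np.logspace(0, 3, 30)     # source temperature grid (eV, assumed)
      pres = np.outer(dens, temp)      # stand-in for a real tabulated pressure

      interp = RegularGridInterpolator((dens, temp), pres)

      dens_t = np.logspace(-3, 1, 64)  # target grid required by the output format
      temp_t = np.logspace(0, 3, 64)
      dd, tt = np.meshgrid(dens_t, temp_t, indexing="ij")
      pres_t = interp(np.stack([dd.ravel(), tt.ravel()], axis=-1)).reshape(dd.shape)
      print(pres_t.shape)              # (64, 64) table ready for the target layout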

  19. Java advanced medical image toolkit

    International Nuclear Information System (INIS)

    Saunder, T.H.C.; O'Keefe, G.J.; Scott, A.M.

    2002-01-01

    Full text: The Java Advanced Medical Image Toolkit (jAMIT) has been developed at the Center for PET and Department of Nuclear Medicine in an effort to provide a suite of tools that can be utilised in applications required to perform analysis, processing and visualisation of medical images. jAMIT uses Java Advanced Imaging (JAI) to combine the platform independent nature of Java with the speed benefits associated with native code. The object-orientated nature of Java allows the production of an extensible and robust package which is easily maintained. In addition to jAMIT, a Medical Image I/O API called Sushi has been developed to provide access to many commonly used image formats. These include DICOM, Analyze, MINC/NetCDF, Trionix, ECAT 6.4, Interfile 3.2/3.3 and Odyssey. This allows jAMIT to access data and study information contained in different medical image formats transparently. Additional formats can be added at any time without any modification to the jAMIT package. Tools available in jAMIT include 2D ROI Analysis, Palette Thresholding, Image Cropping, Image Transposition, Scaling, Maximum Intensity Projection, Image Fusion, Image Annotation and Format Conversion. Future tools may include 2D Linear and Non-linear Registration, PET SUV Calculation, 3D Rendering and 3D ROI Analysis. Applications currently using jAMIT include Antibody Dosimetry Analysis, Mean Hemispheric Blood Flow Analysis, QuickViewing of PET Studies for Clinical Training, Pharmacodynamic Modelling based on Planar Imaging, and Medical Image Format Conversion. The use of jAMIT and Sushi for scripting and analysis in Matlab v6.1 and Jython is currently being explored. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  20. IOC-UNEP regional workshop to review priorities for marine pollution monitoring, research, control and abatement in the wider Caribbean

    International Nuclear Information System (INIS)

    1989-01-01

    The IOC-UNEP Regional Workshop to Review Priorities for Marine Pollution Monitoring, Research, Control and Abatement in the Wider Caribbean Region (San Jose, 24-30 August 1989) examined a possible general framework for a regionally co-ordinated comprehensive joint IOC/UNEP programme for marine pollution assessment and control in the Wider Caribbean region (CEPPOL). The overall objective of CEPPOL is to establish a regionally co-ordinated comprehensive joint IOC/UNEP Marine Pollution Assessment and Control Programme catering to the immediate and long-term requirements of the Cartagena Convention as well as the requirements of the member States of IOCARIBE. The specific objectives of the programme are: (i) To organize and carry out a regionally co-ordinated marine pollution monitoring and research programme concentrating on contaminants and pollutants affecting the quality of the marine and coastal environment, as well as human health in the Wider Caribbean, and to interpret/assess the results of the programme as part of the scientific basis for the region; (ii) To generate information on the sources, levels, amounts, trends and effects of marine pollution within the Wider Caribbean region as an additional component of the scientific basis upon which the formulation of proposals for preventive and remedial actions can be based; (iii) To formulate proposals for technical, administrative and legal pollution control, abatement, and preventive measures and to assist the Governments in the region in implementing and evaluating their effectiveness; and (iv) To strengthen and, when necessary, to develop/establish the capabilities of national institutions to carry out marine pollution monitoring and research, as well as to formulate and apply pollution control and abatement measures.

  1. Disease Prevention in the Age of Convergence - the Need for a Wider, Long Ranging and Collaborative Vision

    Directory of Open Access Journals (Sweden)

    Susan L. Prescott

    2014-01-01

    Full Text Available It is time to bring our imagination, creativity and passion to the fore in solving the global challenges of our age. Our global health crisis and the pandemic of noncommunicable diseases (NCDs) is clearly rooted in complex modern societal and environmental changes, many of which have effects on developing immune and metabolic responses. It is intimately related to wider environmental challenges. And it is unsurprising that many NCDs share similar risk factors and that many are associated with a rising predisposition for inflammation. Allergy is one of the earliest signs of environmental impact on these biological pathways, and may also offer an early barometer to assess the effects of early interventions. There is dawning awareness of how changing microbial diversity, nutritional patterns, sedentary indoor behaviours and modern pollutants adversely affect early metabolic and immune development, but still much to understand about the complexity of these interactions. Even when we do harness the science and technology, these will not provide solutions unless we also address the wider social, cultural and economic determinants of health - addressing the interconnections between human health and the health of our environment. Now more than ever, we need a wider vision and a greater sense of collective responsibility. We need long-range approaches that aim for lifelong benefits of a ‘healthier start to life’, and stronger cross-sectoral collaborations to prevent disease. We need to give both our hearts and our minds to solving these global issues.

  2. A Multipurpose Toolkit to Enable Advanced Genome Engineering in Plants

    Czech Academy of Sciences Publication Activity Database

    Čermák, Tomáš; Curtin, S.J.; Gil-Humanes, J.; Čegan, Radim; Kono, T.J.Y.; Konecna, E.; Belanto, J.J.; Starker, C.G.; Mathre, J.W.; Greenstein, R.L.; Voytas, D.F.

    2017-01-01

    Roč. 29, č. 6 (2017), s. 1196-1217 ISSN 1040-4651 Institutional support: RVO:68081707 Keywords : crispr /cas9-mediated targeted mutagenesis * zinc-finger nucleases * panicum-virgatum l. Subject RIV: CE - Biochemistry OBOR OECD: Biochemistry and molecular biology Impact factor: 8.688, year: 2016

  3. A Gateway MultiSite recombination cloning toolkit.

    Directory of Open Access Journals (Sweden)

    Lena K Petersen

    Full Text Available The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two, three, and four fragment Gateway MultiSite recombination cloning whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org).

  4. Mocapy++ - A toolkit for inference and learning in dynamic Bayesian networks

    Directory of Open Access Journals (Sweden)

    Hamelryck Thomas

    2010-03-01

    Full Text Available Abstract Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations). Results The program package is freely available under the GNU General Public Licence (GPL) from SourceForge http://sourceforge.net/projects/mocapy. The package contains the source for building the Mocapy++ library, several usage examples and the user manual. Conclusions Mocapy++ is especially suitable for constructing probabilistic models of biomolecular structure, due to its support for directional statistics. In particular, it supports the Kent distribution on the sphere and the bivariate von Mises distribution on the torus. These distributions have proven useful to formulate probabilistic models of protein and RNA structure in atomic detail.

  5. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    International Nuclear Information System (INIS)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J; Perl, J; Piersimoni, P; Ramos-Mendez, J; Faddegon, B

    2016-01-01

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron-scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  6. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    Energy Technology Data Exchange (ETDEWEB)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J [Massachusetts General Hospital & Harvard Med. School, Boston, MA (United States); Perl, J [Stanford Linear Accelerator Center, Menlo Park, CA (United States); Piersimoni, P; Ramos-Mendez, J; Faddegon, B [University of California, San Francisco, San Francisco, CA (United States)

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron-scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  7. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Merzari, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Y. Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Thomas, J. W. [Argonne National Lab. (ANL), Argonne, IL (United States); Obabko, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Jain, Rajeev [Argonne National Lab. (ANL), Argonne, IL (United States); Mahadevan, Vijay [Argonne National Lab. (ANL), Argonne, IL (United States); Tautges, Timothy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Solberg, Jerome [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferencz, Robert Mark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Whitesides, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-21

    This report describes how to employ SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.

  8. GEANT 4: an Object-Oriented toolkit for simulation in HEP

    CERN Multimedia

    Kent, P; Sirotenko, V; Komogorov, M; Pavliouk, A; Greeniaus, G L; Kayal, P I; Routenburg, P; Tanaka, S; Duellmann, D; Innocente, V; Paoli, S; Ranjard, F; Riccardi, F; Ruggier, M; Shiers, J; Egli, S; Kimura, A; Urban, P; Prior, S; Walkden, A; Forti, A; Magni, S; Strahl, K; Kokoulin, R; Braune, K; Volcker, C; Ullrich, T; Takahata, M; Nieminen, P; Ballocchi, G; Mora De Freitas, P; Verderi, M; Rybine, A; Langeveld, W; Nagamatsu, M; Hamatsu, R; Katayama, N; Chuma, J; Felawka, L; Gumplinger, P; Axen, D

    2002-01-01

    RD44 - The GEANT4 software has been developed by a world-wide collaboration of about 100 scientists from over 40 institutions and laboratories participating in more than 10 experiments in Europe, Russia, Japan, Canada, and the United States. The GEANT4 detector simulation toolkit has been designed for the next generation of High Energy Physics (HEP) experiments, with primary requirements from the LHC, the CP violation, and the heavy ion experiments. In addition, GEANT4 also meets the requirements of the space and medical communities, thanks to very low energy extensions developed in a joint project with the European Space Agency (ESA). GEANT4 has exploited advanced software engineering techniques (for example PSS-05) and Object-Oriented technology to improve the validation process of the physics results, and at the same time to make possible distributed software design and development within the world-wide collaboration. Fifteen specialised working groups have been responsible for fields as diver...

  9. The "Pesticides and Farmworker Health Toolkit" : An Innovative Model for Developing an Evidence-Informed Program for a Low-Literacy, Latino Immigrant Audience

    Science.gov (United States)

    LePrevost, Catherine E.; Storm, Julia F.; Asuaje, Cesar R.; Cope, W. Gregory

    2014-01-01

    Migrant and seasonal farmworkers are typically Spanish-speaking, Latino immigrants with limited formal education and low literacy skills and, as such, are a vulnerable population. We describe the development of the "Pesticides and Farmworker Health Toolkit", a pesticide safety and health curriculum designed to communicate to farmworkers…

  10. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    Science.gov (United States)

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

    The Tobii Eyex Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with a SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e. accuracy < 0.6°, precision < 0.25°, latency < 50 ms and sampling frequency ≈55 Hz), is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.

  11. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, Ritesh, E-mail: ritesh@ipr.res.in; Swamy, Rajamannar, E-mail: rajamannar@ipr.res.in; Khirwadkar, Samir, E-mail: sameer@ipr.res.in

    2016-11-15

    Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open source, cross platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: The high heat flux testing and characterization of the divertor and first wall components are a challenging engineering problem of a tokamak. These components are subject to steady state and transient heat load of high magnitude. Therefore, the accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFC), the integration of computations and experimental control is an essential requirement. Experimental physics and industrial control system (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open source alternative for numerical computations and scripting. We have integrated these two open source technologies to develop graphical software for a typical high heat flux experiment. The implementation uses EPICS based tools, namely IOC (I/O controller) server and control system studio (CSS), and Python based tools, namely Numpy, Scipy, Matplotlib and NOSE. EPICS and Python are integrated using the PyEpics library. This toolkit is currently under operation at the high heat flux test facility at the Institute for Plasma Research (IPR) and is also useful for experimental labs working in similar research areas. The paper reports the software architectural design, implementation tools and rationale for their selection, test and validation.
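    The EPICS-plus-Python pattern described above can be sketched in a few lines with the PyEpics channel-access calls (caget/caput); the process variable names, the toy margin formula and the setpoint values below are purely illustrative assumptions and not the toolkit's actual logic.

      # Minimal sketch of reading PVs with PyEpics, deriving a quantity with
      # NumPy, and writing a setpoint back. PV names and formulas are invented
      # for illustration; only caget/caput are real PyEpics calls.
      import numpy as np
      from epics import caget, caput

      def update_coolant_setpoint():
          heat_flux = caget("HHF:TARGET:HeatFlux")     # hypothetical PV, MW/m^2
          inlet_temp = caget("HHF:COOLANT:InletTemp")  # hypothetical PV, deg C
          if heat_flux is None or inlet_temp is None:
              raise RuntimeError("PV read failed; check the IOC connection")
          # Toy margin calculation standing in for a real CHF correlation.
          margin = np.clip(1.0 - heat_flux / 30.0 - (inlet_temp - 20.0) / 500.0,
                           0.0, 1.0)
          flow_setpoint = 2.0 + 8.0 * (1.0 - margin)   # l/s, illustrative only
          caput("HHF:COOLANT:FlowSetpoint", flow_setpoint)
          return flow_setpoint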

  12. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    International Nuclear Information System (INIS)

    Sugandhi, Ritesh; Swamy, Rajamannar; Khirwadkar, Samir

    2016-01-01

    Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open source, cross platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: The high heat flux testing and characterization of the divertor and first wall components are a challenging engineering problem of a tokamak. These components are subject to steady state and transient heat load of high magnitude. Therefore, the accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFC), the integration of computations and experimental control is an essential requirement. Experimental physics and industrial control system (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open source alternative for numerical computations and scripting. We have integrated these two open source technologies to develop graphical software for a typical high heat flux experiment. The implementation uses EPICS based tools, namely IOC (I/O controller) server and control system studio (CSS), and Python based tools, namely Numpy, Scipy, Matplotlib and NOSE. EPICS and Python are integrated using the PyEpics library. This toolkit is currently under operation at the high heat flux test facility at the Institute for Plasma Research (IPR) and is also useful for experimental labs working in similar research areas. The paper reports the software architectural design, implementation tools and rationale for their selection, test and validation.

  13. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)

    Directory of Open Access Journals (Sweden)

    Oliver Melvin J

    2005-04-01

    Full Text Available Abstract Background BLAST is one of the most common and useful tools for Genetic Research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy to use, fault tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows based machines across local area networks (LAN). W.ND-BLAST provides intuitive Graphic User Interfaces (GUI) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is

  14. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    Directory of Open Access Journals (Sweden)

    Lerendegui-Marco J.

    2017-01-01

    Full Text Available Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes can not be measured or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  15. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    Science.gov (United States)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes can not be measured or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  16. Wider-community Segregation and the Effect of Neighbourhood Ethnic Diversity on Social Capital: An Investigation into Intra-Neighbourhood Trust in Great Britain and London.

    Science.gov (United States)

    Laurence, James

    2017-10-01

    Extensive research has demonstrated that neighbourhood ethnic diversity is negatively associated with intra-neighbourhood social capital. This study explores the role of segregation and integration in this relationship. To do so it applies three-level hierarchical linear models to two sets of data from across Great Britain and within London, and examines how segregation across the wider-community in which a neighbourhood is nested impacts trust amongst neighbours. This study replicates the increasingly ubiquitous finding that neighbourhood diversity is negatively associated with neighbour-trust. However, we demonstrate that this relationship is highly dependent on the level of segregation across the wider-community in which a neighbourhood is nested. Increasing neighbourhood diversity only negatively impacts neighbour-trust when nested in more segregated wider-communities. Individuals living in diverse neighbourhoods nested within integrated wider-communities experience no trust-penalty. These findings show that segregation plays a critical role in the neighbourhood diversity/trust relationship, and that its absence from the literature biases our understanding of how ethnic diversity affects social cohesion.

  17. ‘Survival’: a simulation toolkit introducing a modular approach for radiobiological evaluations in ion beam therapy

    Science.gov (United States)

    Manganaro, L.; Russo, G.; Bourhaleb, F.; Fausti, F.; Giordanengo, S.; Monaco, V.; Sacchi, R.; Vignati, A.; Cirio, R.; Attili, A.

    2018-04-01

    One major rationale for the application of heavy ion beams in tumour therapy is their increased relative biological effectiveness (RBE). The complex dependencies of the RBE on dose, biological endpoint, position in the field etc require the use of biophysical models in treatment planning and clinical analysis. This study aims to introduce a new software, named ‘Survival’, to facilitate the radiobiological computations needed in ion therapy. The simulation toolkit was written in C++ and it was developed with a modular architecture in order to easily incorporate different radiobiological models. The following models were successfully implemented: the local effect model (LEM, version I, II and III) and variants of the microdosimetric-kinetic model (MKM). Different numerical evaluation approaches were also implemented: Monte Carlo (MC) numerical methods and a set of faster analytical approximations. Among the possible applications, the toolkit was used to reproduce the RBE versus LET for different ions (proton, He, C, O, Ne) and different cell lines (CHO, HSG). Intercomparison between different models (LEM and MKM) and computational approaches (MC and fast approximations) were performed. The developed software could represent an important tool for the evaluation of the biological effectiveness of charged particles in ion beam therapy, in particular when coupled with treatment simulations. Its modular architecture facilitates benchmarking and inter-comparison between different models and evaluation approaches. The code is open source (GPL2 license) and available at https://github.com/batuff/Survival.
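    The RBE figures that such toolkits compute ultimately refer back to the standard linear-quadratic (LQ) description of cell survival. The Python sketch below shows that textbook relation and an RBE evaluation at a fixed survival level; it is not the Survival code nor the LEM/MKM models themselves, and the alpha/beta values are arbitrary examples.

      # Standard LQ survival model and an RBE evaluation at 10% survival.
      # Parameter values are arbitrary; this is textbook formalism, not the
      # Survival toolkit's implementation.
      import numpy as np

      def survival_fraction(dose, alpha, beta):
          """LQ model: S(D) = exp(-(alpha*D + beta*D^2))."""
          return np.exp(-(alpha * dose + beta * dose**2))

      def dose_for_survival(s_target, alpha, beta):
          """Dose giving survival s_target (positive root of the LQ quadratic)."""
          c = -np.log(s_target)
          return (-alpha + np.sqrt(alpha**2 + 4 * beta * c)) / (2 * beta)

      print(f"S(2 Gy, photons) = {survival_fraction(2.0, 0.15, 0.05):.3f}")
      d_photon = dose_for_survival(0.1, alpha=0.15, beta=0.05)  # reference beam
      d_ion = dose_for_survival(0.1, alpha=0.60, beta=0.05)     # ion beam
      print(f"RBE at 10% survival: {d_photon / d_ion:.2f}")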

  18. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit

    OpenAIRE

    Jon Smart; Michael Zdradzinski; Sarah Roth; Alecia Gende; Kylie Conroy; Nicole Battaglioli

    2018-01-01

    Introduction: Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. Methods: As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a li...

  19. PDB@: an offline toolkit for exploration and analysis of PDB files.

    Science.gov (United States)

    Mani, Udayakumar; Ravisankar, Sadhana; Ramakrishnan, Sai Mukund

    2013-12-01

    Protein Data Bank (PDB) is a freely accessible archive of the 3-D structural data of biological molecules. Structure based studies offer a unique vantage point in inferring the properties of a protein molecule from structural data. This is too big a task to be done manually. Moreover, there is no single tool, software or server that comprehensively analyses all structure-based properties. The objective of the present work is to develop an offline computational toolkit, PDB@, containing in-built algorithms that help categorize the structural properties of a protein molecule. The user has the facility to view and edit the PDB file to their need. Some features of the present work are unique in themselves and others are an improvement over existing tools. Also, the representation of protein properties in both graphical and textual formats helps in predicting all the necessary details of a protein molecule on a single platform.
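    As an example of the kind of structure-derived quantity such a toolkit reports, the sketch below parses ATOM records from a PDB file (fixed-column format) and computes the geometric centre of the C-alpha atoms. This is generic PDB handling in Python, not PDB@ code, and the file name is a placeholder.

      # Parse ATOM records of a PDB file and compute the C-alpha centroid.
      # Generic PDB-format handling; the input path is a placeholder.
      def calpha_centroid(pdb_path):
          xs, ys, zs = [], [], []
          with open(pdb_path) as handle:
              for line in handle:
                  if line.startswith("ATOM") and line[12:16].strip() == "CA":
                      xs.append(float(line[30:38]))  # x, columns 31-38
                      ys.append(float(line[38:46]))  # y, columns 39-46
                      zs.append(float(line[46:54]))  # z, columns 47-54
          if not xs:
              raise ValueError("no C-alpha atoms found")
          n = len(xs)
          return sum(xs) / n, sum(ys) / n, sum(zs) / n

      # print(calpha_centroid("example.pdb"))  # placeholder file name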

  20. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis scripts part. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware that seamlessly integrates git repository management systems such as Github or Gitlab, Docker and Jupyter, helping with a) sharing results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but all the depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch. They could start contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.

  1. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    Science.gov (United States)

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.
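    Two of the textbook quantities computable directly from allele frequencies, the kind of intermediate metadata PopSc accepts, are expected heterozygosity and Nei's G_ST. The Python sketch below uses the standard definitions; it is not PopSc's implementation, and the example frequencies are invented.

      # Expected heterozygosity and Nei's G_ST from allele frequencies
      # (standard definitions; example frequencies are made up).
      def expected_heterozygosity(freqs):
          """He = 1 - sum(p_i^2) over allele frequencies p_i at one locus."""
          return 1.0 - sum(p * p for p in freqs)

      def gst_two_populations(freqs_pop1, freqs_pop2):
          """Nei's G_ST = (H_T - H_S) / H_T for one locus in two populations."""
          h_s = 0.5 * (expected_heterozygosity(freqs_pop1)
                       + expected_heterozygosity(freqs_pop2))
          pooled = [0.5 * (p + q) for p, q in zip(freqs_pop1, freqs_pop2)]
          h_t = expected_heterozygosity(pooled)
          return (h_t - h_s) / h_t

      pop1 = [0.8, 0.2]  # allele frequencies at a biallelic locus
      pop2 = [0.3, 0.7]
      print(expected_heterozygosity(pop1))    # 0.32
      print(gst_two_populations(pop1, pop2))  # about 0.25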

  2. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    Directory of Open Access Journals (Sweden)

    Shi-Yi Chen

    Full Text Available Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.

  3. A physical and engineering study on the irradiation techniques in neutron capture therapy aiming for wider application

    International Nuclear Information System (INIS)

    Sakurai, Y.; Ono, K.; Suzuki, M.; Katoh, I.; Miyatake, S.-I.; Yanagie, H.

    2003-01-01

    Solo-irradiation with thermal neutrons has been applied to brain cancer and malignant melanoma in boron neutron capture therapy (BNCT) at the medical irradiation facility of the Kyoto University Reactor (KUR) since the first clinical trial in 1974. In 1997, after the facility remodeling, the application of mixed irradiation of thermal and epi-thermal neutrons was started, and the depth dose distribution for brain cancer was improved to some degree. In 2001, solo-irradiation with epi-thermal neutrons also started; notably, the application to oral cancers started at the same time. The BNCT clinical trials using epi-thermal neutron irradiation at KUR amount to twelve as of March 2003. Seven of these trials, more than half of the total, are for oral cancers. From this fact, we think that wider application to other cancers is required for the future prosperity of BNCT. The cancers treated with BNCT at KUR at the present time are brain cancer, melanoma and oral cancers, as mentioned above. The cancers expected to be treated in the near future are liver cancer, pancreas cancer, lung cancer, tongue cancer, breast cancer, etc. Any of these cancers is almost incurable by other therapies, including other radiation therapies. In the wider application of BNCT to these cancers, dose-distribution control suited to each cancer and/or each site is important. The introduction of multi-directional and/or multi-divisional irradiation is also needed. Here, a physical and engineering study using two-dimensional transport calculations and three-dimensional Monte Carlo simulation of the irradiation techniques in BNCT, aiming for wider application, is reported.

  4. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    Science.gov (United States)

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  5. A Midas Plugin to Enable Construction of Reproducible Web-based Image Processing Pipelines

    Directory of Open Access Journals (Sweden)

    Michael eGrauer

    2013-12-01

    Full Text Available Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based UI, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  6. Health Equity Assessment Toolkit (HEAT): software for exploring and comparing health inequalities in countries

    Directory of Open Access Journals (Sweden)

    Ahmad Reza Hosseinpoor

    2016-10-01

    Full Text Available Abstract Background It is widely recognised that the pursuit of sustainable development cannot be accomplished without addressing inequality, or observed differences between subgroups of a population. Monitoring health inequalities allows for the identification of health topics where major group differences exist, dimensions of inequality that must be prioritised to effect improvements in multiple health domains, and also population subgroups that are multiply disadvantaged. While availability of data to monitor health inequalities is gradually improving, there is a commensurate need to increase, within countries, the technical capacity for analysis of these data and interpretation of results for decision-making. Prior efforts to build capacity have yielded demand for a toolkit with the computational ability to display disaggregated data and summary measures of inequality in an interactive and customisable fashion that would facilitate interpretation and reporting of health inequality in a given country. Methods To answer this demand, the Health Equity Assessment Toolkit (HEAT) was developed between 2014 and 2016. The software, which contains the World Health Organization’s Health Equity Monitor database, allows the assessment of inequalities within a country using over 30 reproductive, maternal, newborn and child health indicators and five dimensions of inequality (economic status, education, place of residence, subnational region and child’s sex, where applicable). Results/Conclusion HEAT was beta-tested in 2015 as part of ongoing capacity building workshops on health inequality monitoring. This is the first and only application of its kind; further developments are proposed to introduce an upload data feature, translate it into different languages and increase interactivity of the software. This article will present the main features and functionalities of HEAT and discuss its relevance and use for health inequality monitoring.

  7. Toolkit for data reduction to tuples for the ATLAS experiment

    International Nuclear Information System (INIS)

    Snyder, Scott; Krasznahorkay, Attila

    2012-01-01

    The final step in a HEP data-processing chain is usually to reduce the data to a ‘tuple’ form which can be efficiently read by interactive analysis tools such as ROOT. Often, this is implemented independently by each group analyzing the data, leading to duplicated effort and needless divergence in the format of the reduced data. ATLAS has implemented a common toolkit for performing this processing step. By using tools from this package, physics analysis groups can produce tuples customized for a particular analysis but which are still consistent in format and vocabulary with those produced by other physics groups. The package is designed so that almost all the code is independent of the specific form used to store the tuple. The code that does depend on this is grouped into a set of small backend packages. While the ROOT backend is the most used, backends also exist for HDF5 and for specialized databases. By now, the majority of ATLAS analyses rely on this package, and it is an important contributor to the ability of ATLAS to rapidly analyze physics data.

  8. YT: A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Turk, Matthew J.; /San Diego, CASS; Smith, Britton D.; /Michigan State U.; Oishi, Jeffrey S.; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Skory, Stephen; Skillman, Samuel W.; /Colorado U., CASA; Abel, Tom; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Norman, Michael L.; /San Diego, CASS

    2011-06-23

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/) an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structure adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.
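
    Since yt is itself a Python package, a brief usage sketch may help; note that it assumes the current yt interface rather than the 2011-era one described in the paper, and the dataset path is a placeholder.

      # Illustrative sketch only: load a dataset, query a physically relevant
      # quantity, and save a projection. The dataset path is a placeholder and
      # the interface shown is the modern one, not the 2011 API.
      import yt

      ds = yt.load("DD0010/DD0010")                       # e.g. an Enzo output
      ad = ds.all_data()                                  # region covering the full domain
      print(ad.quantities.extrema(("gas", "density")))    # min/max gas density

      p = yt.ProjectionPlot(ds, "z", ("gas", "density"))  # column density along z
      p.save("density_projection.png")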

  10. Quick Way to Port Existing C/C++ Chemoinformatics Toolkits to the Web Using Emscripten.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi

    2017-10-23

    Emscripten is a special open source compiler that compiles C and C++ code into JavaScript. By utilizing this compiler, some typical C/C++ chemoinformatics toolkits and libraries are quickly ported to the web. The compiled JavaScript files have sizes similar to the native programs, and, from a series of constructed benchmarks, the performance of the compiled JavaScript code is close to that of the native code and better than that of handwritten JavaScript code. Therefore, we believe that Emscripten is a feasible and practical tool for reusing existing C/C++ code on the web, and many other chemoinformatics or molecular calculation software tools can also be easily ported using Emscripten.

  11. Hard-to-fill vacancies.

    Science.gov (United States)

    Williams, Ruth

    2010-09-29

    Skills for Health has launched a set of resources to help healthcare employers tackle hard-to-fill entry-level vacancies and provide sustainable employment for local unemployed people. The Sector Employability Toolkit aims to reduce recruitment and retention costs for entry-level posts, prepare people for employment through pre-job training programmes, and support employers in developing local partnerships that give access to wider pools of candidates and funding streams.

  12. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. These tools will empower researchers and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow resulting from the distributed peer code review system, was high (0.46).
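
    As an indication of how small a scriptable, testable ITK pipeline can be, the sketch below assumes ITK 5's snake_case functional Python interface; the file names and the particular filter are placeholders.

      # Illustrative sketch only: read, filter, write with the ITK Python wrapping.
      # File names and the radius value are placeholders.
      import itk

      image = itk.imread("input.nrrd", itk.F)               # 32-bit float pixels
      smoothed = itk.median_image_filter(image, radius=2)    # simple denoising step
      itk.imwrite(smoothed, "output.nrrd")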

  13. Modular toolkit for Data Processing (MDP): a Python data processing framework

    Directory of Open Access Journals (Sweden)

    Tiziano Zito

    2009-01-01

    Full Text Available Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework that can easily be expanded. The implementation of new algorithms is easy and intuitive, and newly implemented units are automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units also make it a useful educational tool.
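
    As a minimal usage sketch (the node choice and dimensions are arbitrary, not taken from the paper), two MDP units can be chained into a flow, trained, and executed as follows.

      # Illustrative sketch only: chain PCA and Slow Feature Analysis in an MDP flow.
      import mdp
      import numpy as np

      x = np.random.random((1000, 20))       # 1000 observations, 20 variables

      flow = mdp.Flow([mdp.nodes.PCANode(output_dim=10),
                       mdp.nodes.SFANode(output_dim=3)])
      flow.train(x)                          # trains each node in sequence
      y = flow(x)                            # execute the trained flow
      print(y.shape)                         # (1000, 3)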

  14. iDC: A comprehensive toolkit for the analysis of residual dipolar couplings for macromolecular structure determination

    International Nuclear Information System (INIS)

    Wei Yufeng; Werner, Milton H.

    2006-01-01

    Measurement of residual dipolar couplings (RDCs) has become an important method for the determination and validation of protein or nucleic acid structures by NMR spectroscopy. A number of toolkits have been devised for the handling of RDC data which run in the Linux/Unix operating environment and require specifically formatted input files. The outputs from these programs, while informative, require format modification before the data can be incorporated into the personal computer programs commonly used for manuscript preparation. To bridge the gap between analysis and publication, an easy-to-use, comprehensive toolkit for RDC analysis has been created: iDC. iDC is written for the WaveMetrics Igor Pro mathematics program, a widely used graphing and data analysis application that runs on both Windows PC and Mac OS X computers. Experimental RDC values can be loaded into iDC using simple data formats accessible to Igor's tabular data function. The program can perform the most useful RDC analyses, including alignment tensor estimation from a histogram of RDC occurrence versus value and order tensor analysis by singular value decomposition (SVD). SVD analysis can be performed on an entire structure family at once, a feature missing in other applications of this kind. iDC can also import from and export to several other commonly used programs for the analysis of RDC data (DC, PALES, REDCAT) and can prepare formatted files for RDC-based refinement of macromolecular structures using XPLOR-NIH, CNS and ARIA. The graphical user interface provides easy-to-use I/O for data, structures and formatted outputs.
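
    iDC itself runs inside Igor Pro, but the order tensor analysis it performs is compact enough to sketch in a language-neutral way. Assuming each coupling (pre-scaled by Dmax) obeys D = b^T S b for a unit bond vector b and a traceless, symmetric Saupe matrix S, the five independent elements of S can be fitted by least squares (the SVD method). The vectors and couplings below are invented and the code is not iDC's implementation.

      # Illustrative sketch only: least-squares (SVD) fit of the Saupe order tensor
      # from RDCs. Inputs are synthetic; couplings are assumed pre-scaled by Dmax.
      import numpy as np

      def fit_saupe(bond_vectors, rdcs):
          v = np.asarray(bond_vectors, dtype=float)
          v /= np.linalg.norm(v, axis=1, keepdims=True)
          x, y, z = v[:, 0], v[:, 1], v[:, 2]
          # Each RDC is linear in (Syy, Szz, Sxy, Sxz, Syz), with Sxx = -(Syy + Szz).
          A = np.column_stack([y**2 - x**2, z**2 - x**2, 2*x*y, 2*x*z, 2*y*z])
          s, *_ = np.linalg.lstsq(A, np.asarray(rdcs, dtype=float), rcond=None)
          syy, szz, sxy, sxz, syz = s
          S = np.array([[-(syy + szz), sxy, sxz],
                        [sxy, syy, syz],
                        [sxz, syz, szz]])
          return np.linalg.eigvalsh(S)       # principal order parameters

      print(fit_saupe(np.random.randn(20, 3), np.random.randn(20)))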

  15. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A

    2011-07-01

    In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., running multiple RELAP-7 simulations where the sequencing/timing of events has been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how a power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed-particle hydrodynamics code, NEUTRINO.

  16. A system for rapid prototyping of hearts with congenital malformations based on the medical imaging interaction toolkit (MITK)

    Science.gov (United States)

    Wolf, Ivo; Böttger, Thomas; Rietdorf, Urte; Maleike, Daniel; Greil, Gerald; Sieverding, Ludger; Miller, Stephan; Mottl-Link, Sibylle; Meinzer, Hans-Peter

    2006-03-01

    Precise knowledge of the individual cardiac anatomy is essential for the diagnosis and treatment of congenital heart disease. Complex malformations of the heart can best be comprehended not from images but from anatomic specimens. Physical models can be created from data using rapid prototyping techniques, e.g., laser sintering or 3D printing. We have developed a system for obtaining data that show the relevant cardiac anatomy from high-resolution CT/MR images and are suitable for rapid prototyping. The challenge is to preserve all relevant details unaltered in the produced models. The main anatomical structures of interest are the four heart cavities (atria, ventricles), the valves and the septum separating the cavities, and the great vessels. These can be shown either by reproducing the morphology itself or by producing a model of the blood pool, thus creating a negative of the morphology. Algorithmically, the key issue is segmentation. Practically, facilities that allow the cardiologist or cardiac surgeon to interactively check and correct the segmentation are even more important, owing to the complex, irregular anatomy and imaging artefacts. The paper presents the algorithmic and interactive processing steps implemented in the system, which is based on the open-source Medical Imaging Interaction Toolkit (MITK, www.mitk.org). It is shown how the principles used in MITK enable the system to be assembled from modules (functionalities) developed independently of each other. The system makes it possible to produce models of the heart (and other anatomic structures) of individual patients as well as to reproduce unique specimens from pathology collections for teaching purposes.

  17. IN MY OPINION: Physics in the wider context

    Science.gov (United States)

    Morris, Andrew

    1999-11-01

    and progression opportunities for science specialists, whilst ensuring that the general public are scientifically literate. I think physics education has a serious contribution to make to all sections of society: the specialist, preparing for and progressing in a scientific/technological career; the skilled worker, analysing, understanding and innovating in any occupation; the citizen, coping with increasing complexity in society; the individual, trying to understand the world into which they were born. To continue improving our educational systems and to assist each of these groups demands a grand alliance of people involved in physics education. Reflecting first on the wider context can help us choose appropriate points at which to intervene. Otherwise, educational improvement may be hampered, with valuable effort expended on positive reform actions rendered useless by constraints elsewhere in the system. How has the subject and its place in the curriculum evolved? What can be learned from previous curriculum innovations? What do public perceptions of physics tell us? The aim of the fifth Shaping the Future booklet is to encourage debate about where reform efforts should best be directed. Contributors will include Steve Adams, Michael Barnett, Sheila Carlton, John Berkeley, Martin Hollins, Marilyn Holyoake, Andrew Hunt, Roland Jackson, Jon Ogborn, Russell Stannard and Charles Thomas. A Discussion Meeting based on Physics in a wider context, at the ASE Annual Meeting, Leeds, promises to be lively. I hope you will come and express your views! If you would like to attend the meeting, to be held on 7 January 2000, and be sent a free copy of the manuscript for the 48-page booklet in advance, please contact: Ingrid Ebeyer, Post-16 Initiative, Institute of Physics, 76 Portland Place, London W1N 3DH (e-mail: 16-19project@iop.org)

  18. Early Detection of Clinically Significant Prostate Cancer Using Ultrasonic Acoustic Radiation Force Impulse (ARFI) Imaging

    Science.gov (United States)

    2017-10-01

    Abstract fragment (extraction-damaged): the recoverable content describes a toolkit built on the Image-Guided Surgery Toolkit (IGSTK) for rapid 3D visualization and image volume interpretation, followed by automated transducer positioning in a user-selected image plane.

  19. Fluorescent Bisphosphonate and Carboxyphosphonate Probes: A Versatile Imaging Toolkit for Applications in Bone Biology and Biomedicine.

    Science.gov (United States)

    Sun, Shuting; Błażewska, Katarzyna M; Kadina, Anastasia P; Kashemirov, Boris A; Duan, Xuchen; Triffitt, James T; Dunford, James E; Russell, R Graham G; Ebetino, Frank H; Roelofs, Anke J; Coxon, Fraser P; Lundy, Mark W; McKenna, Charles E

    2016-02-17

    A bone imaging toolkit of 21 fluorescent probes with variable spectroscopic properties, bone mineral binding affinities, and antiprenylation activities has been created, including a novel linking strategy. The linking chemistry allows attachment of a diverse selection of dyes fluorescent in the visible to near-infrared range to any of the three clinically important heterocyclic bisphosphonate bone drugs (risedronate, zoledronate, and minodronate or their analogues). The resultant suite of conjugates offers multiple options to "mix and match" parent drug structure, fluorescence emission wavelength, relative bone affinity, and presence or absence of antiprenylation activity, for bone-related imaging applications.

  20. Water Security Toolkit User Manual Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2, Copyright © 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below). In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py

  1. The Arabic culture of Jordan and its impacts on a wider Jordanian adoption of business continuity management.

    Science.gov (United States)

    Sawalha, Ihab H; Meaton, Julia

    2012-01-01

    Culture is important to individuals and societies, as well as organisations. Failing to address cultural aspects will hinder the wider adoption and development of business continuity management (BCM) and will subsequently increase the vulnerabilities of organisations to crises, disasters and business interruptions. Three main issues are discussed in this paper. The first is the background to culture and the characteristics of the Jordanian culture. Secondly, the influence of the Arab culture on the wider adoption and development of BCM in Jordan is considered. Thirdly, the paper looks at potential factors that underpin the role of culture in the BCM process in Jordan. These issues are significant, as they represent the characteristics and influence of the Arab culture. This paper contributes to the understanding of the significance of culture in the adoption and development of BCM for organisations operating in Jordan and in the Arab world more generally. It also highlights current cultural changes and trends taking place in the Arab world in a time of huge political instability in the Middle East and Arab countries.

  2. A Toolkit For CryoSat Investigations By The ESRIN EOP-SER Altimetry Team

    Science.gov (United States)

    Dinardo, Salvatore; Bruno, Lucas; Benveniste, Jerome

    2013-12-01

    The scope of this work is to present the new tool for the exploitation of CryoSat data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The tool framework is composed of two separate components: the first handles data collection and management, the second is the processing toolkit. The CryoSat FBR (Full Bit Rate) data are downlinked uncompressed from the satellite and contain un-averaged individual echoes. These data are made available on the Kiruna CalVal server in a 10-day rolling archive. Daily at ESRIN, all the CryoSat FBR data in SAR and SARin mode (around 30 gigabytes) are downloaded, catalogued and archived on local ESRIN EOP-SER workstations. As of March 2013, the total amount of FBR data is over 9 terabytes, with CryoSat acquisition dates spanning January 2011 to February 2013 (with some gaps). This archive was built by merging partial datasets from ESTEC and NOAA that were kindly made available to the EOP-SER team. On-demand access to this low-level data is restricted to expert users with validated ESA P.I. credentials. Currently the main users of the archiving functionality are the team members of the project CP4O (STSE - CryoSat Plus for Ocean), CNES and NOAA. The second component of the service is the processing toolkit. The EOP-SER workstations host internally and independently developed software that can process the FBR data in SAR/SARin mode to generate multi-looked echoes (Level 1B) and subsequently re-track them in SAR and SARin mode (Level 2) over the open ocean, exploiting the SAMOSA model and other internally developed models. The processing segment is used for research and development purposes, supporting the awarded development contracts by cross-checking the deliverables to ESA, on-site demonstrations/training for selected users, cross-comparison against third-party products (the CLS/CNES CPP products, for instance), preparation

  3. Integrating the protein and metabolic engineering toolkits for next-generation chemical biosynthesis.

    Science.gov (United States)

    Pirie, Christopher M; De Mey, Marjan; Jones Prather, Kristala L; Ajikumar, Parayil Kumaran

    2013-04-19

    Through microbial engineering, biosynthesis has the potential to produce thousands of chemicals used in everyday life. Metabolic engineering and synthetic biology are fields driven by the manipulation of genes, genetic regulatory systems, and enzymatic pathways for developing highly productive microbial strains. Fundamentally, it is the biochemical characteristics of the enzymes themselves that dictate flux through a biosynthetic pathway toward the product of interest. As metabolic engineers target sophisticated secondary metabolites, there has been little recognition of the reduced catalytic activity and increased substrate/product promiscuity of the corresponding enzymes compared to those of central metabolism. Thus, fine-tuning these enzymatic characteristics through protein engineering is paramount for developing high-productivity microbial strains for secondary metabolites. Here, we describe the importance of protein engineering for advancing metabolic engineering of secondary metabolism pathways. This pathway integrated enzyme optimization can enhance the collective toolkit of microbial engineering to shape the future of chemical manufacturing.

  6. Earlinet database: new design and new products for a wider use of aerosol lidar data

    Science.gov (United States)

    Mona, Lucia; D'Amico, Giuseppe; Amato, Francesco; Linné, Holger; Baars, Holger; Wandinger, Ulla; Pappalardo, Gelsomina

    2018-04-01

    The EARLINET database is undergoing a complete reshaping to meet the widespread demand for more intuitive products and the even broader requirements of new initiatives such as Copernicus, the European Earth observation programme. The new design has been carried out in continuity with the past, in order to take advantage of the long-term database. In particular, the new structure will provide information suitable for synergy with other instruments, near-real-time (NRT) applications, validation and process studies, and climate applications.

  7. When paradigms collide at the road rail interface: evaluation of a sociotechnical systems theory design toolkit for cognitive work analysis.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G

    2016-09-01

    The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes that incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings (RLXs). The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken, based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion of the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.

  8. 5S rRNA Promoter for Guide RNA Expression Enabled Highly Efficient CRISPR/Cas9 Genome Editing in Aspergillus niger.

    Science.gov (United States)

    Zheng, Xiaomei; Zheng, Ping; Zhang, Kun; Cairns, Timothy C; Meyer, Vera; Sun, Jibin; Ma, Yanhe

    2018-04-30

    The CRISPR/Cas9 system is a revolutionary genome editing tool. However, in eukaryotes, search and optimization of a suitable promoter for guide RNA expression is a significant technical challenge. Here we used the industrially important fungus, Aspergillus niger, to demonstrate that the 5S rRNA gene, which is both highly conserved and efficiently expressed in eukaryotes, can be used as a guide RNA promoter. The gene editing system was established with 100% rates of precision gene modifications among dozens of transformants using short (40-bp) homologous donor DNA. This system was also applicable for generation of designer chromosomes, as evidenced by deletion of a 48 kb gene cluster required for biosynthesis of the mycotoxin fumonisin B1. Moreover, this system also facilitated simultaneous mutagenesis of multiple genes in A. niger. We anticipate that the use of the 5S rRNA gene as guide RNA promoter can broadly be applied for engineering highly efficient eukaryotic CRISPR/Cas9 toolkits. Additionally, the system reported here will enable development of designer chromosomes in model and industrially important fungi.

  9. CHASM and SNVBox: toolkit for detecting biologically important single nucleotide mutations in cancer.

    Science.gov (United States)

    Wong, Wing Chung; Kim, Dewey; Carter, Hannah; Diekhans, Mark; Ryan, Michael C; Karchin, Rachel

    2011-08-01

    Thousands of cancer exomes are currently being sequenced, yielding millions of non-synonymous single nucleotide variants (SNVs) of possible relevance to disease etiology. Here, we provide a software toolkit to prioritize SNVs based on their predicted contribution to tumorigenesis. It includes a database of precomputed, predictive features covering all positions in the annotated human exome and can be used either stand-alone or as part of a larger variant discovery pipeline. The MySQL database, source code and binaries are freely available for academic/government use at http://wiki.chasmsoftware.org; the source is in Python and C++. Requires a 32- or 64-bit Linux system (tested on Fedora Core 8, 10, 11 and Ubuntu 10), 2.5 ≤ Python < 3.0, 60 GB of available hard disk space (50 MB for software and data files, 40 GB for the MySQL database dump when uncompressed) and 2 GB of RAM.

  10. Using the NLN Faculty Preparation for Global Experiences Toolkit for Successful Application for the Fulbright Scholar Award.

    Science.gov (United States)

    Samawi, Zepure; Capps, Lisa; Hansen, Ruth

    With an increasingly global world and the migration of diverse populations, nurse faculty have opportunities to learn and share varied perspectives through involvement internationally in research, teaching, and practice. The National League for Nursing (NLN) joins with the World Health Organization and the International Council of Nurses to promote international nursing standards. One way in which nursing faculty can contribute to this goal is by pursuing international education, research, and service as a Fulbright scholar. The NLN Faculty Preparation for Global Experiences Toolkit complements resources offered through the Fulbright program in the preparation of a competitive Fulbright application.

  11. Análisis comparativo de distintas toolkits para el reconocimiento biométrico de personas mediante voz

    OpenAIRE

    Ruíz, Silvia; Miranda, Ernesto; Herlein, Mauro; Etchart, Graciela; Alvez, Carlos E.

    2017-01-01

    The objective of this work is to carry out a comparative analysis of different toolkits for the biometric recognition of people by voice. Nowadays, person identification systems have become a necessity for society. As technology advances and is applied in both leisure and security settings, development in biometrics has grown considerably. Traditional identification or verification systems (cards or passwords) have...

  12. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    Science.gov (United States)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for a medical linear accelerator (linac) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make QA procedures more complex and time consuming, often requiring dedicated software together with specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of linac QA and includes Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of the light and radiation field coincidence test.

  13. Dosimetry applications in GATE Monte Carlo toolkit.

    Science.gov (United States)

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  14. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store data from motion capture systems. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats used by many companies producing motion capture systems. The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
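
    A minimal sketch of the Python bindings mentioned above, assuming BTK is installed; the file name and marker label are placeholders.

      # Illustrative sketch only: read a C3D acquisition through BTK's Python bindings.
      # The file name and marker label are placeholders.
      import btk

      reader = btk.btkAcquisitionFileReader()
      reader.SetFilename("gait_trial.c3d")
      reader.Update()
      acq = reader.GetOutput()

      print(acq.GetPointFrequency())     # marker sampling rate (Hz)
      print(acq.GetPointNumber())        # number of labelled points
      marker = acq.GetPoint("LASI")      # one marker trajectory
      print(marker.GetValues().shape)    # (n_frames, 3) coordinates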

  15. Roofline model toolkit: A practical tool for architectural and program analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Yu Jung [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ligocki, Terry J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cordery, Matthew J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wright, Nicholas J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hall, Mary W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-18

    We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented micro-benchmarks implemented with the Message Passing Interface (MPI) and OpenMP, used to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these micro-benchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
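
    The Roofline bound itself fits in a few lines: attainable performance is the lesser of the peak compute rate and the product of arithmetic intensity and sustained memory bandwidth. The machine numbers below are placeholders, not measurements from the toolkit.

      # Illustrative sketch only: the basic Roofline bound with placeholder numbers.
      def roofline(peak_gflops, bandwidth_gbs, intensity_flops_per_byte):
          """Attainable GFLOP/s for a kernel with the given arithmetic intensity."""
          return min(peak_gflops, bandwidth_gbs * intensity_flops_per_byte)

      peak, bw = 200.0, 30.0                 # GFLOP/s and GB/s (hypothetical machine)
      for ai in (0.25, 1.0, 4.0, 16.0):      # FLOPs per byte moved
          print(f"AI={ai:5.2f} -> {roofline(peak, bw, ai):7.1f} GFLOP/s")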

  16. Fin whale sound reception mechanisms: skull vibration enables low-frequency hearing.

    Directory of Open Access Journals (Sweden)

    Ted W Cranford

    Full Text Available Hearing mechanisms in baleen whales (Mysticeti) are essentially unknown but their vocalization frequencies overlap with anthropogenic sound sources. Synthetic audiograms were generated for a fin whale by applying finite element modeling tools to X-ray computed tomography (CT) scans. We CT scanned the head of a small fin whale (Balaenoptera physalus) in a scanner designed for solid-fuel rocket motors. Our computer (finite element) modeling toolkit allowed us to visualize what occurs when sounds interact with the anatomic geometry of the whale's head. Simulations reveal two mechanisms that excite both bony ear complexes: (1) the skull-vibration enabled bone conduction mechanism and (2) a pressure mechanism transmitted through soft tissues. Bone conduction is the predominant mechanism. The mass density of the bony ear complexes and their firmly embedded attachments to the skull are universal across the Mysticeti, suggesting that sound reception mechanisms are similar in all baleen whales. Interactions between incident sound waves and the skull cause deformations that induce motion in each bony ear complex, resulting in best hearing sensitivity for low-frequency sounds. This predominant low-frequency sensitivity has significant implications for assessing mysticete exposure levels to anthropogenic sounds. The din of man-made ocean noise has increased steadily over the past half century. Our results provide valuable data for U.S. regulatory agencies and concerned large-scale industrial users of the ocean environment. This study transforms our understanding of baleen whale hearing and provides a means to predict auditory sensitivity across a broad spectrum of sound frequencies.

  17. The AAG's ALIGNED Toolkit: A Place-based Approach to Fostering Diversity in the Geosciences

    Science.gov (United States)

    Rodrigue, C. M.

    2012-12-01

    Where do we look to attract a more diverse group of students to academic programs in geography and the geosciences? What do we do once we find them? This presentation introduces the ALIGNED Toolkit developed by the Association of American Geographers, with funding from the NSF's Opportunities to Enhance Diversity in the Geosciences (OEDG) Program. ALIGNED (Addressing Locally-tailored Information Infrastructure and Geoscience Needs for Enhancing Diversity) seeks to align the needs of university departments and underrepresented students by drawing upon the intellectual wealth of geography and spatial science to provide better informed, knowledge-based action to enhance diversity in higher education and the geoscience workforce. The project seeks to inform and transform the ways in which departments and programs envision and realize their own goals to enhance diversity, promote inclusion, and broaden participation. We also seek to provide the data, information, knowledge, and best practices needed in order to enhance the recruitment and retention of underrepresented students. The ALIGNED Toolkit is currently in a beta release, available to 13 pilot departments and 50 testing departments of geography/geosciences. It consolidates a variety of data from departments, the U.S. Census Bureau, and the U.S. Department of Education's National Center for Education Statistics to provide interactive, GIS-based visualizations across multiple scales. It also incorporates a place-based, geographic perspective to support departments in their efforts to enhance diversity. A member of ALIGNED's senior personnel, who is also a representative of one of the pilot departments, will provide an overview and preview of the tool while sharing her department's experiences in progressing toward its diversity goals. A brief discussion on how geoscience departments might benefit from the ALIGNED approach and resources will follow. Undergraduate advisors, graduate program directors, department

  18. An investigation into the content validity of the Antimicrobial Self-Assessment Toolkit for NHS Trusts (ASAT v15a) using cognitive interviews with antimicrobial pharmacists.

    Science.gov (United States)

    Bailey, C; Tully, M; Cooke, J

    2015-04-01

    The Antimicrobial Self-Assessment Toolkit for NHS Trusts (ASAT) was developed to evaluate the organizational strategies used to implement hospital-based antimicrobial stewardship programmes. An iterative approach was used to develop ASAT v15a, which has been previously investigated for face validity; however, further investigation into other types of validity was required. Therefore, the aim of this study was to investigate the content validity of ASAT v15a and hence modify and improve the content validity of the toolkit. A purposive sample of eight antimicrobial pharmacists was interviewed using cognitive interviewing techniques from within the former North-west Strategic Health Authority in England. Respondents were asked to 'think aloud' and to verbally express their thought processes as they generated responses to each question with the ASAT. There were no cognitive difficulties reported by respondents in response to 26/83 (31·3%) questions within the ASAT. However, cognitive difficulties were reported by respondents at each stage of the cognitive processing pathway in response to 57/83 (68·7%) questions. These difficulties were comprehension/interpretation in 27/83 (32·5%) questions, information retrieval in 10/83 (12%) questions, judgment/decision in 6/83 (7·2%) questions and response generation/formatting in 13/83 (15·7%) questions. Other findings included disagreement with the weightings applied to 13/83 (15·7%) questions. Respondents recommended that these questions should be modified to reflect their impact on hospital-based antimicrobial stewardship programmes (ASPs). Based on these findings, modifications were made to ASAT v15a to produce the next iteration (ASAT v16). Furthermore, respondents indicated that the role of clinical microbiologists was underrepresented in the current version of the toolkit; therefore, seven proposed questions were drafted, based on a literature review. Cognitive interviews were effectively able to detect problems

  19. The gputools package enables GPU computing in R.

    Science.gov (United States)

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu.

  20. Geodetic, Geologic and Seismic Interdisciplinary Research of Tectonically Caused Movements in the Wider Area of the City of Zagreb

    Science.gov (United States)

    Dapo, A.; Pribicevic, B.; Herak, M.; Prelogovic, E.

    2012-04-01

    Since the last great earthquake in 1880, which shook the Zagreb area with an intensity of IX° MCS, tectonic movements and models of the numerous Zagreb faults have been a focal point for Croatian geologists, seismologists and, over the last 15 years, geodetic scientists, all working within their own disciplines to shed light on the tectonic mechanisms in the wider Zagreb area. Because the area is tectonically very active and contains the capital city of Croatia, with its very high population density, it is of utmost importance to understand these mechanisms and, based on them, to find the best possible measures for protecting people and valuables. The best results will certainly be achieved through an interdisciplinary approach. That is why this paper presents the first interdisciplinary results from geodetic, geological and seismological research and their contribution to the collective knowledge of tectonic movements in the wider area of the City of Zagreb.

  1. Multi-Agent Systems for E-Commerce

    OpenAIRE

    Solodukha, T. V.; Sosnovskiy, O. A.; Zhelezko, B. A.

    2009-01-01

    The article focuses on multi-agent systems (MAS) and domains that can benefit from multi-agent technology. In the last few years, the agent-based modeling (ABM) community has developed several practical agent-based modeling toolkits that enable individuals to develop agent-based applications. A comparison of agent-based modeling toolkits is given. Multi-agent systems are designed to handle changing and dynamic business processes. Any organization with complex and distributed business pro...

  2. A Web-based Multi-user Interactive Visualization System For Large-Scale Computing Using Google Web Toolkit Technology

    Science.gov (United States)

    Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.

    2009-12-01

    We have created a web-based, interactive system for multi-user collaborative visualization of large data sets (on the order of terabytes) that allows users in geographically disparate locations to simultaneously and collectively visualize large data sets over the Internet. By leveraging asynchronous JavaScript and XML (AJAX) web development paradigms via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide remote, web-based users a web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota, which provides high-resolution visualizations on the order of 15 million pixels. In the current version of our software, we have implemented a new, highly extensible back-end framework built around HTTP "server push" technology to provide a rich collaborative environment and a smooth end-user experience. Furthermore, the web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. New features in the current version include the ability for (1) users to launch multiple visualizations, (2) a user to invite one or more other users to view their visualization in real time (multiple observers), (3) users to delegate control aspects of the visualization to others (multiple controllers), and (4) users to engage in collaborative chat and instant messaging with other users within the user interface of the web application. We will explain choices made regarding implementation, overall system architecture and method of operation, and the benefits of an extensible, modular design. We will also discuss future goals, features, and our plans for increasing scalability of the system, which includes a discussion of the benefits potentially afforded us by a migration of server-side components to the Google Application Engine (http://code.google.com/appengine/).

  3. The Climate Resilience Toolkit: Central gateway for risk assessment and resilience planning at all governance scales

    Science.gov (United States)

    Herring, D.; Lipschultz, F.

    2016-12-01

    As people and organizations grapple with a changing climate amid a range of other factors simultaneously shifting, there is a need for credible, legitimate & salient scientific information in useful formats. In addition, an assessment framework is needed to guide the process of planning and implementing projects that allow communities and businesses to adapt to specific changing conditions, while also building overall resilience to future change. We will discuss how the U.S. Climate Resilience Toolkit (CRT) can improve people's ability to understand and manage their climate-related risks and opportunities, and help them make their communities and businesses more resilient. In close coordination with the U.S. Climate Data Initiative, the CRT is continually evolving to offer actionable authoritative information, relevant tools, and subject matter expertise from across the U.S. federal government in one easy-to-use location. The Toolkit's "Climate Explorer" is designed to help people understand potential climate conditions over the course of this century. It offers easy access to downloadable maps, graphs, and data tables of observed and projected temperature, precipitation and other decision-relevant climate variables dating back to 1950 and out to 2100. Since climate is only one of many changing factors affecting decisions about the future, it also ties climate information to a wide range of relevant variables to help users explore vulnerabilities and impacts. New topic areas have been added, such as "Fisheries," "Regions," and "Built Environment" sections that feature case studies and personal experiences in making adaptation decisions. A curated "Reports" section is integrated with semantic web capabilities to help users locate the most relevant information sources. As part of the USGCRP's sustained assessment process, the CRT is aligning with other federal activities, such as the upcoming 4th National Climate Assessment.

  4. Swiss Life Sciences - a science communication project for both schools and the wider public led by the foundation Science et Cité.

    Science.gov (United States)

    Röthlisberger, Michael

    2012-01-01

    The foundation Science et Cité was founded in 1998 with the aim of informing the wider Swiss public about current scientific topics and generating a dialogue between science and society. Initiated as an independent foundation by the former State Secretary for Science and Research, Dr. Charles Kleiber, Science et Cité is now attached to the Swiss Academies of Arts and Sciences as a competence center for dialogue with the public. Due to its branches in all language regions of the country, the foundation is ideally suited to initiate and implement communication projects on a nationwide scale. These projects are subdivided into three categories: i) science communication for children/adolescents, ii) establishing a dialogue between science and the wider public, and iii) fulfilling the role of a national center of competence and networking in science communication. Swiss Life Sciences is a project that fits into all of these categories: a year-round program for schools is complemented by an annual event for the wider public. With the involvement of most of the major Swiss universities, the Swiss National Science Foundation, the foundation Gen Suisse and many other partners, Swiss Life Sciences also sets an example of national networking within the science communication community.

  5. Assessing gendered roles in water decision-making in semi-arid regions through sex-disaggregated water data with UNESCO-WWAP gender toolkit

    Science.gov (United States)

    Miletto, Michela; Greco, Francesca; Belfiore, Elena

    2017-04-01

    Global climate change is expected to exacerbate current and future stresses on water resources from population growth and land use, and to increase the frequency and severity of droughts and floods. Women are more vulnerable to the effects of climate change than men, not only because they constitute the majority of the world's poor but also because they are more dependent for their livelihood on natural resources that are threatened by climate change. In addition, social, economic and political barriers often limit their coping capacity. Women play a key role in the provision, management and safeguarding of water; nonetheless, gender inequality in water management frameworks persists around the globe. Accurate data are essential to inform decisions and support effective policies. Disaggregating water data by sex is crucial to analyse gendered roles in the water realm and to inform gender-sensitive water policies in light of the global commitments to gender equality of Agenda 2030. In view of this scenario, WWAP has created an innovative toolkit for sex-disaggregated water data collection, the result of participatory work by more than 35 experts in the WWAP Working Group on Sex-Disaggregated Indicators (http://www.unesco.org/new/en/natural-sciences/environment/water/wwap/water-and-gender/un-wwap-working-group-on-gender-disaggregated-indicators/#c1430774). The WWAP toolkit contains four tools: the methodology (Seager J., WWAP UNESCO, 2015), a set of key indicators, the guidelines (Pangare V., WWAP UNESCO, 2015) and a questionnaire for field surveys. The WWAP key gender-sensitive indicators address water resources management, aspects of water quality and agricultural use, water resources governance and management, and investigate unaccounted labour according to gender and age. Managing water resources is key for climate adaptation. Women are particularly sensitive to water quality and the health of water-dependent ecosystems, often a source of food and job opportunities

  6. Design and validation of a three-instrument toolkit for the assessment of competence in electrocardiogram rhythm recognition.

    Science.gov (United States)

    Hernández-Padilla, José M; Granero-Molina, José; Márquez-Hernández, Verónica V; Suthers, Fiona; López-Entrambasaguas, Olga M; Fernández-Sola, Cayetano

    2017-06-01

    Rapid and accurate interpretation of cardiac arrhythmias by nurses has been linked with safe practice and positive patient outcomes. Although training in electrocardiogram rhythm recognition is part of most undergraduate nursing programmes, research continues to suggest that nurses and nursing students lack competence in recognising cardiac rhythms. In order to promote patient safety, nursing educators must develop valid and reliable assessment tools that allow the rigorous assessment of this competence before nursing students are allowed to practise without supervision. The aim of this study was to develop and psychometrically evaluate a toolkit to holistically assess competence in electrocardiogram rhythm recognition. Following a convenience sampling technique, 293 nursing students from a nursing faculty in a Spanish university were recruited for the study. The following three instruments were developed and psychometrically tested: an electrocardiogram knowledge assessment tool (ECG-KAT), an electrocardiogram skills assessment tool (ECG-SAT) and an electrocardiogram self-efficacy assessment tool (ECG-SES). Reliability and validity (content, criterion and construct) of these tools were meticulously examined. A high Cronbach's alpha coefficient demonstrated the excellent reliability of the instruments (ECG-KAT=0.89; ECG-SAT=0.93; ECG-SES=0.98). An excellent content validity index (scales' average content validity index > 0.94) and very good criterion validity were evidenced for all the tools. Regarding construct validity, principal component analysis revealed that all items comprising the instruments contributed to measuring knowledge, skills or self-efficacy in electrocardiogram rhythm recognition. Moreover, known-groups analysis showed the tools' ability to detect expected differences in competence between groups with different training experiences. The three-instrument toolkit developed showed excellent psychometric properties for measuring competence in electrocardiogram rhythm recognition.
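
    The reliability statistic reported above (Cronbach's alpha) can be illustrated with a short worked example. The sketch below is not the authors' code; the function, the four-item quiz and the five hypothetical respondents are invented purely to show how alpha is computed from a respondents-by-items score matrix, assuming NumPy is available.

        import numpy as np

        def cronbach_alpha(item_scores):
            """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
            scores = np.asarray(item_scores, dtype=float)
            n_items = scores.shape[1]
            item_variances = scores.var(axis=0, ddof=1)       # variance of each item
            total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
            return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

        # Hypothetical responses: 5 students answering a 4-item ECG knowledge quiz (0 = wrong, 1 = correct)
        scores = [[1, 1, 1, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [1, 1, 1, 1],
                  [1, 0, 1, 0]]
        print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")   # roughly 0.73 for these made-up data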

  7. GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature

    Directory of Open Access Journals (Sweden)

    Ning Ye

    2015-01-01

    Full Text Available The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges for exploiting its biological meaning. When searching for co-expressed genes, the data mining process is largely affected by the choice of algorithm. Thus, it is highly desirable to provide multiple algorithm options in a user-friendly analytical toolkit for exploring gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify co-expressed genes, and enables users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. Subsequently, the utility of this analytical toolkit is demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows users to choose among multiple algorithms for analyzing gene expression signatures.
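
    A minimal sketch of the simplest kind of co-expression search such a toolkit supports is shown below: ranking genes by the correlation of their expression profiles with a query gene. This is a generic illustration in Python rather than GESearch's MATLAB code, and the gene names, synthetic time course and function are assumptions made only for demonstration.

        import numpy as np

        def coexpressed_genes(expr, gene_names, query, top_n=5):
            """Rank genes by Pearson correlation of their profile with a query gene.

            expr: (n_genes x n_samples) expression matrix; gene_names: row labels.
            """
            idx = gene_names.index(query)
            corr = np.array([np.corrcoef(expr[idx], row)[0, 1] for row in expr])
            order = np.argsort(-corr)
            return [(gene_names[i], corr[i]) for i in order if i != idx][:top_n]

        # Hypothetical cell-cycle time course: 4 genes measured at 6 time points
        rng = np.random.default_rng(0)
        t = np.linspace(0, 2 * np.pi, 6)
        expr = np.vstack([
            np.sin(t),                              # query gene
            np.sin(t) + 0.1 * rng.normal(size=6),   # strongly co-expressed
            np.cos(t),                              # phase-shifted
            rng.normal(size=6),                     # unrelated
        ])
        print(coexpressed_genes(expr, ["geneA", "geneB", "geneC", "geneD"], query="geneA"))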

  8. HYDROGEOLOGICAL AND HYDROGEOCHEMICAL CHARACTERISTICS OF A WIDER AREA OF THE REGIONAL WELL FIELD EASTERN SLAVONIA – SIKIREVCI

    Directory of Open Access Journals (Sweden)

    Jasna Kopić

    2016-10-01

    Full Text Available This paper establishes the hydrogeological and hydrogeochemical characteristics of a wider area of the regional well field Eastern Slavonia - Sikirevci. The research was conducted based on data gathered from the area of the Federation of Bosnia and Herzegovina and the Republic of Croatia. The aquifer Velika Kopanica is situated in the territory of the Republic of Croatia in the triangular region formed between Kopanica, Gundinci and Kruševica. The River Sava partially flows through it, and the aquifer extends beneath the river into the territory of the Federation of Bosnia and Herzegovina from Donji Svilaj in the west to Domaljevac in the east, where its yield is the highest. The thickness of the aquifer decreases towards the water body Odžak. It was determined that the groundwater extracted from wells in the wider area of the regional well field contains iron, manganese, natural ammonia and arsenic in concentrations exceeding the maximum allowable concentrations for drinking water. The increased values of these parameters are a result of the mineral composition and reducing conditions in the aquifer environment. By means of multivariate statistical cluster analysis, an overview of element groups is provided based on geochemical affinity and/or origin.
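
    The cluster analysis mentioned above can be sketched in a few lines. The example below is hypothetical: the parameter values are invented, SciPy is assumed available, and grouping hydrochemical parameters by correlation distance is only one common way of performing such an analysis, not necessarily the authors' procedure.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        # Hypothetical groundwater analyses: rows are samples, columns are parameters (mg/L)
        params = ["Fe", "Mn", "NH4", "As", "Ca", "Mg"]
        samples = np.array([
            [1.2, 0.35, 0.80, 0.012,  95, 28],
            [1.5, 0.42, 1.10, 0.015,  90, 30],
            [0.3, 0.10, 0.20, 0.004, 120, 40],
            [0.2, 0.08, 0.10, 0.003, 115, 38],
        ])

        # Group parameters by 1 - Pearson correlation, a common distance measure
        # for clustering variables that share a geochemical affinity or origin
        corr = np.corrcoef(samples, rowvar=False)
        condensed = squareform(1.0 - corr, checks=False)
        groups = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
        for name, label in zip(params, groups):
            print(f"{name}: group {label}")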

  9. Understanding enabling capacities for managing the 'wicked problem' of nonpoint source water pollution in catchments: a conceptual framework.

    Science.gov (United States)

    Patterson, James J; Smith, Carl; Bellamy, Jennifer

    2013-10-15

    Nonpoint source (NPS) water pollution in catchments is a 'wicked' problem that threatens water quality, water security, ecosystem health and biodiversity, and thus the provision of ecosystem services that support human livelihoods and wellbeing from local to global scales. However, it is a difficult problem to manage because water catchments are linked human and natural systems that are complex, dynamic, multi-actor, and multi-scalar in nature. This in turn raises questions about understanding and influencing change across multiple levels of planning, decision-making and action. A key challenge in practice is enabling implementation of local management action, which can be influenced by a range of factors across multiple levels. This paper reviews and synthesises important 'enabling' capacities that can influence implementation of local management action, and develops a conceptual framework for understanding and analysing these in practice. Important enabling capacities identified include: history and contingency; institutional arrangements; collaboration; engagement; vision and strategy; knowledge building and brokerage; resourcing; entrepreneurship and leadership; and reflection and adaptation. Furthermore, local action is embedded within multi-scalar contexts and therefore, is highly contextual. The findings highlight the need for: (1) a systemic and integrative perspective for understanding and influencing change for managing the wicked problem of NPS water pollution; and (2) 'enabling' social and institutional arenas that support emergent and adaptive management structures, processes and innovations for addressing NPS water pollution in practice. These findings also have wider relevance to other 'wicked' natural resource management issues facing similar implementation challenges. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. HBIM and augmented information: towards a wider user community of image and range-based reconstructions

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.; Brumana, R.; Oreni, D.; Previtali, M.; Roncoroni, F.

    2015-08-01

    This paper describes a procedure for the generation of a detailed HBIM which is then turned into a model for mobile apps based on augmented and virtual reality. Starting from laser point clouds, photogrammetric data and additional information, a geometric reconstruction with a high level of detail can be carried out by considering the basic requirements of BIM projects (parametric modelling, object relations, attributes). The work aims at demonstrating that a complex HBIM can be managed in portable devices to extract useful information not only for expert operators, but also towards a wider user community interested in cultural tourism.

  11. HBIM and augmented information: towards a wider user community of image and range-based reconstructions

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2015-08-01

    Full Text Available This paper describes a procedure for the generation of a detailed HBIM which is then turned into a model for mobile apps based on augmented and virtual reality. Starting from laser point clouds, photogrammetric data and additional information, a geometric reconstruction with a high level of detail can be carried out by considering the basic requirements of BIM projects (parametric modelling, object relations, attributes). The work aims at demonstrating that a complex HBIM can be managed in portable devices to extract useful information not only for expert operators, but also towards a wider user community interested in cultural tourism.

  12. Using the Model Coupling Toolkit to couple earth system models

    Science.gov (United States)

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

    Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. Therefore there is a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system would entail model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library, ROMS, SWAN and COAMPS models, methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. Methods presented in this paper are clearly applicable for coupling of other types of models. © 2008 Elsevier Ltd. All rights reserved.
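
    The coupling pattern described above, two components advancing independently and exchanging fields at predetermined synchronization intervals, can be sketched conceptually as follows. This is not the MCT Fortran API or the ROMS/SWAN/COAMPS code: the component names, state fields and exchange function are hypothetical stand-ins used only to illustrate the control flow.

        # Conceptual sketch of the coupling pattern (not the MCT API): two components
        # step independently and exchange fields every `couple_interval` steps.

        def run_coupled(ocean_step, wave_step, exchange, n_steps, couple_interval):
            """Advance two model components and swap coupling fields at fixed intervals."""
            ocean_state = {"sst": 15.0, "currents": 0.1}
            wave_state = {"hs": 1.0, "stress": 0.02}
            for step in range(1, n_steps + 1):
                ocean_state = ocean_step(ocean_state)
                wave_state = wave_step(wave_state)
                if step % couple_interval == 0:
                    # In MCT this would be a distributed-memory transfer with regridding;
                    # here it is a plain in-memory swap of the coupling fields.
                    ocean_state, wave_state = exchange(ocean_state, wave_state)
            return ocean_state, wave_state

        # Trivial stand-ins for the real models
        ocean = lambda s: {**s, "sst": s["sst"] + 0.01}
        waves = lambda s: {**s, "hs": s["hs"] * 1.001}
        swap = lambda o, w: ({**o, "stress": w["stress"]}, {**w, "currents": o["currents"]})

        print(run_coupled(ocean, waves, swap, n_steps=100, couple_interval=10))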

  13. Rapid parameterization of small molecules using the Force Field Toolkit.

    Science.gov (United States)

    Mayne, Christopher G; Saam, Jan; Schulten, Klaus; Tajkhorshid, Emad; Gumbart, James C

    2013-12-15

    The inability to rapidly generate accurate and robust parameters for novel chemical matter continues to severely limit the application of molecular dynamics simulations to many biological systems of interest, especially in fields such as drug discovery. Although the release of generalized versions of common classical force fields, for example, General Amber Force Field and CHARMM General Force Field, has posited guidelines for parameterization of small molecules, many technical challenges remain that have hampered their wide-scale extension. The Force Field Toolkit (ffTK), described herein, minimizes common barriers to ligand parameterization through algorithm and method development, automation of tedious and error-prone tasks, and graphical user interface design. Distributed as a VMD plugin, ffTK facilitates the traversal of a clear and organized workflow resulting in a complete set of CHARMM-compatible parameters. A variety of tools are provided to generate quantum mechanical target data, set up multidimensional optimization routines, and analyze parameter performance. Parameters developed for a small test set of molecules using ffTK were comparable to existing CGenFF parameters in their ability to reproduce experimentally measured values for pure-solvent properties (<15% error from experiment) and free energy of solvation (±0.5 kcal/mol from experiment). Copyright © 2013 Wiley Periodicals, Inc.
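
    A central step that such a workflow automates is fitting force-field terms to quantum-mechanical target data. The toy example below is not ffTK (which is distributed as a VMD plugin); it only illustrates the idea of a least-squares fit of a CHARMM-style dihedral term K*(1 + cos(n*phi - delta)) to hypothetical target energies, with NumPy and SciPy assumed available and all values invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def dihedral_energy(phi, K, delta):
            n = 3  # fixed periodicity for this toy example
            return K * (1 + np.cos(n * phi - delta))

        # Hypothetical quantum-mechanical target energies along a dihedral scan
        phi = np.linspace(-np.pi, np.pi, 37)
        qm_target = 0.8 * (1 + np.cos(3 * phi - 0.2)) \
                    + 0.05 * np.random.default_rng(2).normal(size=phi.size)

        (K_fit, delta_fit), _ = curve_fit(dihedral_energy, phi, qm_target, p0=[1.0, 0.0])
        print(f"fitted K = {K_fit:.2f} (energy units), delta = {delta_fit:.2f} rad")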

  14. Interoperability in the OpenDreamKit Project: The Math-in-the-Middle Approach

    OpenAIRE

    Dehaye, Paul-Olivier; Kohlhase, Michael; Konovalov, Alexander; Lelièvre, Samuel; Pfeiffer, Markus; Thiéry, Nicolas M.

    2016-01-01

    OpenDreamKit - "Open Digital Research Environment Toolkit for the Advancement of Mathematics" - is an H2020 EU Research Infrastructure project that aims at supporting, over the period 2015-2019, the ecosystem of open-source mathematical software systems. OpenDreamKit will deliver a flexible toolkit enabling research groups to set up Virtual Research Environments, customised to meet the varied needs of research projects in pure mathematics and applications. An important step in the OpenDreamKi...

  15. Integrated bio-photonics to revolutionize health care enabled through PIX4life and PIXAPP

    Science.gov (United States)

    Jans, Hilde; O'Brien, Peter; Artundo, Iñigo; Porcel, Marco A. G.; Hoofman, Romano; Geuzebroek, Douwe; Dumon, Pieter; van der Vliet, Marcel; Witzens, Jeremy; Bourguignon, Eric; Van Dorpe, Pol; Lagae, Liesbet

    2018-02-01

    Photonics has become critical to life sciences. However, the field is far from benefiting fully from photonics' capabilities. Today, bulky and expensive optical systems dominate biomedical photonics, even though robust optical functionality can be realized cost-effectively on single photonic integrated circuits (PICs). Such chips are commercially available mostly for telecom applications and at infrared wavelengths. Although proof-of-concept demonstrations of PICs in life sciences using visible wavelengths are abundant, the gating factor for wider adoption is limited resource capacity. Two European pilot lines, PIX4life and PIXAPP, were established to facilitate European R&D in biophotonics by helping European companies and universities bridge the gap between research and industrial development. Through the creation of an open-access model, PIX4life aims to lower barriers to entry for prototyping and validating biophotonics concepts for larger-scale production. In addition, PIXAPP enables the assembly and packaging of photonic integrated circuits.

  16. PODIO: An Event-Data-Model Toolkit for High Energy Physics Experiments

    Science.gov (United States)

    Gaede, F.; Hegner, B.; Mato, P.

    2017-10-01

    PODIO is a C++ library that supports the automatic creation of event data models (EDMs) and efficient I/O code for HEP experiments. It is developed as a new EDM toolkit for future particle physics experiments in the context of the AIDA2020 EU programme. Experience from the LHC and the linear collider community shows that existing solutions partly suffer from overly complex data models with deep object hierarchies or from unfavorable I/O performance. The PODIO project was created in order to address these problems. PODIO is based on the idea of employing plain-old-data (POD) data structures wherever possible, while avoiding deep object hierarchies and virtual inheritance. At the same time it provides the necessary high-level interface for the developer physicist, such as support for inter-object relations and automatic memory management, as well as a Python interface. To simplify the creation of efficient data models, PODIO employs code generation from a simple YAML-based markup language. In addition, it was developed with concurrency in mind in order to support the use of modern CPU features, for example by giving basic support for vectorization techniques.
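
    The core idea, flat plain-old-data collections with relations expressed through indices rather than deep pointer hierarchies, can be illustrated independently of PODIO itself. The sketch below is not the PODIO API; the cluster and hit collections, field names and index-range convention are hypothetical, chosen only to show what a POD-style event data layout looks like.

        import numpy as np

        # Sketch of the "plain-old-data" idea (not the PODIO API): event data live in
        # flat, contiguous arrays, and relations between objects are stored as index
        # ranges rather than pointers in a deep object hierarchy.

        # A "cluster" collection: flat arrays of properties
        cluster_energy = np.array([10.2, 33.1, 5.7])     # GeV
        cluster_hit_begin = np.array([0, 2, 5])          # index of first associated hit
        cluster_hit_end = np.array([2, 5, 6])            # one past the last associated hit

        # A "hit" collection referenced by the clusters
        hit_energy = np.array([4.0, 6.2, 11.0, 12.1, 10.0, 5.7])

        def hits_of_cluster(i):
            """Resolve the one-to-many relation cluster -> hits via index ranges."""
            return hit_energy[cluster_hit_begin[i]:cluster_hit_end[i]]

        for i, e in enumerate(cluster_energy):
            print(f"cluster {i}: E = {e:5.1f} GeV, hit energies = {hits_of_cluster(i)}")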

  17. Farm batch system and Fermi inter-process communication and synchronization toolkit

    International Nuclear Information System (INIS)

    Mandrichenko, I.V.

    2001-01-01

    Farms Batch System (FBS) was developed as a batch process management system for off-line Run II data processing at Fermilab. FBS will manage PC farms composed of up to 250 nodes, scalable to 1000 nodes, with disk capacity of up to several TB. FBS allows users to start arrays of parallel processes on multiple computers. It uses a simplified resource-counting method for load balancing. FBS has been successfully used for more than a year at Fermilab by fixed target experiments and will be used for collider experiment off-line data processing. The Fermi Inter-Process Communication toolkit (FIPC) was designed as a supplementary product for FBS that helps establish synchronization and communication between processes running in a distributed batch environment. However, FIPC is an independent package and can be used with other batch systems, as well as in a non-batch environment. FIPC provides users with a variety of global distributed objects such as semaphores, queues and string variables. Other types of objects can be easily added to FIPC. FIPC has been running on several PC farms at Fermilab for half a year and is going to be used by CDF for off-line data processing.
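
    The kind of coordination FIPC's distributed objects provide, shared queues and semaphores between batch processes, can be sketched with ordinary single-host primitives. The example below is not FIPC's API and does not run across machines; the worker function, the "tape_drive" semaphore and the work items are hypothetical, using Python's multiprocessing module only to illustrate the queue-plus-semaphore pattern.

        import multiprocessing as mp

        def worker(work_queue, tape_drive, results):
            while True:
                item = work_queue.get()
                if item is None:            # sentinel: no more work
                    break
                with tape_drive:            # at most 2 workers stage data concurrently
                    results.put(f"processed run segment {item}")

        if __name__ == "__main__":
            work, results = mp.Queue(), mp.Queue()
            tape_drive = mp.Semaphore(2)
            for segment in range(8):
                work.put(segment)
            procs = [mp.Process(target=worker, args=(work, tape_drive, results)) for _ in range(4)]
            for p in procs:
                p.start()
            for _ in procs:
                work.put(None)              # one sentinel per worker
            for p in procs:
                p.join()
            while not results.empty():
                print(results.get())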

  18. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the capabilities of automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
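
    A minimal sketch of the kind of membrane-signal segmentation such a toolkit automates is shown below. It is not EpiTools' own pipeline; the synthetic image, the thresholded seeding and the watershed step are generic choices, with NumPy, SciPy and scikit-image assumed available.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.segmentation import watershed

        # Synthetic "membrane" image: bright ridges on a grid, mimicking cell outlines
        img = np.zeros((90, 90))
        img[::30, :] = 1.0
        img[:, ::30] = 1.0
        img += 0.1 * np.random.default_rng(0).random(img.shape)

        # Seeds = cell interiors (pixels far below the membrane intensity),
        # then a watershed on the membrane signal grows them into full cells
        interiors = img < threshold_otsu(img)
        seeds, n_cells = ndi.label(interiors)
        cells = watershed(img, markers=seeds)
        print(f"segmented {n_cells} cell regions; label image shape {cells.shape}")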

  19. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Availability: open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy. Contact: eduardo.eyras@upf.edu. Supplementary data are available at Bioinformatics online.
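
    To make the notion of selecting for significant signals concrete, the sketch below flags read-coverage windows that exceed a Poisson background model. This is an illustrative toy, not Pyicos's algorithm; the per-window counts, the background estimate and the significance cut-off are all invented, with NumPy and SciPy assumed available.

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(1)
        coverage = rng.poisson(lam=5, size=1000)        # hypothetical per-window read counts
        coverage[200:205] += 40                         # a spiked-in "peak"

        background = coverage.mean()
        pvals = poisson.sf(coverage - 1, mu=background) # P(X >= observed) under the background
        significant = np.where(pvals < 1e-5)[0]
        print("enriched windows:", significant)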

  20. SimPackJ/S: a web-oriented toolkit for discrete event simulation

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2002-07-01

    SimPackJ/S is the JavaScript and Java version of SimPack: a collection of JavaScript and Java libraries and executable programs for computer simulation. The main purpose of creating SimPackJ/S is to allow existing SimPack users to expand into new simulation areas and to provide future users with a freeware simulation toolkit for simulating and modeling systems in web environments. One goal of this paper is to introduce SimPackJ/S; the other is to propose translation rules for converting C to JavaScript and Java. Most sections demonstrate the translation rules with examples. In addition, we discuss a 3D dynamic system model and give an overview of an approach to 3D dynamic systems using SimPackJ/S. We explain an interface between SimPackJ/S and the 3D language Virtual Reality Modeling Language (VRML). This paper documents how to translate C to JavaScript and Java and how to utilize SimPackJ/S within a 3D web environment.
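
    The core technique such a toolkit packages, discrete event simulation, reduces to a loop driven by a time-ordered future event list. The sketch below is written in Python rather than SimPackJ/S's JavaScript/Java, and its single-server queue model, event names and parameters are assumptions made only to show the event-list pattern.

        import heapq
        import random

        def single_server_queue(arrival_rate=1.0, service_rate=1.5, n_customers=10, seed=0):
            """Minimal discrete-event simulation of an M/M/1 queue."""
            random.seed(seed)
            future_events = []                  # the future event list, ordered by time
            heapq.heappush(future_events, (random.expovariate(arrival_rate), "arrival"))
            clock, queue_len, served = 0.0, 0, 0
            while future_events and served < n_customers:
                clock, event = heapq.heappop(future_events)
                if event == "arrival":
                    queue_len += 1
                    if queue_len == 1:          # server was idle: schedule this customer's departure
                        heapq.heappush(future_events, (clock + random.expovariate(service_rate), "departure"))
                    heapq.heappush(future_events, (clock + random.expovariate(arrival_rate), "arrival"))
                else:                           # departure
                    queue_len -= 1
                    served += 1
                    if queue_len > 0:           # start serving the next waiting customer
                        heapq.heappush(future_events, (clock + random.expovariate(service_rate), "departure"))
                print(f"t={clock:6.2f}  {event:9s}  queue={queue_len}")

        single_server_queue()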