WorldWideScience

Sample records for assistant toolkit edat

  1. EURISOL Desktop Assistant Toolkit (EDAT): A modeling, simulation and visualization support to the preliminary radiological assessment of RIB projects

    Science.gov (United States)

    Vamanu, D.; Vamanu, B.; Acasandrei, V.; Maceika, E.; Plukis, A.

    2010-04-01

    The paper describes an approach taken within the EURISOL-DS project (European Isotope Separation Online Radioactive Ion Beam Facility) to a number of safety and radioprotection issues raised by the advent of radioactive ion beam facilities in the cutting-edge area of particle accelerators. The ensuing solution emerged from a collaborative effort of the investigating team-in-charge, affiliated with the Horia Hulubei National Institute of Physics and Nuclear Engineering in Bucharest, with expert colleagues at the Physics Institute in Vilnius and at CERN, within the EURISOL-DS project's Task 5 (Safety and Radioprotection), Sub-Task B: Radiation, Activation, Shielding and Doses. The work was primarily geared towards the identification of knowledge and data in line with validated, accepted and nationally/internationally recommended methods and models of radiological assessment applied within the nuclear power fuel cycle, deemed suitable for assessing the health and environmental impact of accelerator operations as well. As a result, a computer software platform code-named “EURISOL Desktop Assistant Toolkit” was developed. The software is, inter alia, capable of assessing radiation doses from pure or isotopically mixed, open or shielded point sources; emergency response-relevant doses; critical group doses via complex pathways, including the air, the water and the food chain; and derived release limits for the normal, routine operations of nuclear facilities. Dedicated data libraries and GIS (Geographic Information System) facilities assist the input/output operations.

  2. 'EURISOL Desktop Assistant Toolkit' (EDAT): a modeling, simulation and visualization support to the preliminary radiological assessment of RIB projects

    CERN Document Server

    Vamanu, D; Acasandrei, V; Plukis, A; Maceika, E

    The paper describes an approach taken within the EURISOL-DS project (European Isotope Separation On-Line Radioactive Ion Beam Facility) to a number of safety and radioprotection issues raised by the advent of radioactive ion beam facilities in the cutting-edge area of particle accelerators. The ensuing solution emerged from a collaborative effort of the investigating team-in-charge, affiliated with the ‘Horia Hulubei’ National Institute of Physics and Nuclear Engineering in Bucharest, with expert colleagues at the Physics Institute in Vilnius, and at CERN.

  3. Using an Assistive Technology Toolkit to Promote Inclusion

    Science.gov (United States)

    Judge, Sharon; Floyd, Kim; Jeffs, Tara

    2008-01-01

    Although the use of assistive technology for young children is increasing, the lack of awareness and the lack of training continue to act as major barriers to providers using assistive technology. This article describes an assistive technology toolkit designed for use with young children with disabilities that can be easily assembled and…

  4. Measuring acceptance of an assistive social robot: a suggested toolkit

    NARCIS (Netherlands)

    Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B.

    2009-01-01

    The human robot interaction community is multidisciplinary by nature and has members from social science to engineering backgrounds. In this paper we aim to provide human robot developers with a straightforward toolkit to evaluate users' acceptance of assistive social robots they are designing or

  5. L'espai privat a l'edat mitjana

    OpenAIRE

    Riquer, Isabel de

    1990-01-01

    Essay by the philologist Isabel de Riquer on the spaces and the private habitat that surrounded everyday life in the Middle Ages, based on the data offered by the iconography of the twelfth and thirteenth centuries, "because archaeological research on the civil buildings of the early Middle Ages has not allowed us to find out very much."

  6. MITK-OpenIGTLink for combining open-source toolkits in real-time computer-assisted interventions.

    Science.gov (United States)

    Klemm, Martin; Kirchner, Thomas; Gröhl, Janek; Cheray, Dominique; Nolden, Marco; Seitel, Alexander; Hoppe, Harald; Maier-Hein, Lena; Franz, Alfred M

    2017-03-01

    Due to rapid developments in the research areas of medical imaging, medical image processing and robotics, computer-assisted interventions (CAI) are becoming an integral part of modern patient care. From a software engineering point of view, these systems are highly complex and research can benefit greatly from reusing software components. This is supported by a number of open-source toolkits for medical imaging and CAI such as the medical imaging interaction toolkit (MITK), the public software library for ultrasound imaging research (PLUS) and 3D Slicer. An independent inter-toolkit communication such as the open image-guided therapy link (OpenIGTLink) can be used to combine the advantages of these toolkits and enable an easier realization of a clinical CAI workflow. MITK-OpenIGTLink is presented as a network interface within MITK that allows easy to use, asynchronous two-way messaging between MITK and clinical devices or other toolkits. Performance and interoperability tests with MITK-OpenIGTLink were carried out considering the whole CAI workflow from data acquisition over processing to visualization. We present how MITK-OpenIGTLink can be applied in different usage scenarios. In performance tests, tracking data were transmitted with a frame rate of up to 1000 Hz and a latency of 2.81 ms. Transmission of images with typical ultrasound (US) and greyscale high-definition (HD) resolutions of [Formula: see text] and [Formula: see text] is possible at up to 512 and 128 Hz, respectively. With the integration of OpenIGTLink into MITK, this protocol is now supported by all established open-source toolkits in the field. This eases interoperability between MITK and toolkits such as PLUS or 3D Slicer and facilitates cross-toolkit research collaborations. MITK and its submodule MITK-OpenIGTLink are provided open source under a BSD-style licence ( http://mitk.org ).
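    As a rough illustration of the asynchronous two-way messaging pattern described above, the sketch below pairs a pretend tracking device with a client over a socket. It uses plain Python sockets and threads with a hypothetical length-prefixed JSON framing rather than the actual OpenIGTLink wire format or the MITK C++ API; only the port number follows the OpenIGTLink convention.

```python
# Sketch only: asynchronous two-way messaging between an imaging application and a
# tracking device, in the spirit of MITK-OpenIGTLink. The framing (4-byte length
# prefix + UTF-8 JSON payload) is hypothetical, NOT the real OpenIGTLink format.
import json
import socket
import struct
import threading
import time

def send_message(sock, payload):
    """Serialize a dict and send it with a 4-byte length prefix (hypothetical framing)."""
    body = json.dumps(payload).encode("utf-8")
    sock.sendall(struct.pack("!I", len(body)) + body)

def recv_message(sock):
    """Read one length-prefixed message from the socket."""
    (length,) = struct.unpack("!I", sock.recv(4, socket.MSG_WAITALL))
    return json.loads(sock.recv(length, socket.MSG_WAITALL).decode("utf-8"))

def tracking_device(port):
    """Pretend tracking device: streams pose updates, then waits for a command."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    for i in range(3):
        send_message(conn, {"type": "TRANSFORM", "frame": i,
                            "pose": [1.0, 0.0, 0.0, float(i)]})
    print("device received:", recv_message(conn))
    conn.close()
    srv.close()

if __name__ == "__main__":
    PORT = 18944  # port conventionally associated with OpenIGTLink (assumption)
    device = threading.Thread(target=tracking_device, args=(PORT,))
    device.start()
    time.sleep(0.5)                              # give the device time to start listening
    cli = socket.create_connection(("127.0.0.1", PORT))
    for _ in range(3):                           # consume streamed tracking data
        print("client received:", recv_message(cli))
    send_message(cli, {"type": "COMMAND", "name": "StopTracking"})   # two-way traffic
    cli.close()
    device.join()
```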

  7. iPhos: a toolkit to streamline the alkaline phosphatase-assisted comprehensive LC-MS phosphoproteome investigation.

    Science.gov (United States)

    Yang, Tzu-Hsien; Chang, Hong-Tsun; Hsiao, Eric Sl; Sun, Juo-Ling; Wang, Chung-Ching; Wu, Hsin-Yi; Liao, Pao-Chi; Wu, Wei-Sheng

    2014-01-01

    Comprehensive characterization of the phosphoproteome in living cells is critical in signal transduction research. But the low abundance of phosphopeptides among the total proteome in cells remains an obstacle in mass spectrometry-based proteomic analysis. To address this, an alternative analytic strategy was proposed to confidently identify phosphorylated peptides by combining alkaline phosphatase (AP) treatment with high-resolution mass spectrometry. While the process is applicable, the key integration steps along the pipeline were mostly done by tedious manual work. We developed a software toolkit, iPhos, to facilitate and streamline the work-flow of AP-assisted phosphoproteome characterization. The iPhos toolkit includes one assister and three modules. The iPhos Peak Extraction Assister automates batch-mode peak extraction for multiple liquid chromatography mass spectrometry (LC-MS) runs. iPhos Module-1 processes the peak lists extracted from the LC-MS analyses of the original and dephosphorylated samples to mine out potential phosphorylated peptide signals based on the mass shift caused by the loss of some multiple of phosphate groups. iPhos Module-2 provides customized inclusion lists with peak retention time windows for subsequent targeted LC-MS/MS experiments. Finally, iPhos Module-3 facilitates linking the peptide identifications from protein search engines to the quantification results from pattern-based label-free quantification tools. We further demonstrated the utility of the iPhos toolkit on data from human metastatic lung cancer cells (CL1-5). In the comparison of the control group of CL1-5 cell lysates and the treatment group of dasatinib-treated CL1-5 cell lysates, we demonstrated the applicability of the iPhos toolkit and reported the experimental results based on the iPhos-facilitated phosphoproteome investigation. Further, we also compared the strategy with pure DDA-based LC-MS/MS phosphoproteome investigation
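    The mass-shift matching step that Module-1 performs can be illustrated with a short, hedged sketch: peaks from the original run are paired with peaks from the dephosphorylated run whenever their mass difference matches an integer multiple of the phosphate (HPO3) mass. The peak lists, tolerance and site limit below are made up for illustration and do not reflect iPhos's actual input format.

```python
# Sketch of the mass-shift matching idea: a peptide that loses n phosphate groups
# upon alkaline phosphatase treatment shifts its neutral mass by n * 79.96633 Da.
# Peak lists, tolerance and max_sites here are hypothetical illustration values.
PHOSPHO_MASS = 79.96633  # monoisotopic mass of HPO3 in Da

def match_phospho_candidates(original, dephosphorylated, max_sites=3, tol=0.01):
    """Return (original_mass, dephospho_mass, n_sites) candidate triples."""
    candidates = []
    for mo in original:
        for md in dephosphorylated:
            for n in range(1, max_sites + 1):
                if abs(mo - md - n * PHOSPHO_MASS) <= tol:
                    candidates.append((mo, md, n))
    return candidates

# Toy neutral masses (Da) from the two runs: shifts of one and two phosphates.
original_run = [1125.478, 2310.901]
dephospho_run = [1045.512, 2150.968]
print(match_phospho_candidates(original_run, dephospho_run))
# -> [(1125.478, 1045.512, 1), (2310.901, 2150.968, 2)]
```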

  8. Constructing an Assistive Technology Toolkit for Young Children: Views from the Field

    Science.gov (United States)

    Judge, Sharon

    2006-01-01

    Assistive technology is guaranteed by law to be included when appropriate on individualized education plans (IEP) for young children with disabilities. Yet, the full potential of technology remains unfulfilled due to insufficient knowledge of options available, limited professional development, and a dearth of evidence on its effectiveness for…

  9. Antenna toolkit

    CERN Document Server

    Carr, Joseph

    2006-01-01

    Joe Carr has provided radio amateurs and short-wave listeners with the definitive design guide for sending and receiving radio signals with Antenna Toolkit 2nd edition.Together with the powerful suite of CD software, the reader will have a complete solution for constructing or using an antenna - bar the actual hardware! The software provides a simple Windows-based aid to carrying out the design calculations at the heart of successful antenna design. All the user needs to do is select the antenna type and set the frequency - a much more fun and less error prone method than using a con
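    The kind of design calculation the book's software automates can be sketched with the familiar half-wave dipole formula; the 0.95 velocity factor below is a common rule of thumb for thin wire, assumed here rather than taken from the CD software.

```python
# Sketch of a basic antenna design calculation: element length of a half-wave dipole.
# The 0.95 velocity factor is a common rule-of-thumb assumption, not a value from the book.
def half_wave_dipole_length_m(freq_mhz, velocity_factor=0.95):
    """Approximate end-to-end length of a half-wave wire dipole in metres."""
    wavelength_m = 299.792458 / freq_mhz      # free-space wavelength for f in MHz
    return wavelength_m * 0.5 * velocity_factor

for f in (7.1, 14.2, 28.5):                   # a few amateur-band frequencies in MHz
    total = half_wave_dipole_length_m(f)
    print(f"{f} MHz -> {total:.2f} m total ({total / 2:.2f} m per leg)")
```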

  10. A guide and toolkit: green infrastructure

    OpenAIRE

    Amion Consulting

    2008-01-01

    Toolkit to support the development and implementation of projects aimed at enhancing or protecting green infrastructure. Provides a rationale for investment in green infrastructure and an evaluation framework for assisting learning and decision making.

  11. TOOLKIT, Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

    The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit -- the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvement to the Toolkit, this User's Guide may not correspond exactly to what is available in the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  12. Perl Template Toolkit

    CERN Document Server

    Chamberlain, Darren; Cross, David; Torkington, Nathan; Diaz, tatiana Apandi

    2004-01-01

    Among the many different approaches to "templating" with Perl--such as Embperl, Mason, HTML::Template, and hundreds of other lesser known systems--the Template Toolkit is widely recognized as one of the most versatile. Like other templating systems, the Template Toolkit allows programmers to embed Perl code and custom macros into HTML documents in order to create customized documents on the fly. But unlike the others, the Template Toolkit is as facile at producing HTML as it is at producing XML, PDF, or any other output format. And because it has its own simple templating language, templates

  13. Integrated Systems Health Management (ISHM) Toolkit

    Science.gov (United States)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  14. Energy Conservation Behaviour Toolkit

    NARCIS (Netherlands)

    Kalz, Marco; Börner, Dirk; Ternier, Stefaan; Specht, Marcus

    2013-01-01

    Kalz, M., Börner, D., Ternier, S., & Specht, M. (2013, 31 January). Energy Conservation Behaviour Toolkit. Presentation given at the symposium "Groene ICT en Duurzame ontwikkeling: Meters maken in het Hoger Onderwijs", Driebergen, The Netherlands.

  15. JAVA Stereo Display Toolkit

    Science.gov (United States)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that accomplishes simply the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.
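    The anaglyph band combination described above is straightforward to sketch with NumPy: take the red channel from the left-eye image and the green and blue channels from the right-eye image. This is a generic illustration of the technique, not the toolkit's Java API.

```python
# Sketch of the anaglyph band combination: red from the left eye, green/blue from
# the right eye. Generic NumPy illustration, not the Java toolkit's API.
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """left_rgb, right_rgb: (H, W, 3) uint8 arrays; returns an (H, W, 3) anaglyph."""
    anaglyph = right_rgb.copy()
    anaglyph[..., 0] = left_rgb[..., 0]   # red channel from the left-eye image
    return anaglyph                       # green/blue channels stay from the right eye

# Tiny synthetic example: two slightly shifted gradients stand in for a stereo pair.
h, w = 4, 8
base = np.tile(np.linspace(0, 255, w, dtype=np.uint8), (h, 1))
left = np.stack([base, base, base], axis=-1)
right = np.roll(left, shift=1, axis=1)    # horizontal disparity of one pixel
print(make_anaglyph(left, right)[0, :3])  # first few pixels of the top row
```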

  16. The Einstein Toolkit

    Science.gov (United States)

    Löffler, Frank

    2012-03-01

    The Einstein Toolkit Consortium is developing and supporting open software for relativistic astrophysics. Its aim is to provide the core computational tools that can enable new science, broaden our community, facilitate interdisciplinary research and take advantage of petascale computers and advanced cyberinfrastructure. The Einstein Toolkit currently consists of an open set of over 100 modules for the Cactus framework, primarily for computational relativity along with associated tools for simulation management and visualization. The toolkit includes solvers for vacuum spacetimes as well as relativistic magneto-hydrodynamics, along with modules for initial data, analysis and computational infrastructure. These modules have been developed and improved over many years by many different researchers. The Einstein Toolkit is supported by a distributed model, combining core support of software, tools, and documentation in its own repositories and through partnerships with other developers who contribute open software and coordinate together on development. As of January 2012 it has 68 registered members from 30 research groups world-wide. This talk will present the current capabilities of the Einstein Toolkit and will point to information on how to leverage it for future research.

  17. Local Foods, Local Places Toolkit

    Science.gov (United States)

    Toolkit to help communities that want to use local foods to spur revitalization. The toolkit gives step-by-step instructions to help communities plan and host a workshop and create an action plan to implement it.

  18. Newnes electronics toolkit

    CERN Document Server

    Phillips, Geoff

    2013-01-01

    Newnes Electronics Toolkit brings together fundamental facts, concepts, and applications of electronic components and circuits, and presents them in a clear, concise, and unambiguous format, to provide a reference book for engineers. The book contains 10 chapters that discuss the following concepts: resistors, capacitors, inductors, semiconductors, circuit concepts, electromagnetic compatibility, sound, light, heat, and connections. The engineer's job does not end when the circuit diagram is completed; the design for the manufacturing process is just as important if volume production is to be

  19. Fragment Impact Toolkit (FIT)

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel Wolf [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Key, Brian P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garcia, Daniel B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.
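    FIT's internals are not described in detail above, so the following is only a generic Monte Carlo sketch of probabilistic consequence evaluation for a fragmenting source: sample fragment launch angles and speeds, propagate them ballistically, and estimate the probability that a fragment lands within a target annulus. All distributions, geometry and parameter values are hypothetical, not FIT's actual model or API.

```python
# Generic Monte Carlo sketch of probabilistic consequence evaluation for a
# fragmenting source. Distributions and geometry are hypothetical illustrations.
import math
import random

def hit_probability(n_fragments=100_000, v_mean=30.0, v_sd=5.0,
                    target_min_r=40.0, target_max_r=60.0, g=9.81, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_fragments):
        speed = max(1.0, rng.gauss(v_mean, v_sd))      # m/s
        elevation = rng.uniform(0.0, math.pi / 2)      # launch angle above the horizon
        # Ballistic range on flat ground: R = v^2 * sin(2*theta) / g
        r = speed ** 2 * math.sin(2 * elevation) / g
        if target_min_r <= r <= target_max_r:
            hits += 1
    return hits / n_fragments

print(f"P(fragment lands in the 40-60 m annulus) ~= {hit_probability():.4f}")
```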

  20. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We compare hand-crafted custom code to polylithic and monolithic toolkit-based solutions. Polylithic toolkits follow a design philosophy similar to 3D scene graphs supported by toolkits including Java3D and OpenInventor...

  1. A Racial Equity Toolkit for Midwifery Organizations.

    Science.gov (United States)

    Gordon, Wendy M

    2016-11-01

    Midwifery associations are increasing awareness and commitment to racial equity in the profession and in the communities we serve. Moving these commitments from words into action may be facilitated by a racial equity toolkit to help guide midwifery organizations to consider all policies, initiatives, and actions with a racial equity lens. Racial equity impact analyses have been used in recent years by various governmental agencies in the United States and abroad with positive results, and emerging literature indicates that nonprofit organizations are having similarly positive results. This article proposes a framework for midwifery organizations to incorporate a racial equity toolkit, starting with explicit intentions of the organization with regard to racial equity in the profession. Indicators of success are elucidated as the next step, followed by the use of a racial equity impact analysis worksheet. This worksheet is applied by teams or committees when considering new policies or initiatives to examine those actions through a racial equity lens. An organizational change team and equity advisory groups are essential in assisting organizational leadership to forecast potential negative and positive impacts. Examples of the components of a midwifery-specific racial equity toolkit are included. © 2016 by the American College of Nurse-Midwives.

  2. Development of a Multimedia Toolkit for Engineering Graphics Education

    Directory of Open Access Journals (Sweden)

    Moudar Zgoul

    2009-09-01

    Full Text Available This paper focuses on the development of a multimedia toolkit to support the teaching of an Engineering Graphics course. The project used different elements for the toolkit: animations, videos and presentations, which were then integrated into a dedicated internet website. The purpose of using these elements is to assist the students in building and practicing the needed engineering skills at their own pace as part of an e-Learning solution. Furthermore, this kit allows students to repeat and view the processes and techniques of graphical construction and visualization as much as needed, allowing them to follow and practice on their own.

  3. Lean and Information Technology Toolkit

    Science.gov (United States)

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  4. Third Party TMDL Development Toolkit

    Science.gov (United States)

    Water Environment Federation's toolkit provides basic steps in which an organization or group other than the lead water quality agency takes responsibility for developing the TMDL document and supporting analysis.

  5. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    2011-09-06

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.  Created: 9/6/2011 by Office of Infectious Diseases, Office of the Director (OD).   Date Released: 9/7/2011.

  6. BIT: Biosignal Igniter Toolkit.

    Science.gov (United States)

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

    The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains with which they were traditionally associated. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics and electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily bounded by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real-time data acquisition and postprocessing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
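    A minimal sketch of the kind of routine such a biosignal processing toolbox offers is shown below: smooth a noisy sampled signal with a moving average and detect beats by simple threshold crossing. The synthetic signal, window size and threshold are illustrative assumptions, not the toolkit's actual API.

```python
# Sketch of basic biosignal processing: moving-average smoothing plus rising-edge
# threshold detection of beats. Signal, window and threshold are illustrative.
import math
import random

def moving_average(signal, window=5):
    half = window // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1]) for i in range(len(signal))]

def detect_peaks(signal, threshold):
    """Indices where the signal crosses above the threshold (rising edges only)."""
    return [i for i in range(1, len(signal))
            if signal[i] >= threshold > signal[i - 1]]

# Synthetic "pulse" train at 1 Hz sampled at 100 Hz, plus noise.
fs, rng = 100, random.Random(1)
t = [i / fs for i in range(5 * fs)]
raw = [math.exp(-((ti % 1.0) - 0.2) ** 2 / 0.001) + 0.1 * rng.random() for ti in t]
smooth = moving_average(raw, window=7)
beats = detect_peaks(smooth, threshold=0.5)
print("detected beats at t =", [round(t[i], 2) for i in beats])  # roughly one per second
```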

  7. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache-lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly included the L1, L2, L3 hit rates as inputs to the tasklists. Explicit hit rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
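    A minimal sketch of the time_compute() interface described above is given below: a core model parameterized by clock speed and average cycle counts that converts a tasklist of operation counts into predicted seconds. The class name, parameter values and tasklist format are simplified assumptions, not PPT's actual code.

```python
# Sketch of a hardware-model interface in the spirit of PPT's time_compute():
# parameters and the tasklist format below are simplified assumptions.
class SimpleCoreModel:
    def __init__(self, clock_ghz=2.3, cycles_per_alu=1.0,
                 l1_latency_cycles=4, mem_latency_cycles=200, l1_hit_rate=0.95):
        self.clock_hz = clock_ghz * 1e9
        self.cycles_per_alu = cycles_per_alu
        self.l1_latency_cycles = l1_latency_cycles
        self.mem_latency_cycles = mem_latency_cycles
        self.l1_hit_rate = l1_hit_rate   # an AMM-style model would derive this instead

    def time_compute(self, tasklist):
        """tasklist: dict of operation counts, e.g. {'alu': 1e6, 'loads': 2e5}."""
        cycles = tasklist.get("alu", 0) * self.cycles_per_alu
        mem_ops = tasklist.get("loads", 0) + tasklist.get("stores", 0)
        cycles += mem_ops * (self.l1_hit_rate * self.l1_latency_cycles +
                             (1 - self.l1_hit_rate) * self.mem_latency_cycles)
        return cycles / self.clock_hz    # predicted seconds for this kernel

# Application model: keep the loop structure, replace the kernel with a tasklist.
core = SimpleCoreModel()
kernel = {"alu": 5e6, "loads": 2e6, "stores": 1e6}
predicted = sum(core.time_compute(kernel) for _ in range(10))  # 10 outer iterations
print(f"predicted runtime: {predicted * 1e3:.2f} ms")
```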

  8. College Access and Success for Students Experiencing Homelessness: A Toolkit for Educators and Service Providers

    Science.gov (United States)

    Dukes, Christina

    2013-01-01

    This toolkit serves as a comprehensive resource on the issue of higher education access and success for homeless students, including information on understanding homeless students, assisting homeless students in choosing a school, helping homeless students pay for application-related expenses, assisting homeless students in finding financial aid…

  9. Liaison Officer Toolkit

    Science.gov (United States)

    2010-01-01

    • Sustaining dental care designed to prevent or intercept potential dental emergencies, and limited preventive dentistry • Patient holding for up to 40... • Veterinary Service Support mission: to provide dispersed Veterinary Roles 1 and 2 medical and resuscitative surgical care; Veterinary Role 3... • Veterinary support to foreign humanitarian assistance programs • Coordination with supported logistical organizations for food safety support and

  10. GEANT4 A Simulation toolkit

    CERN Document Server

    Agostinelli, S; Amako, K; Apostolakis, John; Araújo, H M; Arce, P; Asai, M; Axen, D A; Banerjee, S; Barrand, G; Behner, F; Bellagamba, L; Boudreau, J; Broglia, L; Brunengo, A; Chauvie, S; Chuma, J; Chytracek, R; Cooperman, G; Cosmo, G; Degtyarenko, P V; Dell'Acqua, A; De Paola, G O; Dietrich, D D; Enami, R; Feliciello, A; Ferguson, C; Fesefeldt, H S; Folger, G; Foppiano, F; Forti, A C; Garelli, S; Giani, S; Giannitrapani, R; Gibin, D; Gómez-Cadenas, J J; González, I; Gracía-Abríl, G; Greeniaus, L G; Greiner, W; Grichine, V M; Grossheim, A; Gumplinger, P; Hamatsu, R; Hashimoto, K; Hasui, H; Heikkinen, A M; Howard, A; Hutton, A M; Ivanchenko, V N; Johnson, A; Jones, F W; Kallenbach, Jeff; Kanaya, N; Kawabata, M; Kawabata, Y; Kawaguti, M; Kelner, S; Kent, P; Kodama, T; Kokoulin, R P; Kossov, M; Kurashige, H; Lamanna, E; Lampen, T; Lara, V; Lefébure, V; Lei, F; Liendl, M; Lockman, W; Longo, F; Magni, S; Maire, M; Mecking, B A; Medernach, E; Minamimoto, K; Mora de Freitas, P; Morita, Y; Murakami, K; Nagamatu, M; Nartallo, R; Nieminen, P; Nishimura, T; Ohtsubo, K; Okamura, M; O'Neale, S W; O'Ohata, Y; Perl, J; Pfeiffer, A; Pia, M G; Ranjard, F; Rybin, A; Sadilov, S; Di Salvo, E; Santin, G; Sasaki, T; Savvas, N; Sawada, Y; Scherer, S; Sei, S; Sirotenko, V I; Smith, D; Starkov, N; Stöcker, H; Sulkimo, J; Takahata, M; Tanaka, S; Chernyaev, E; Safai-Tehrani, F; Tropeano, M; Truscott, P R; Uno, H; Urbàn, L; Urban, P; Verderi, M; Walkden, A; Wander, W; Weber, H; Wellisch, J P; Wenaus, T; Williams, D C; Wright, D; Yamada, T; Yoshida, H; Zschiesche, D

    2003-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  11. Geant4 - A Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Dennis H

    2002-08-09

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  12. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and it supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  13. Diferències de gènere i edats en la tipologia d’estressors laborals docents Gender and age differences in the typology of stress factors affecting teachers Diferencias de género y edades en la tipología de estresores laborales docentes

    National Research Council Canada - National Science Library

    Marian Baqués i Trenchs; Joan Riart i Vendrell; Carles Virgili Tejedor

    2011-01-01

    The purpose of this article is to present the results of an investigation into how gender and age affect teachers' levels of stress. The starting hypothesis is that there is a strong relationship...

  14. WIST: toolkit for rapid, customized LIMS development

    National Research Council Canada - National Science Library

    Huang, Y Wayne; Arkin, Adam P; Chandonia, John-Marc

    2011-01-01

    Workflow Information Storage Toolkit (WIST) is a set of application programming interfaces and web applications that allow for the rapid development of customized laboratory information management systems (LIMS...

  15. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit is a toolkit of research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims to communi...

  16. The Data Warehouse Lifecycle Toolkit

    CERN Document Server

    Kimball, Ralph; Thornthwaite, Warren; Mundy, Joy; Becker, Bob

    2011-01-01

    A thorough update to the industry standard for designing, developing, and deploying data warehouse and business intelligence systemsThe world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. In that time, the data warehouse industry has reached full maturity and acceptance, hardware and software have made staggering advances, and the techniques promoted in the premiere edition of this book have been adopted by nearly all data warehouse vendors and practitioners. In addition, the term "business intelligence" emerge

  17. Google Web Toolkit for Ajax

    CERN Document Server

    Perry, Bruce

    2007-01-01

    The Google Web Toolkit (GWT) is a nifty framework that Java programmers can use to create Ajax applications. The GWT allows you to create an Ajax application in your favorite IDE, such as IntelliJ IDEA or Eclipse, using paradigms and mechanisms similar to programming a Java Swing application. After you code the application in Java, the GWT's tools generate the JavaScript code the application needs. You can also use typical Java project tools such as JUnit and Ant when creating GWT applications. The GWT is a free download, and you can freely distribute the client- and server-side code you c

  18. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accura

  19. A toolkit modeling approach for sustainable forest management planning: achieving balance between science and local needs

    Science.gov (United States)

    Brian R. Sturtevant; Andrew Fall; Daniel D. Kneeshaw; Neal P. P. Simon; Michael J. Papaik; Kati Berninger; Frederik Doyon; Don G. Morgan; Christian Messier

    2007-01-01

    To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and...

  20. The Mentoring Toolkit 2.0: Resources for Developing Programs for Incarcerated Youth. Guide

    Science.gov (United States)

    Zaugg, Nathan; Jarjoura, Roger

    2017-01-01

    "The Mentoring Toolkit 2.0: Resources for Developing Programs for Incarcerated Youth" provides information, program descriptions, and links to important resources that can assist juvenile correctional facilities and other organizations to design effective mentoring programs for neglected and delinquent youth, particularly those who are…

  1. toolkit

    Directory of Open Access Journals (Sweden)

    Sanaa M. Aly

    2016-07-01

    Full Text Available DNA analysis is a cornerstone of contemporary forensic science. DNA sequencing technologies are powerful tools that enriched the molecular sciences in the past through Sanger sequencing and continue to advance them through next generation sequencing (NGS). Next generation sequencing has excellent potential to expand molecular applications in the forensic sciences by overcoming the pitfalls of the conventional sequencing method. The main advantage of NGS compared to the conventional method is that it utilizes a large number of genetic markers simultaneously, with high-resolution genetic data. These advantages will help in solving several challenges such as mixture analysis and dealing with minute, degraded samples. Based on these new technologies, many markers could be examined to obtain important biological data such as age, geographical origin, tissue type, external visible traits and monozygotic twin identification. It could also yield data related to microbes, insects, plants and soil, which are of great medico-legal importance. Despite the dozens of forensic studies involving NGS, there are requirements to be met before this technology can be used routinely in forensic cases. Thus, there is a great need for more studies that address the robustness of these techniques. Therefore, this work highlights the applications of forensic sciences in the era of massively parallel sequencing.

  2. Audit: Automated Disk Investigation Toolkit

    Directory of Open Access Journals (Sweden)

    Umit Karabiyik

    2014-09-01

    Full Text Available Software tools designed for disk analysis play a critical role today in forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source, requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this paper, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology) and expert investigators. Our proof of concept design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image.

  3. AKT: ancestry and kinship toolkit.

    Science.gov (United States)

    Arthur, Rudy; Schulz-Trieglaff, Ole; Cox, Anthony J; O'Connell, Jared

    2017-01-01

    Ancestry and Kinship Toolkit (AKT) is a statistical genetics tool for analysing large cohorts of whole-genome sequenced samples. It can rapidly detect related samples, characterize sample ancestry, calculate correlation between variants, check Mendel consistency and perform data clustering. AKT brings together the functionality of many state-of-the-art methods, with a focus on speed and a unified interface. We believe it will be an invaluable tool for the curation of large WGS datasets. The source code is available at https://illumina.github.io/akt. Contacts: joconnell@illumina.com or rudy.d.arthur@gmail.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. A Geospatial Decision Support System Toolkit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to design a working prototype Geospatial Decision Support Toolkit (GeoKit) that will enable scientists, agencies, and stakeholders to configure and deploy...

  5. NOAA Weather and Climate Toolkit (WCT)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Weather and Climate Toolkit is an application that provides simple visualization and data export of weather and climatological data archived at NCDC. The...

  6. Water Quality Trading Toolkit for Permit Writers

    Science.gov (United States)

    The Water Quality Trading Toolkit for Permit Writers is EPA’s first “how-to” manual on designing and implementing water quality trading programs. It helps NPDES permitting authorities incorporate trading provisions into permits.

  7. ARC Code TI: Crisis Mapping Toolkit

    Data.gov (United States)

    National Aeronautics and Space Administration — The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve...

  8. A Modular Toolkit for Distributed Interactions

    Directory of Open Access Journals (Sweden)

    Julien Lange

    2011-10-01

    Full Text Available We discuss the design, architecture, and implementation of a toolkit which supports some theories for distributed interactions. The main design principles of our architecture are flexibility and modularity. Our main goal is to provide an easily extensible workbench to encompass current algorithms and incorporate future developments of the theories. With the help of some examples, we illustrate the main features of our toolkit.

  9. The Weather and Climate Toolkit

    Science.gov (United States)

    Ansari, S.; Del Greco, S.; Hankins, B.

    2010-12-01

    The Weather and Climate Toolkit (WCT) is free, platform independent software distributed from NOAA’s National Climatic Data Center (NCDC). The WCT allows the visualization and data export of weather and climate data, including Radar, Satellite and Model data. By leveraging the NetCDF for Java library and Common Data Model, the WCT is extremely scalable and capable of supporting many new datasets in the future. Gridded NetCDF files (regular and irregularly spaced, using Climate-Forecast (CF) conventions) are supported, along with many other formats including GRIB. The WCT provides tools for custom data overlays, Web Map Service (WMS) background maps, animations and basic filtering. The export of images and movies is provided in multiple formats. The WCT Data Export Wizard allows for data export in both vector polygon/point (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, Gridded NetCDF) formats. These data export features promote the interoperability of weather and climate information with various scientific communities and common software packages such as ArcGIS, Google Earth, MatLAB, GrADS and R. The WCT also supports an embedded, integrated Google Earth instance. The Google Earth Browser Plugin allows seamless visualization of data on a native 3-D Google Earth instance linked to the standard 2-D map. Figure captions: Level-II NEXRAD data for Hurricane Katrina; GPCP (Global Precipitation Product) visualized in 2-D and in the internal Google Earth view.

  10. The Topology ToolKit.

    Science.gov (United States)

    Tierny, Julien; Favelier, Guillaume; Levine, Joshua A; Gueunet, Charles; Michaux, Michael

    2017-08-29

    This system paper presents the Topology ToolKit (TTK), a software platform designed for the topological analysis of scalar data in scientific visualization. While topological data analysis has gained in popularity over the last two decades, it has not yet been widely adopted as a standard data analysis tool for end users or developers. TTK aims at addressing this problem by providing a unified, generic, efficient, and robust implementation of key algorithms for the topological analysis of scalar data, including: critical points, integral lines, persistence diagrams, persistence curves, merge trees, contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots, Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due to a tight integration with ParaView. It is also easily accessible to developers through a variety of bindings (Python, VTK/C++) for fast prototyping or through direct, dependency-free, C++, to ease integration into pre-existing complex systems. While developing TTK, we faced several algorithmic and software engineering challenges, which we document in this paper. In particular, we present an algorithm for the construction of a discrete gradient that complies to the critical points extracted in the piecewise-linear setting. This algorithm guarantees a combinatorial consistency across the topological abstractions supported by TTK, and importantly, a unified implementation of topological data simplification for multi-scale exploration and analysis. We also present a cached triangulation data structure, that supports time efficient and generic traversals, which self-adjusts its memory usage on demand for input simplicial meshes and which implicitly emulates a triangulation for regular grids with no memory overhead. Finally, we describe an original software architecture, which guarantees memory efficient and direct accesses to TTK features, while still allowing for researchers powerful and easy bindings and extensions
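    One of the building blocks listed above, 0-dimensional persistence, can be sketched for a 1-D scalar field with a short union-find sweep over the sublevel-set filtration. This is a generic textbook illustration, not TTK's implementation or bindings.

```python
# Sketch of 0-dimensional persistence pairs of a 1-D scalar field under the
# sublevel-set filtration, via a union-find sweep. Generic illustration, not TTK code.
def persistence_pairs_1d(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent = {}                      # union-find: vertex index -> representative
    birth = {}                       # representative -> birth value (its minimum)
    pairs = []

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in order:                  # sweep vertices by increasing function value
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):     # try to connect to already-born neighbours
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # elder rule: the component with the higher minimum dies here
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    pairs.append((birth[young], values[i]))
                    parent[young] = old
    essential = (min(values), float("inf"))          # component of the global minimum
    finite = [(b, d) for (b, d) in pairs if d > b]   # drop zero-persistence pairs
    return finite + [essential]

print(persistence_pairs_1d([3.0, 1.0, 4.0, 0.5, 2.0, 0.8, 3.5]))
# -> [(0.8, 2.0), (1.0, 4.0), (0.5, inf)]
```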

  11. Developing an evidence-based, multimedia group counseling curriculum toolkit.

    Science.gov (United States)

    Brooks, Adam C; Diguiseppi, Graham; Laudet, Alexandre; Rosenwasser, Beth; Knoblach, Dan; Carpenedo, Carolyn M; Carise, Deni; Kirby, Kimberly C

    2012-09-01

    Training community-based addiction counselors in empirically supported treatments (ESTs) far exceeds the ever-decreasing resources of publicly funded treatment agencies. This feasibility study describes the development and pilot testing of a group counseling toolkit (an approach adapted from the education field) focused on relapse prevention (RP). When counselors (N = 17) used the RP toolkit after 3 hours of training, their content adherence scores on "coping with craving" and "drug refusal skills" showed significant improvement, as indicated by very large effect sizes (Cohen's d = 1.49 and 1.34, respectively). Counselor skillfulness, in the "adequate-to-average" range at baseline, did not change. Although this feasibility study indicates some benefit to counselor EST acquisition, it is important to note that the impact of the curriculum on client outcomes is unknown. Because a majority of addiction treatment is delivered in group format, a multimedia curriculum approach may assist counselors in applying ESTs in the context of actual service delivery. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Communities and Spontaneous Urban Planning: A Toolkit for Urban ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The results will be incorporated into an organizational and urban planning toolkit comprising printed and audio-visual materials and mobile applications. This toolkit is expected to be a viable alternative for planning urban expansion wherever it cannot be carried out through traditional means. The toolkit will be tested in Dar ...

  13. TRSkit: A Simple Digital Library Toolkit

    Science.gov (United States)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person that must continuously and synchronously distribute anywhere from 100 - 100,000's of information units and does not have extensive resources to devote to the problem.

  14. Anchor Toolkit - a secure mobile agent system

    Energy Technology Data Exchange (ETDEWEB)

    Mudumbai, Srilekha S.; Johnston, William; Essiari, Abdelilah

    1999-05-19

    Mobile agent technology facilitates intelligent operation in software systems with less human interaction. Major challenges to the deployment of mobile agents include secure transmission of agents and preventing unauthorized access to resources between interacting systems, as either hosts, or agents, or both can act maliciously. The Anchor toolkit, designed by LBNL, handles the transmission and secure management of mobile agents in a heterogeneous distributed computing environment. It provides users with the option of incorporating their own security managers. This paper concentrates on the architecture, features, access control and deployment of the Anchor toolkit. Application of this toolkit in a secure distributed CVS environment is discussed as a case study.

  15. Türkiye Türkçesinde "gibi" Edatıyla Kurulan Benzetmeli Anlatımlar The Expressions of Simile with the Postposition of "gibi" in Turkey Turkish

    Directory of Open Access Journals (Sweden)

    İ. Gülsel SEV

    2012-12-01

    ...is to draw on the similes found within the language. A simile describes an object or entity whose quality is to be conveyed by relating it to another object or entity and bringing out the resemblance between them. In our language this tendency is most often realized, in the form of a postpositional group, with the postposition gibi ('like'), commonly called the simile postposition, as in examples such as duvar gibi ('like a wall': deaf), fitil gibi ('like a wick': very drunk), peri gibi ('like a fairy': very beautiful), kıyamet gibi ('like doomsday': a great many), ay parçası gibi ('like a piece of the moon': a beautiful girl or child), gâvur ölüsü gibi ('like a dead infidel': very heavy, unwieldy), kör değneğini beller gibi ('like a blind man marking his stick': always doing the same job), and so on. When such word groups are used together with verbs (in the pattern of a noun-verb group), it is clear that they take on the character of idioms, as in Kedi ciğere bakar gibi bak- ('to look as a cat looks at liver': to gaze covetously), arpacı kumrusu gibi düşün- ('to brood like a barley dove': to ponder deeply without knowing what to do), pişmiş kelle gibi sırıt- ('to grin like a boiled sheep's head': to laugh foolishly and out of place, showing one's teeth), etc. This article examines how the fixed phrases formed with the postposition gibi, long in widespread use in the Turkish language, function. Since the main subject of our study is the fixed word groups with the postposition gibi that are used without a verb, the TDK Türkçe Sözlük was scanned to extract the groups in question; these are then examined with respect to their semantic relations, degrees of lexicalization, structures, and so on.

  16. Creating Centralized Reporting for Microsoft Host Protection Technologies:The Enhanced Mitigation Experience Toolkit (EMET)

    Science.gov (United States)

    2016-08-11

    ...endpoints from compromise. Microsoft offers a tool to assist in this area and is provided at no cost. The Enhanced Mitigation Experience Toolkit (EMET... AppLocker can still be exploited. EMET provides an additional layer of protection by restricting techniques commonly used by malicious actors. EMET can help

  17. Integrating surgical robots into the next medical toolkit.

    Science.gov (United States)

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  18. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Science.gov (United States)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability have affected the take-up of this programming model approach. Significant progress has been made in hardware and software technologies; as a result, the performance of parallel programs with compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  19. Karma: Visualisation Test-Bed Toolkit

    Science.gov (United States)

    Gooch, Richard

    2011-02-01

    Karma is a toolkit for interprocess communications, authentication, encryption, graphics display, user interface and manipulating the Karma network data structure. It contains KarmaLib (the structured libraries and API) and a large number of modules (applications) to perform many standard tasks. A suite of visualisation tools are distributed with the library.

  20. Marine Debris and Plastic Source Reduction Toolkit

    Science.gov (United States)

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  1. Integrated System Health Management Development Toolkit

    Science.gov (United States)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.
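    The "detect anomalies" element can be illustrated in its simplest form with a rolling z-score detector over a sensor stream, as sketched below; the window size, threshold and synthetic data are illustrative assumptions rather than part of the toolkit described above.

```python
# Sketch of simple anomaly detection: flag readings that deviate from a rolling
# baseline by more than k standard deviations. Parameters and data are illustrative.
import random
from collections import deque
from statistics import mean, stdev

def rolling_zscore_anomalies(stream, window=20, k=4.0):
    history = deque(maxlen=window)
    anomalies = []
    for t, x in enumerate(stream):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) > k * sigma:
                anomalies.append((t, x))
        history.append(x)
    return anomalies

# Synthetic pressure trace with a spike injected at sample 60.
rng = random.Random(42)
trace = [100.0 + rng.gauss(0, 0.5) for _ in range(100)]
trace[60] += 10.0
print(rolling_zscore_anomalies(trace))  # expected to flag the spike near t = 60
```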

  2. The health technology assessment adaptation toolkit: description and use.

    Science.gov (United States)

    Turner, Sheila; Chase, Deborah L; Milne, Ruairidh; Cook, Andrew; Hicks, Nicholas J; Rosten, Claire; Payne, Liz; Coles, Suzanne; Bell, Eleanor

    2009-12-01

    Adapting health technology assessment (HTA) reports for different contexts could reduce the need for multiple reports on the same health technology, with a resultant saving of time and resources. This article describes an instrument, the adaptation toolkit, which has been developed to aid in the process of adaptation of HTA reports. The toolkit was developed by a partnership of HTA agencies and networks from across Europe. The role of the toolkit is to guide the user through the process of selecting possibly relevant material from these report(s), assessing the relevance, reliability, and transferability of the material, and adapting it for the desired context. The adaptation toolkit has been developed; it comprises a collection of resources that help the user assess whether data and information in existing HTA reports should and could be adapted for their own setting. The toolkit contains two sections: a preliminary speedy sifting section and the main toolkit. The main toolkit includes five domains: (i) technology use and development, (ii) safety, (iii) effectiveness (including efficacy), (iv) economic evaluation, and (v) organizational aspects. Legal, ethical, and social aspects are beyond the scope of the toolkit. The toolkit is designed for the adaptation of evidence synthesis rather than primary research. The completed current version of the toolkit contains checklists and resources to aid in the adaptation of HTA reports. This collection of resources is available for use by all HTA agencies and can be accessed at: http://www.eunethta.net/upload/WP5/EUnetHTA_HTA_Adaptation_Toolkit_October08.pdf.

  3. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Science.gov (United States)

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-01-01

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit. PMID:28241502

  4. The Interactive Learning Toolkit: supporting interactive classrooms

    Science.gov (United States)

    Dutta, S.; McCauley, V.; Mazur, E.

    2004-05-01

    Research-based interactive learning techniques have dramatically improved student understanding. We have created the 'Interactive Learning Toolkit' (ILT), a web-based learning management system, to help implement two such pedagogies: Just in Time Teaching and Peer Instruction. Our main goal in developing this toolkit is to save the instructor time and effort and to use technology to facilitate the interaction between the students and the instructor (and between students themselves). After a brief review of both pedagogies, we will demonstrate the many exciting new features of the ILT. We will show how technology can not only implement, but also supplement and improve these pedagogies. We would like to acknowledge grants from NSF and DEAS, Harvard University.

  5. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Directory of Open Access Journals (Sweden)

    Andrea Bellucci

    2017-02-01

    Full Text Available Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  6. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.

  7. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  8. ECCE Toolkit: Prototyping Sensor-Based Interaction.

    Science.gov (United States)

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-02-23

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  9. Mobile-Assisted Vocabulary Learning: A Review Study

    National Research Council Canada - National Science Library

    Parichehr Afzali; Somayeh Shabani; Zohreh Basir; Mohammad Ramazani

    2017-01-01

    Mobile phones are becoming more acceptable toolkits to learn languages. One aspect of English language which has been subject to investigation in mobile assisted language learning (MALL) is vocabulary...

  10. chemf: A purely functional chemistry toolkit.

    Science.gov (United States)

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effect-free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare them to existing toolkits, both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type-safety achieved by Scala highly increased the reliability of our code as well as the productivity of
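
    To make the immutable-graph idea above concrete, the following sketch shows a purely functional style molecular container in Python (chemf itself is written in Scala); the Atom/Bond/Molecule names and the heavy-atom-only formula are illustrative assumptions, not chemf's actual API.

```python
# Conceptual sketch only (chemf itself is written in Scala): an immutable
# molecular-graph value type in Python, using frozen dataclasses so that
# "modification" always returns a new object instead of mutating state.
from dataclasses import dataclass
from collections import Counter

@dataclass(frozen=True)
class Atom:
    symbol: str            # element symbol, e.g. "C"

@dataclass(frozen=True)
class Bond:
    a: int                 # index of first atom
    b: int                 # index of second atom
    order: int = 1

@dataclass(frozen=True)
class Molecule:
    atoms: tuple           # tuple of Atom
    bonds: tuple           # tuple of Bond

    def add_atom(self, atom, bond_to=None, order=1):
        """Pure 'update': returns a new Molecule, leaving self untouched."""
        atoms = self.atoms + (atom,)
        bonds = self.bonds
        if bond_to is not None:
            bonds = bonds + (Bond(bond_to, len(self.atoms), order),)
        return Molecule(atoms, bonds)

    def formula(self):
        """Heavy-atom formula (hydrogens are not modelled in this toy example)."""
        counts = Counter(a.symbol for a in self.atoms)
        return "".join(f"{el}{n if n > 1 else ''}" for el, n in sorted(counts.items()))

# Ethanol heavy-atom skeleton C-C-O, built by successive pure updates.
mol = Molecule((Atom("C"),), ())
mol = mol.add_atom(Atom("C"), bond_to=0).add_atom(Atom("O"), bond_to=1)
print(mol.formula())   # -> C2O
```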

  11. VIDE: The Void IDentification and Examination toolkit

    Science.gov (United States)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at
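
    As a conceptual illustration of the density-estimation step described above (not VIDE's own API), the sketch below assigns each point a density equal to the inverse area of its 2D Voronoi cell using SciPy; VIDE/ZOBOV perform the analogous computation in 3D before applying the watershed transform.

```python
# Conceptual sketch (not VIDE's API): estimate a 2D density field from point
# positions via a Voronoi tessellation, the same idea ZOBOV/VIDE use in 3D.
import numpy as np
from scipy.spatial import Voronoi

def voronoi_densities(points):
    """Return density ~ 1/cell_area for each 2D point (edge cells -> nan)."""
    vor = Voronoi(points)
    dens = np.full(len(points), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if -1 in region or len(region) == 0:
            continue  # unbounded cell on the boundary of the point set
        poly = vor.vertices[region]
        # shoelace formula for the polygon area of the Voronoi cell
        x, y = poly[:, 0], poly[:, 1]
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        dens[i] = 1.0 / area if area > 0 else np.nan
    return dens

rng = np.random.default_rng(0)
pts = rng.random((500, 2))
print(np.nanmedian(voronoi_densities(pts)))
```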

  12. Methods for Evaluating Text Extraction Toolkits: An Exploratory Investigation

    Science.gov (United States)

    2015-01-22

    This effort describes a method for evaluating a text extraction toolkit. Although it focuses on the popular open source Apache Tika toolkit and the govdocs1 corpus, the method generally applies to other text extraction toolkits.

  13. The PRIDE (Partnership to Improve Diabetes Education) Toolkit: Development and Evaluation of Novel Literacy and Culturally Sensitive Diabetes Education Materials.

    Science.gov (United States)

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L

    2016-02-01

    Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a "superior" score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. © 2015 The Author(s).

  14. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Full Text Available Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit comprises user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.
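
    A minimal sketch of the kind of quality filtering such a toolkit performs is shown below (NGS QC Toolkit itself is implemented in Perl); the FASTQ file name, the Phred+33 offset default, and the cut-off values are illustrative assumptions rather than the toolkit's own defaults.

```python
# Illustrative sketch (NGS QC Toolkit itself is written in Perl): filter a
# FASTQ file, keeping reads whose mean Phred quality meets a cut-off.
# The file name and thresholds below are arbitrary examples.

def read_fastq(path):
    """Yield (header, sequence, quality) records from a FASTQ file."""
    with open(path) as fh:
        while True:
            header = fh.readline().rstrip()
            if not header:
                return
            seq = fh.readline().rstrip()
            fh.readline()                    # '+' separator line
            qual = fh.readline().rstrip()
            yield header, seq, qual

def mean_phred(qual, offset=33):
    """Mean base quality from a Phred+33 encoded quality string."""
    return sum(ord(c) - offset for c in qual) / len(qual)

def quality_filter(path, min_mean_q=20, min_len=50):
    for header, seq, qual in read_fastq(path):
        if len(seq) >= min_len and mean_phred(qual) >= min_mean_q:
            yield header, seq, qual

if __name__ == "__main__":
    kept = list(quality_filter("reads.fastq"))   # hypothetical input file
    print(f"{len(kept)} reads passed the quality filter")
```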

  15. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
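
    The sketch below illustrates the class of gradient-based optimization problems such a toolkit addresses, using SciPy's L-BFGS-B solver on the Rosenbrock function; it is a generic example and does not use DOTk's C++ or MATLAB interfaces.

```python
# Generic gradient-based optimization sketch (not DOTk's C++/MATLAB API):
# minimize the Rosenbrock function with an analytic gradient, the kind of
# smooth design-optimization problem a toolkit like DOTk targets.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]),
                  jac=rosenbrock_grad, method="L-BFGS-B")
print(result.x)   # converges to approximately [1.0, 1.0]
```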

  16. OGSA Globus Toolkits evaluation activity at CERN

    CERN Document Server

    Chen, D; Foster, D; Kalyaev, V; Kryukov, A; Lamanna, M; Pose, V; Rocha, R; Wang, C

    2004-01-01

    An Open Grid Service Architecture (OGSA) Globus Toolkit 3 (GT3) evaluation group has been active at CERN since GT3 became available in an early beta version (Spring 2003). This activity focuses on the evaluation of the technology as promised by the OGSA/OGSI paradigm, and on GT3 in particular. The goal is to study this new technology and its implications in order to provide useful input for the large grid initiatives active in the LHC Computing Grid (LCG) project. A particular effort has been devoted to investigating performance and deployment issues, keeping in mind the LCG requirements, in particular scalability and robustness.

  17. Graph algorithms in the titan toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  18. National eHealth strategy toolkit

    CERN Document Server

    2012-01-01

    Worldwide, the application of information and communication technologies to support national health-care services is rapidly expanding and increasingly important. This is especially so at a time when all health systems face stringent economic challenges and greater demands to provide more and better care, especially to those most in need. The National eHealth Strategy Toolkit is an expert, practical guide that provides governments, their ministries, and stakeholders with a solid foundation and method for the development and implementation of a national eHealth vision, action plan, and monitoring framework.

  19. Emergency Survey Toolkit for Naval Operations

    Directory of Open Access Journals (Sweden)

    V. P. da ConceiçÌo

    2015-09-01

    Full Text Available In order to deliver the minimum safety conditions for the movement of ships towards restricted waters, urgent survey operations are required whenever natural disasters, unreliable chart information, or uncharted areas are encountered. In recent years, there has been a huge development in positioning and survey technology. Simultaneously, chart production techniques and GIS software have become easily accessible. In this context, a research project was carried out to assess the possibility of developing existing capabilities for emergency hydrographic survey. The toolkit was designed to allow the swift production of a usable bottom representation and survey of navigational aids, with a focus on navigational safety rather than bottom contour accuracy.

  20. Microsoft BizTalk ESB Toolkit 2.1

    CERN Document Server

    Benito, Andrés Del Río

    2013-01-01

    A practical guide into the architecture and features that make up the services and components of the ESB Toolkit. This book is for experienced BizTalk developers, administrators, and architects, as well as IT managers and BizTalk business analysts. Knowledge and experience with the Toolkit is not a requirement.

  1. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  2. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Science.gov (United States)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  3. 77 FR 45337 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-07-31

    The Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and will highlight participating U.S. vendors of relevant U.S. technologies. Expressions of interest in being listed on the Toolkit are invited from parties that seek to export goods or services produced in the United States.

  4. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    The Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to participating U.S. companies, highlighting U.S. vendors of relevant U.S. technologies. An expression of interest in being listed on the Toolkit is invited from parties offering goods or services produced in the United States.

  5. The Semi-Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Directory of Open Access Journals (Sweden)

    C.S. Ierotheou

    2001-01-01

    Full Text Available The shared-memory programming model can be an effective way to achieve parallelism on shared memory parallel computers. Historically however, the lack of a programming standard using directives and the limited scalability have affected its take-up. Recent advances in hardware and software technologies have resulted in improvements to both the performance of parallel programs with compiler directives and the issue of portability with the introduction of OpenMP. In this study, the Computer Aided Parallelisation Toolkit has been extended to automatically generate OpenMP-based parallel programs with nominal user assistance. We categorize the different loop types and show how efficient directives can be placed using the toolkit's in-depth interprocedural analysis. Examples are taken from the NAS parallel benchmarks and a number of real-world application codes. This demonstrates the great potential of using the toolkit to quickly parallelise serial programs as well as the good performance achievable on up to 300 processors for hybrid message passing-directive parallelisations.

  6. The Einstein Toolkit: A Community Computational Infrastructure for Relativistic Astrophysics

    CERN Document Server

    Löffler, Frank; Bentivegna, Eloisa; Bode, Tanja; Diener, Peter; Haas, Roland; Hinder, Ian; Mundim, Bruno C; Ott, Christian D; Schnetter, Erik; Allen, Gabrielle; Campanelli, Manuela; Laguna, Pablo

    2011-01-01

    We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The Toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus Framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general-relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this article. We discuss the motivation behind the release of the toolkit, the philosophy underlying its de...

  7. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing packages require specifically formatted input files and generate output files of various types, which is inconvenient in practice. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT makes it convenient to run multiple local ancestry inference packages. In addition, we evaluated the performance of local ancestry inference among the supported software packages, mainly focusing on inference accuracy and the computational resources used. We provide a toolkit to facilitate the use of local ancestry inference software, especially for users with a limited bioinformatics background.

  8. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  9. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    Science.gov (United States)

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process philosophically similar to agile methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users' and developers' mailing list, providing documentation (an application programming interface reference document and a book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
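
    A minimal Python sketch of the state-machine-governed component concept described above is given below; the component name, states, and events are invented for illustration and do not correspond to IGSTK's C++ classes.

```python
# Minimal sketch of the state-machine idea described above (IGSTK itself is
# a C++ library; the component, states, and events here are illustrative only).
class TrackerComponent:
    # Allowed transitions: current state -> {event: next state}
    TRANSITIONS = {
        "Idle":       {"configure": "Configured"},
        "Configured": {"start": "Tracking", "configure": "Configured"},
        "Tracking":   {"stop": "Configured"},
    }

    def __init__(self):
        self.state = "Idle"

    def handle(self, event):
        """Apply an event; invalid requests leave the component in a valid state."""
        next_state = self.TRANSITIONS.get(self.state, {}).get(event)
        if next_state is None:
            print(f"ignored '{event}' in state '{self.state}'")
            return
        print(f"{self.state} --{event}--> {next_state}")
        self.state = next_state

tracker = TrackerComponent()
tracker.handle("start")       # ignored: cannot track before configuration
tracker.handle("configure")
tracker.handle("start")
tracker.handle("stop")
```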

  10. Diagnosing turbulence for research aircraft safety using open source toolkits

    Directory of Open Access Journals (Sweden)

    T.J. Lang

    Full Text Available Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft, via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well prior to cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists turbulence estimates in near real time during an actual field campaign, and thus these toolkits are recommended for usage in future cloud-penetrating aircraft field campaigns.

  11. Demonstration of the Health Literacy Universal Precautions Toolkit

    Science.gov (United States)

    Mabachi, Natabhona M.; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G.; Albright, Karen; Weiss, Barry D.; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements. PMID:27232681

  12. Climate Change Toolkit-Case study: Switzerland

    Science.gov (United States)

    Ashraf Vaghefi, Saeid

    2017-04-01

    This paper describes the development of a Climate Change Toolkit (CCT) to rapidly perform tasks needed in a climate change study. CCT consists of five modules: data extraction, global climate data management, bias correction, spatial interpolation, and a critical consecutive day analyzer to calculate extreme events. CCT is linked to an archive of large datasets consisting of daily global historical data (CRU, 1970-2005) and global GCM data (1960-2099) from 5 models and 4 carbon scenarios. Application of CCT in Switzerland using the ensemble results of scenario RCP8.5 showed an increase in maximum temperature and a wide change in precipitation. The frequency of dry periods will likely increase, and the frequency of wet periods suggests a higher risk of flooding in the country.
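
    As an illustration of the kind of extreme-event statistic the critical consecutive day analyzer computes, the sketch below counts the longest run of dry days in a synthetic daily precipitation series; the 1 mm/day dry-day threshold is an assumption, not a documented CCT setting.

```python
# Sketch of a "critical consecutive day" statistic of the kind the abstract's
# analyzer module computes; the 1 mm dry-day threshold is an assumption, not
# taken from the CCT documentation.
import numpy as np

def max_consecutive_dry_days(precip_mm, threshold=1.0):
    """Longest run of days with precipitation below `threshold` (mm/day)."""
    longest = current = 0
    for p in precip_mm:
        current = current + 1 if p < threshold else 0
        longest = max(longest, current)
    return longest

rng = np.random.default_rng(42)
daily_precip = rng.exponential(scale=2.0, size=365)   # synthetic daily series
print(max_consecutive_dry_days(daily_precip))
```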

  13. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  14. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-08-14

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  15. Food: Too Good to Waste Implementation Guide and Toolkit

    Science.gov (United States)

    The Food: Too Good to Waste (FTGTW) Implementation Guide and Toolkit is designed for community organizations, local governments, households and others interested in reducing wasteful household food management practices.

  16. The TeleEngineering Toolkit Software Reference Manual

    National Research Council Canada - National Science Library

    Jorgeson, Jeffrey D; Berry, Woodman W; Taylor, Rhonda D; Fairley, Sandra K; Jackson, Jill M; Williamson, Jeffrey L; Webb, Benjamin T

    2007-01-01

    The TeleEngineering Toolkit software was developed to provide a mechanism by which deployed engineers could view and analyze geospatial data, collect and display data required for engineering analyses...

  17. Toolkit for local decision makers aims to strengthen environmental sustainability

    CSIR Research Space (South Africa)

    Murambadoro, M

    2011-11-01

    Full Text Available Members of the South African Risk and Vulnerability Atlas were involved in a meeting aimed at the development of a toolkit towards improved integration of climate change into local government's integrated development planning (IDP) process....

  18. The library innovation toolkit: ideas, strategies, and programs

    National Research Council Canada - National Science Library

    Molaro, Anthony; White, Leah L; Lankes, David R

    2015-01-01

    .... Among the topics covered are: the importance of creating organizational structures that lead to innovation, strategies for getting library staff and other stakeholders on board and engaged, complete with a step-by-step toolkit...

  19. The AAV vector toolkit: poised at the clinical crossroads

    National Research Council Canada - National Science Library

    Asokan, Aravind; Schaffer, David V; Samulski, R Jude

    2012-01-01

    The discovery of naturally occurring adeno-associated virus (AAV) isolates in different animal species and the generation of engineered AAV strains using molecular genetics tools have yielded a versatile AAV vector toolkit...

  20. Design and Implementation of a Learning Analytics Toolkit for Teachers

    National Research Council Canada - National Science Library

    Anna Lea Dyckhoff; Dennis Zielke; Mareike Bültmann; Mohamed Amine Chatti; Ulrik Schroeder

    2012-01-01

    .... In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics Toolkit, which enables teachers to explore and correlate learning...

  1. The CRISM Analysis Toolkit (CAT): Overview and Recent Updates

    Science.gov (United States)

    Morgan, M. F.; Seelos, F. P.; Murchie, S. L.

    2017-06-01

    The CRISM Analysis Toolkit (CAT) is an IDL/ENVI-based software system for analyzing and displaying data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM). We will describe CAT’s capabilities and discuss recent updates.

  2. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an ethnographic easy-to-use toolkit, Contextual design, by a computer firm in the initial stages of the development of a health care system....

  3. A Multi-Physics CFD Toolkit for Reentry Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — AeroSoft proposes to develop a full featured CFD toolkit for analysis of the aerothermal environment and its effect on space vehicles. In Phase I, AeroSoft proposes...

  4. Developing a toolkit for panel management: improving hypertension and smoking cessation outcomes in primary care at the VA.

    Science.gov (United States)

    Savarimuthu, Stella M; Jensen, Ashley E; Schoenthaler, Antoinette; Dembitzer, Anne; Tenner, Craig; Gillespie, Colleen; Schwartz, Mark D; Sherman, Scott E

    2013-11-21

    As primary care practices evolve into medical homes, there is an increasing need for effective models to shift from visit-based to population-based strategies for care. However, most medical teams lack tools and training to manage panels of patients. As part of a study comparing different approaches to panel management at the Manhattan and Brooklyn campuses of the VA New York Harbor Healthcare System, we created a toolkit of strategies that non-clinician panel management assistants (PMAs) can use to enhance panel-wide outcomes in smoking cessation and hypertension. We created the toolkit using: 1) literature review and consultation with outside experts, 2) key informant interviews with staff identified using snowball sampling, 3) pilot testing for feasibility and acceptability, and 4) further revision based on a survey of primary care providers and nurses. These steps resulted in progressively refined strategies for the PMAs to support the primary care team. Literature review and expert consultation resulted in an extensive list of potentially useful strategies. Key informant interviews and staff surveys identified several areas of need for assistance, including help to manage the most challenging patients, providing care outside of the visit, connecting patients with existing resources, and providing additional patient education. The strategies identified were then grouped into 5 areas - continuous connection to care, education and connection to clinical resources, targeted behavior change counseling, adherence support, and patients with special needs. Although panel management is a central aspect of patient-centered medical homes, providers and health care systems have little guidance or evidence as to how teams should accomplish this objective. We created a toolkit to help PMAs support the clinical care team for patients with hypertension or tobacco use. This toolkit development process could readily be adapted to other behaviors or conditions. Clinical

  5. A toolkit to assess Medical Reserve Corps units' performance.

    Science.gov (United States)

    Savoia, Elena; Massin-Short, Sarah; Higdon, Melissa Ann; Tallon, Lindsay; Matechi, Emmanuel; Stoto, Michael A

    2010-10-01

    The Medical Reserve Corps (MRC) is a national network of community-based units created to promote the local identification, recruitment, training, and activation of volunteers to assist local health departments in public health activities. This study aimed to develop a toolkit for MRC coordinators to assess and monitor volunteer units' performance and identify barriers limiting volunteerism. In 2008 and 2009, MRC volunteers asked to participate in influenza clinics were surveyed in 7 different locations throughout the United States. Two survey instruments were used to assess the performance of the volunteers who were able to participate, the specific barriers that prevented some volunteers from participating, and the overall attitudes of those who participated and those who did not. Validity and reliability of the instruments were assessed through the use of factor analysis and Cronbach's alpha. Two survey instruments were developed: the Volunteer Self-Assessment Questionnaire and the Barriers to Volunteering Questionnaire. Data were collected from a total of 1059 subjects, 758 participated in the influenza clinics and 301 were unable to attend. Data from the 2 instruments were determined to be suitable for factor analysis. Factor solutions and inter-item correlations supported the hypothesized domain structure for both survey questionnaires. Results on volunteers' performance were consistent with observations of both local health departments' staff and external observers. The survey instruments developed for this study appear to be valid and reliable means to assess the performance and attitudes of MRC volunteers and barriers to their participation. This study found these instruments to have face and content validity and practicality. MRC coordinators can use these questionnaires to monitor their ability to engage volunteers in public health activities.
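
    For readers unfamiliar with the reliability statistic mentioned above, the short sketch below computes Cronbach's alpha on a synthetic respondents-by-items matrix; the data and item count are invented for illustration and are unrelated to the MRC survey instruments.

```python
# Worked sketch of the Cronbach's alpha reliability statistic mentioned in the
# abstract, computed on a synthetic respondents-by-items matrix.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2D array, rows = respondents, columns = survey items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                # a common underlying trait
items = latent + 0.5 * rng.normal(size=(200, 5))  # 5 noisy items measuring it
print(round(cronbach_alpha(items), 2))            # high (around 0.95) for these correlated items
```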

  6. VaST: A variability search toolkit

    Science.gov (United States)

    Sokolovsky, K. V.; Lebedev, A. A.

    2018-01-01

    Variability Search Toolkit (VaST) is a software package designed to find variable objects in a series of sky images. It can be run from a script or interactively using its graphical interface. VaST relies on source list matching as opposed to image subtraction. SExtractor is used to generate source lists and perform aperture or PSF-fitting photometry (with PSFEx). Variability indices that characterize scatter and smoothness of a lightcurve are computed for all objects. Candidate variables are identified as objects having high variability index values compared to other objects of similar brightness. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and handle complex image distortions. The software has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5 m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates. About 1800 variable stars have been discovered with VaST. It is used as a transient detection engine in the New Milky Way (NMW) nova patrol. The code is written in C and can be easily compiled on the majority of UNIX-like systems. VaST is free software available at http://scan.sai.msu.ru/vast/.
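
    The sketch below illustrates the candidate-selection idea described above, flagging objects whose lightcurve scatter is high compared with objects of similar brightness; it is a simplified conceptual example in Python, not VaST's C implementation, and the bin count and sigma threshold are arbitrary choices.

```python
# Conceptual sketch (VaST itself is written in C): flag candidate variables as
# objects whose lightcurve scatter is high compared with stars of similar
# mean magnitude, the selection idea described in the abstract.
import numpy as np

def candidate_variables(mean_mag, mag_std, n_bins=10, n_sigma=3.0):
    """Return a boolean mask of objects with unusually large scatter."""
    mean_mag, mag_std = np.asarray(mean_mag), np.asarray(mag_std)
    flags = np.zeros(mean_mag.size, dtype=bool)
    bins = np.linspace(mean_mag.min(), mean_mag.max(), n_bins + 1)
    which = np.clip(np.digitize(mean_mag, bins) - 1, 0, n_bins - 1)
    for b in range(n_bins):
        sel = which == b
        if sel.sum() < 5:
            continue
        typical = np.median(mag_std[sel])
        spread = 1.4826 * np.median(np.abs(mag_std[sel] - typical))  # robust sigma
        flags[sel] = mag_std[sel] > typical + n_sigma * spread
    return flags

rng = np.random.default_rng(1)
mags = rng.uniform(12, 16, 1000)
scatter = 0.01 + 0.02 * (mags - 12)            # noise grows toward the faint end
scatter[::100] *= 5                            # inject a few "variables"
print(candidate_variables(mags, scatter).sum())
```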

  7. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  8. Security Assessment Simulation Toolkit (SAST) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer of SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, the Office of Naval Research (ONR) National Center for Advanced Secure Systems Research (NCASSR), and the Office of the Secretary of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. Over the past five DoD-wide Bulwark Defender exercises, the adoption of this new technology has created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  9. Versatile RNA tetra-U helix linking motif as a toolkit for nucleic acid nanotechnology.

    Science.gov (United States)

    Bui, My N; Brittany Johnson, M; Viard, Mathias; Satterwhite, Emily; Martins, Angelica N; Li, Zhihai; Marriott, Ian; Afonin, Kirill A; Khisamutdinov, Emil F

    2017-04-01

    RNA nanotechnology employs synthetically modified ribonucleic acid (RNA) to engineer highly stable nanostructures in one, two, and three dimensions for medical applications. Despite the tremendous advantages in RNA nanotechnology, unmodified RNA itself is fragile and prone to enzymatic degradation. In contrast to using traditionally modified RNA strands (e.g., 2'-fluorine, 2'-amine, 2'-methyl), we studied the effect of an RNA/DNA hybrid approach utilizing a computer-assisted RNA tetra-uracil (tetra-U) motif as a toolkit to address questions related to assembly efficiency, versatility, stability, and the production costs of hybrid RNA/DNA nanoparticles. The tetra-U RNA motif was implemented to construct four functional triangles using RNA, DNA, and RNA/DNA mixtures, resulting in fine-tunable enzymatic and thermodynamic stabilities, immunostimulatory activity, and RNAi capability. Moreover, the tetra-U toolkit has great potential for the fabrication of rectangular, pentagonal, and hexagonal NPs, demonstrating the power and simplicity of the RNA/DNA approach for the RNA nanotechnology and nanomedicine community. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Quantitative techniques have been successfully employed in the verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless...... such as rapidly constructing a compact formal model, computing the time point where the risk is maximum, or terminating the transient analysis after the fluctuations disappear and the system stabilizes. Our toolkit, LBTool, not only resolves the above-mentioned issues, but also demonstrates how to construct...... models in an analytical way and how to speed up the analysis by eliminating redundant computations. The toolkit can be generalized to other key update methods by replacing the analytical model construction....
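
    The sketch below shows a generic transient analysis of a small continuous-time Markov chain via the matrix exponential of its generator, the type of computation referred to above; the three-state "key freshness" model and its rates are invented for illustration and are not LBTool's model.

```python
# Generic transient-analysis sketch (the actual LBTool model is not reproduced
# here): state probabilities of a small continuous-time Markov chain at time t,
# obtained from the matrix exponential of its generator.
import numpy as np
from scipy.linalg import expm

# Toy 3-state generator matrix Q (rows sum to zero): illustrative key states
# "fresh" -> "aging" -> "compromised", with a key-update transition back.
Q = np.array([
    [-0.5,  0.5,  0.0],
    [ 0.0, -0.3,  0.3],
    [ 1.0,  0.0, -1.0],   # key update restores the "fresh" state
])

p0 = np.array([1.0, 0.0, 0.0])        # start with a fresh key
for t in (1.0, 5.0, 20.0):
    pt = p0 @ expm(Q * t)             # transient distribution at time t
    print(t, np.round(pt, 3))
```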

  11. The PAX Toolkit and its Applications at Tevatron and LHC

    CERN Document Server

    Kappler, S; Felzmann, U; Hirschbuehl, D; Kirsch, M; Quast, G; Schmidt, A; Weng, J; Kappler, Steffen; Erdmann, Martin; Felzmann, Ulrich; Hirschbuehl, Dominic; Kirsch, Matthias; Quast, Guenter; Schmidt, Alexander; Weng, Joanna

    2006-01-01

    At the CHEP03 conference we launched the Physics Analysis eXpert (PAX), a C++ toolkit released for use in advanced high energy physics (HEP) analyses. This toolkit makes it possible to define a level of abstraction beyond detector reconstruction by providing a general, persistent container model for HEP events. Physics objects such as particles, vertices and collisions can easily be stored, accessed and manipulated. Bookkeeping of relations between these objects (like decay trees, vertex and collision separation, etc.), including deep copies, is fully provided by the relation management. The event container and associated objects represent a uniform interface for algorithms and facilitate the parallel development and evaluation of different physics interpretations of individual events. So-called analysis factories, which actively identify and distinguish different physics processes and study systematic uncertainties, can easily be realized with the PAX toolkit. PAX is officially released to experiments at Tevatron and LHC...
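
    The following Python sketch illustrates the container-model idea described above, holding particles with explicit decay-tree relations; it is a conceptual analogue only, and the class and method names are not those of the C++ PAX toolkit.

```python
# Illustrative sketch (PAX itself is a C++ toolkit): a tiny event container
# holding particles with explicit decay-tree relations, in the spirit of the
# container model described above.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Particle:
    name: str
    px: float
    py: float
    pz: float
    e: float
    mother: Optional["Particle"] = None
    daughters: List["Particle"] = field(default_factory=list)

@dataclass
class Event:
    particles: List[Particle] = field(default_factory=list)

    def add(self, particle, mother=None):
        """Store a particle and record its decay-tree relation, if any."""
        particle.mother = mother
        if mother is not None:
            mother.daughters.append(particle)
        self.particles.append(particle)
        return particle

    def decay_tree(self, particle, indent=0):
        """Print the decay tree rooted at `particle`."""
        print("  " * indent + particle.name)
        for d in particle.daughters:
            self.decay_tree(d, indent + 1)

event = Event()
z = event.add(Particle("Z0", 0, 0, 10, 95))
event.add(Particle("mu+", 5, 0, 5, 45), mother=z)
event.add(Particle("mu-", -5, 0, 5, 45), mother=z)
event.decay_tree(z)
```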

  12. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  13. Deconstructing the toolkit: creativity and risk in the NHS workforce.

    Science.gov (United States)

    Allen, Von; Brodzinski, Emma

    2009-12-01

    Deconstructing the Toolkit explores the current desire for toolkits that promise failsafe structures to facilitate creative success. The paper examines this cultural phenomenon within the context of the risk-averse workplace, with particular focus on the NHS. The writers draw on Derrida and deconstructionism to reflect upon the principles of creativity and the possibilities for being creative within the workplace. Through reference to The Extra Mile project facilitated by Open Art, the paper examines the importance of engaging with an aesthetic of creativity and embracing a more holistic approach to the problems and potential of the creative process.

  14. The Early Astronomy Toolkit was Universal

    Science.gov (United States)

    Schaefer, Bradley E.

    2018-01-01

    From historical, anthropological, and archaeological records, we can reconstruct the general properties of the earliest astronomy for many cultures worldwide, and they all share many similar characteristics. The 'Early Astronomy Toolkit' (EAT) has the Earth being flat, and the heavens as a dome overhead populated by gods/heroes that rule Nature. The skies provided omens in a wide variety of manners, with eclipses, comets, and meteors always being evil and bad. Constellations were ubiquitous pictures of gods, heroes, animals, and everyday items; all for story telling. The calendars were all luni-solar, with no year counts and months only named by seasonal cues (including solstice observations and heliacal risings) with vague intercalation. Time of day came only from the sun's altitude/azimuth, while time at night came from star risings. Graves are oriented astronomically, and each culture has deep traditions of quartering the horizon. The most complicated astronomical tools were just a few sticks and stones. This is a higher level description and summary of the astronomy of all ancient cultures.This basic EAT was universal up until the Greeks, Mesopotamians, and Chinese broke out around 500 BC and afterwards. Outside the Eurasian milieu, with few exceptions (for example, planetary position measures in Mexico), this EAT represents astronomy for the rest of the world up until around 1600 AD. The EAT is present in these many cultures with virtually no variations or extensions. This universality must arise either from multiple independent inventions or by migration/diffusion. The probability of any culture independently inventing all 19 items in the EAT is low, but any such calculation has all the usual problems. Still, we realize that it is virtually impossible for many cultures to independently develop all 19 items in the EAT, so there must be a substantial fraction of migration of the early astronomical concepts. Further, the utter lack, as far as I know, of any

  15. A Toolkit of Systems Gaming Techniques

    Science.gov (United States)

    Finnigan, David; McCaughey, Jamie W.

    2017-04-01

    Decision-makers facing natural hazard crises need a broad set of cognitive tools to help them grapple with complexity. Systems gaming can act as a kind of 'flight simulator for decision making' enabling us to step through real life complex scenarios of the kind that beset us in natural disaster situations. Australian science-theatre ensemble Boho Interactive is collaborating with the Earth Observatory Singapore to develop an in-person systems game modelling an unfolding natural hazard crisis (volcanic unrest or an approaching typhoon) impacting an Asian city. Through a combination of interactive mechanisms drawn from boardgaming and participatory theatre, players will make decisions and assign resources in response to the unfolding crisis. In this performance, David Finnigan from Boho will illustrate some of the participatory techniques that Boho use to illustrate key concepts from complex systems science. These activities are part of a toolkit which can be adapted to fit a range of different contexts and scenarios. In this session, David will present short activities that demonstrate a range of systems principles including common-pool resource challenges (the Tragedy of the Commons), interconnectivity, unintended consequences, tipping points and phase transitions, and resilience. The interactive mechanisms for these games are all deliberately lo-fi rather than digital, for three reasons. First, the experience of a tactile, hands-on game is more immediate and engaging. It brings the focus of the participants into the room and facilitates engagement with the concepts and with each other, rather than with individual devices. Second, the mechanics of the game are laid bare. This is a valuable way to illustrate that complex systems are all around us, and are not merely the domain of hi-tech systems. Finally, these games can be used in a wide variety of contexts by removing computer hardware requirements and instead using materials and resources that are easily found in

  16. Scoping review of toolkits as a knowledge translation strategy in health.

    Science.gov (United States)

    Barac, Raluca; Stein, Sherry; Bruce, Beth; Barwick, Melanie

    2014-12-24

    Significant resources are invested in the production of research knowledge with the ultimate objective of integrating research evidence into practice. Toolkits are becoming increasingly popular as a knowledge translation (KT) strategy for disseminating health information, to build awareness, inform, and change public and healthcare provider behavior. Toolkits communicate messages aimed at improving health and changing practice to diverse audiences, including healthcare practitioners, patients, community and health organizations, and policy makers. This scoping review explores the use of toolkits in health and healthcare. Using Arksey and O'Malley's scoping review framework, health-based toolkits were identified through a search of electronic databases and grey literature for relevant articles and toolkits published between 2004 and 2011. Two reviewers independently extracted data on toolkit topic, format, target audience, content, evidence underlying toolkit content, and evaluation of the toolkit as a KT strategy. Among the 253 sources identified, 139 met initial inclusion criteria and 83 toolkits were included in the final sample. Fewer than half of the sources fully described the toolkit content and about 70% made some mention of the evidence underlying the content. Of 83 toolkits, only 31 (37%) had been evaluated at any level (27 toolkits were evaluated overall relative to their purpose or KT goal, and 4 toolkits evaluated the effectiveness of certain elements contained within them). Toolkits used to disseminate health knowledge or support practice change often do not specify the evidence base from which they draw, and their effectiveness as a knowledge translation strategy is rarely assessed. To truly inform health and healthcare, toolkits should include comprehensive descriptions of their content, be explicit regarding content that is evidence-based, and include an evaluation of their effectiveness as a KT strategy, addressing both clinical and

  17. A toolkit to support postgraduate research supervisors in supervisory ...

    African Journals Online (AJOL)

    Aims. The aim of this integrative literature review was first to review current studies on the supervision process and the roles and responsibilities of the supervisor. The second aim was to use the findings to describe a 'supervisory toolkit' to enhance effective postgraduate research supervision. Background. Although ...

  18. Practical computational toolkits for dendrimers and dendrons structure design

    Science.gov (United States)

    Martinho, Nuno; Silva, Liana C.; Florindo, Helena F.; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty in using in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease of use in prototyping dendrimer structures, and the second toolkit was especially relevant for dendrimers of high complexity and size.
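
    A minimal sketch of the general approach described above, assuming the RDKit Python library; the diamine core, the monomer SMILES and the amide-coupling reaction SMARTS are illustrative assumptions, not taken from the toolkits themselves:

      # Illustrative only: grow one 'generation' by coupling an acid-bearing monomer onto an amine core.
      from rdkit import Chem
      from rdkit.Chem import AllChem

      core = Chem.MolFromSmiles("NCCN")          # hypothetical diamine core
      monomer = Chem.MolFromSmiles("OC(=O)CCN")  # hypothetical amino-acid-like arm

      # Generic amide coupling written as reaction SMARTS: acid + amine -> amide.
      rxn = AllChem.ReactionFromSmarts("[C:1](=[O:2])[OH].[N!H0:3]>>[C:1](=[O:2])[N:3]")

      grown = set()
      for products in rxn.RunReactants((monomer, core)):
          for mol in products:
              Chem.SanitizeMol(mol)
              grown.add(Chem.MolToSmiles(mol))   # canonical SMILES, e.g. for storage in a database

      print(grown)

    The published toolkits layer generation bookkeeping, branching topology and 3D structure generation on top of this kind of elementary coupling step.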

  19. Creating a Toolkit to Reduce Disparities in Patient Engagement.

    Science.gov (United States)

    Keddem, Shimrit; Agha, Aneeza Z; Long, Judith A; Werner, Rachel M; Shea, Judy A

    2017-09-01

    Patient engagement has become a major focus of health care improvement efforts nationally. Although evidence suggests patient engagement can be beneficial to patients, it has not been consistently defined, operationalized, or translated into practice. Our objective was to develop a toolkit to help providers increase patient engagement and reduce disparities in patient engagement. We used qualitative interviews and observations with staff at primary care sites nationally to identify patient engagement practices and resources used to engage patients. We then used a modified Delphi process that included a series of conference calls and surveys, in which stakeholders reduced lists of engagement practices based on perceived feasibility and importance, to develop a toolkit for patient engagement. Sites were selected for interviews and site visits based on the concentration of minority patients served and performance on a measure of patient engagement, with the goal of highlighting practices at sites that successfully serve minority patients. We created a toolkit consisting of patient engagement practices and resources. No identified practice or resource specifically targeted patient engagement of minorities or addressed disparities. However, high-performing, high-minority-serving sites tended to describe more staff training opportunities and staff feedback mechanisms. In addition, low-performing and high-minority-serving sites more often reported barriers to implementation of patient engagement practices. Stakeholders agreed on feasible and important engagement practices. Implementation of this toolkit will be tracked to better understand patient engagement and its effect on patient-centered care and related disparities in care.

  20. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    Science.gov (United States)

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...
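
    The Toolkit itself is a C-callable function library, so the snippet below is only a rough Python-flavoured illustration of the same workflow (read a network description file, modify a component property, run a hydraulic simulation) using the separate open-source wntr package, which wraps the EPANET engine; the file name and pipe ID are made up:

      # Illustrative only: wntr wraps EPANET; "network.inp" and pipe "12" are hypothetical.
      import wntr

      wn = wntr.network.WaterNetworkModel("network.inp")  # read the pipe network description file
      pipe = wn.get_link("12")                            # select a component...
      pipe.roughness = 110                                # ...and modify one of its properties

      sim = wntr.sim.EpanetSimulator(wn)                  # run the hydraulic/quality simulation
      results = sim.run_sim()
      print(results.node["pressure"].head())              # pressure per node and timestep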

  1. A Toolkit to Implement Graduate Attributes in Geography Curricula

    Science.gov (United States)

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  2. Local Safety Toolkit: Enabling safe communities of opportunity

    CSIR Research Space (South Africa)

    Holtmann, B

    2010-08-31

    Full Text Available five years and draws from widely inclusive consultation and literature review. The toolkit aims to contribute to preventive approaches to address unsafety. Unsafety is a whole-government and whole-society problem. It is only through a multi...

  3. Designing a Portable and Low Cost Home Energy Management Toolkit

    NARCIS (Netherlands)

    Keyson, D.V.; Al Mahmud, A.; De Hoogh, M.; Luxen, R.

    2013-01-01

    In this paper we describe the design of a home energy and comfort management system. The system has three components: a smart plug with a wireless module, a residential gateway and a mobile app. The combined system is called a home energy management and comfort toolkit. The design is inspired

  4. Practitioner Toolkit: Working with Adult English Language Learners.

    Science.gov (United States)

    Lieshoff, Sylvia Cobos; Aguilar, Noemi; McShane, Susan; Burt, Miriam; Peyton, Joy Kreeft; Terrill, Lynda; Van Duzer, Carol

    2004-01-01

    This document is designed to give support to adult education and family literacy instructors who are new to serving adult English language learners and their families in rural, urban, and faith- and community-based programs. The Toolkit is designed to have a positive impact on the teaching and learning in these programs. The results of two…

  5. Using AASL's "Health and Wellness" and "Crisis Toolkits"

    Science.gov (United States)

    Logan, Debra Kay

    2009-01-01

    Whether a school library program is the picture of good health in a state that mandates a professionally staffed library media center in every building or is suffering in a low-wealth district that is facing drastic cuts, the recently launched toolkits by the American Association of School Librarians (AASL) are stocked with useful strategies and…

  6. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control...

  7. Educating Globally Competent Citizens: A Toolkit. Second Edition

    Science.gov (United States)

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  8. Evaluating Teaching Development Activities in Higher Education: A Toolkit

    Science.gov (United States)

    Kneale, Pauline; Winter, Jennie; Turner, Rebecca; Spowart, Lucy; Hughes, Jane; McKenna, Colleen; Muneer, Reema

    2016-01-01

    This toolkit is developed as a resource for providers of teaching-related continuing professional development (CPD) in higher education (HE). It focuses on capturing the longer-term value and impact of CPD for teachers and learners, and moving away from immediate satisfaction measures. It is informed by the literature on evaluating higher…

  9. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, Roelf J.

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  10. Toolkit for healthcare facility design evaluation - some case studies

    CSIR Research Space (South Africa)

    De Jager, Peta

    2007-10-01

    Full Text Available "? This is notoriously difficult to evaluate but, this paper argues, there would be much to be gained from a systematic, reliable and replicable framework for doing so. Internationally, some design evaluation toolkits specifically for healthcare facilities have been...

  12. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Directory of Open Access Journals (Sweden)

    Rescigno R.

    2014-03-01

    Full Text Available Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  13. Automated Generation of Web Services for Visualization Toolkits

    Science.gov (United States)

    Jensen, P. A.; Yuen, D. A.; Erlebacher, G.; Bollig, E. F.; Kigelman, D. G.; Shukh, E. A.

    2005-12-01

    The recent explosion in the size and complexity of geophysical data and an increasing trend for collaboration across large geographical areas demand the use of remote, full featured visualization toolkits. As the scientific community shifts toward grid computing to handle these increased demands, new web services are needed to assemble powerful distributed applications. Recent research has established the possibility of converting toolkits such as VTK [1] and Matlab [2] into remote visualization services. We are investigating an automated system to allow these toolkits to export their functions as web services under the standardized protocols SOAP and WSDL using pre-existing software (gSOAP [3]) and a custom compiler for Tcl-based scripts. The compiler uses a flexible parser and type inferring mechanism to convert the Tcl into a C++ program that allows the desired Tcl procedures to be exported as SOAP-accessible functions and the VTK rendering window to be captured offscreen and encapsulated for forwarding through a web service. Classes for a client-side Java applet to access the rendering window remotely are also generated. We will use this system to demonstrate the streamlined generation of a standards-compliant web service (suitable for grid deployment) from a Tcl script for VTK. References: [1] The Visualization Toolkit, http://www.vtk.org [2] Matlab, http://www.mathworks.com [3] gSOAP, http://www.cs.fsu.edu/~engelen/soap.html

  14. Tactical Level Commander and Staff Toolkit

    Science.gov (United States)

    2010-01-01

    Natural Resources ESF Coordinator: Department of Agriculture • Nutrition assistance • Animal and plant disease and pest response • Food safety...media • Insisting on proper sleep, nutrition, and exercise among responders • Not making promises you cannot keep DSCA Handbook Tactical Level...or wild animals, including venomous snakes and rats. Personnel can become infested with lice and fleas. The danger from diseases such as rabies

  15. Funerary economy in the second Iron Age of the Balearic Islands: the diacritical use of cattle sacrifice in the sanctuary and necropolises of the Son Real area (Mallorca)

    Directory of Open Access Journals (Sweden)

    Jordi Hernández-Gasch

    2011-10-01

    Full Text Available The appearance of an exceptional object made of bovine bone, known as a 'tap', at ritual sites of the second Iron Age made it possible, some years ago, to put forward a hypothesis termed 'funerary economy'. Data from settlements suggest that the composition of the livestock herd changed, with the number of cattle decreasing in favour of sheep and goats. The slaughter profile also changed, orienting the use of these domestic animals towards the production of secondary products and labour power. Recent finds from the necropolis of Son Real and the sanctuary of Punta des Patró in Mallorca point to a ritual of commensality, in which the wealthier sectors of society presumably consumed cattle meat in funerary rituals or used it as an offering, in contrast to the less wealthy groups, who would only have had access to pig or sheep/goat meat.

  16. Phase 1 Development Report for the SESSA Toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Knowlton, Robert G.; Melton, Brad J; Anderson, Robert J.

    2014-09-01

    operation of the SESSA toolkit in order to give the user enough information to start using the toolkit. SESSA is currently a prototype system and this documentation covers the initial release of the toolkit. Funding for SESSA was provided by the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). ACKNOWLEDGEMENTS The authors wish to acknowledge the funding support for the development of the Site Exploitation System for Situational Awareness (SESSA) toolkit from the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). Special thanks to Mr. Garold Warner of DFSC, who served as the Project Manager. Individuals that worked on the design, functional attributes, algorithm development, system architecture, and software programming include: Robert Knowlton, Brad Melton, Robert Anderson, and Wendy Amai.

  17. ParCAT: A Parallel Climate Analysis Toolkit

    Science.gov (United States)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. Par
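
    As a rough serial illustration of the kinds of reductions ParCAT parallelizes (spatio-temporal means, run differences, histograms), the sketch below uses numpy and xarray on synthetic stand-in data; the variable name "tas" and the array shapes are our own assumptions, and the real toolkit operates on large NetCDF files in parallel:

      import numpy as np
      import xarray as xr

      # Synthetic stand-ins for two model runs (the real toolkit reads NetCDF model output).
      rng = np.random.default_rng(0)
      dims = ("time", "lat", "lon")
      run_a = xr.DataArray(rng.normal(288, 5, (12, 8, 16)), dims=dims, name="tas")
      run_b = xr.DataArray(rng.normal(288, 5, (12, 8, 16)), dims=dims, name="tas")

      print(float(run_a.mean(dim=dims)))                      # spatio-temporal mean
      diff = run_a - run_b                                    # difference between two runs
      counts, edges = np.histogram(run_a.values.ravel(), 20)  # histogram of the values
      diff.to_netcdf("tas_diff.nc")                           # reduced output in a common format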

  18. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Directory of Open Access Journals (Sweden)

    Juan Mateu

    2015-08-01

    Full Text Available In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  19. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  20. Updates of IRC Imaging Toolkit and Data Archive

    Science.gov (United States)

    Egusa, Fumi; AKARI/IRC Team

    2017-03-01

    We have been working on data processing and calibration of AKARI/IRC images from pointed observations. As of September 2014, a data package for each pointing contains only raw data and quick-look data, so users have to process them with the toolkit themselves. We plan to change this situation and provide science-ready data sets that are easy to use for non-AKARI experts. For Phase 1&2, we have updated the dark and flat calibrations, and also the toolkit itself, to produce images that are more reliable and easier to use. A new data package includes fully calibrated images with WCS information. We released it for about 4000 pointings at the end of March 2015.

  1. Knowledge Server Toolkit: Perl-based Automation Tools

    OpenAIRE

    McLean, David

    1999-01-01

    A Perl Inference Engine (PIE), A Plan Executor (APE), and a Centroid Classifier (CCL) are described within the infrastructure provided by the Knowledge Server Toolkit (KST) environment. All these tools are implemented as Perl modules so that users' applications can easily use their capabilities. This technology was developed at NASA Goddard Space Flight Center under a task that was looking into languages and tools that are useful for automating satellite ground control systems. However, becau...

  2. A framework for a teaching toolkit in entrepreneurship education.

    Science.gov (United States)

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.

  3. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    OpenAIRE

    Juan Mateu; María José Lasala; Xavier Alamán

    2015-01-01

    © 2015 by MDPI (http://www.mdpi.org). Reproduction is permitted for noncommercial purposes. In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, ab...

  4. Object Toolkit Version 4.3 User’s Manual

    Science.gov (United States)

    2016-12-31

    spacecraft. A single instrument can be an Object Toolkit object as can a combined solar array wing and ion thruster subsystem. Figure 1. Simple Spacecraft...is highly recommended that the user save often and use different filenames when doing so. This allows the user to back up to a simpler object and...transport over a wide expanse of such material. To ground the material, the user can specify grounding by a strip at an element edge or by a circular

  5. Business plans--tips from the toolkit 6.

    Science.gov (United States)

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  6. A framework for a teaching toolkit in entrepreneurship education

    Science.gov (United States)

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a ‘learning-through-real-multimedia-entrepreneurial-narratives’ pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society. PMID:28680372

  7. Mobile-Assisted Vocabulary Learning: A Review Study

    Science.gov (United States)

    Afzali, Parichehr; Shabani, Somayeh; Basir, Zohreh; Ramazani, Mohammad

    2017-01-01

    Mobile phones are becoming more acceptable toolkits to learn languages. One aspect of English language which has been subject to investigation in mobile-assisted language learning (MALL) is vocabulary. This study reviewed some of the studies conducted in various contexts on the effect of MALL on vocabulary learning. We investigated some of the…

  8. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Directory of Open Access Journals (Sweden)

    Robert Meyer

    2016-08-01

    Full Text Available pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
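
    A minimal sketch along the lines of the pypet documentation, assuming a toy model with parameters x and y (exact call names can differ slightly between pypet versions, e.g. env.run vs. env.f_run):

      from pypet import Environment, cartesian_product

      def simulate(traj):
          z = traj.x ** 2 + traj.y ** 2                  # the per-parameter-point 'simulation'
          traj.f_add_result('z', z, comment='toy result')

      env = Environment(trajectory='example', filename='example.hdf5')
      traj = env.trajectory
      traj.f_add_parameter('x', 1.0)
      traj.f_add_parameter('y', 1.0)
      # Explore the Cartesian product of parameter values rather than a single point.
      traj.f_explore(cartesian_product({'x': [1.0, 2.0, 3.0], 'y': [4.0, 5.0]}))
      env.run(simulate)                                  # parameters and results end up in one HDF5 file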

  9. XPIWIT--an XML pipeline wrapper for the Insight Toolkit.

    Science.gov (United States)

    Bartschat, Andreas; Hübner, Eduard; Reischl, Markus; Mikut, Ralf; Stegmaier, Johannes

    2016-01-15

    The Insight Toolkit offers plenty of features for multidimensional image analysis. Current implementations, however, often suffer either from a lack of flexibility due to hard-coded C++ pipelines for a certain task or from slow execution times, e.g. caused by inefficient implementations or multiple read/write operations for separate filter execution. We present an XML-based wrapper application for the Insight Toolkit that combines the performance of a pure C++ implementation with an easy-to-use graphical setup of dynamic image analysis pipelines. Created XML pipelines can be interpreted and executed by XPIWIT in console mode either locally or on large clusters. We successfully applied the software tool for the automated analysis of terabyte-scale, time-resolved 3D image data of zebrafish embryos. XPIWIT is implemented in C++ using the Insight Toolkit and the Qt SDK. It has been successfully compiled and tested under Windows and Unix-based systems. Software and documentation are distributed under Apache 2.0 license and are publicly available for download at https://bitbucket.org/jstegmaier/xpiwit/downloads/. Supplementary data are available at Bioinformatics online.

  10. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km spatial and 5-minute temporal resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, as then corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compares to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
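
    The released validation code is written in R; purely as an illustration of the error metrics listed above, a Python version might look like the sketch below (the percent-error convention used here is our assumption):

      import numpy as np

      def validation_metrics(model, obs):
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          err = model - obs
          bias = err.mean()
          rmse = np.sqrt((err ** 2).mean())
          # Centered RMSE: remove each series' mean before comparing.
          crmse = np.sqrt((((model - model.mean()) - (obs - obs.mean())) ** 2).mean())
          mae = np.abs(err).mean()
          pct = 100.0 * bias / obs.mean()   # assumed convention for 'percent error'
          return {"bias": bias, "rmse": rmse, "crmse": crmse, "mae": mae, "percent_error": pct}

      print(validation_metrics([6.1, 7.4, 5.0], [5.8, 7.9, 4.6]))   # toy wind speeds in m/s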

  11. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  12. Using the PhenX Toolkit to Add Standard Measures to a Study.

    Science.gov (United States)

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Marazita, Mary L; McCarty, Catherine A; Ramos, Erin M; Hamilton, Carol M

    2015-07-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The goal is to promote the use of standard measures, enhance data interoperability, and help investigators identify opportunities for collaborative and translational research. The Toolkit contains 395 measures drawn from 22 research domains (fields of research), along with additional collections of measures for Substance Abuse and Addiction (SAA) research, Mental Health Research (MHR), and Tobacco Regulatory Research (TRR). Additional measures for TRR that are expected to be released in 2015 include Obesity, Eating Disorders, and Sickle Cell Disease. Measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse measures in the Toolkit or can search the Toolkit using the Smart Query Tool or a full text search. PhenX Toolkit users select measures of interest to add to their Toolkit. Registered Toolkit users can save their Toolkit and return to it later to revise or complete. They then have options to download a customized Data Collection Worksheet that specifies the data to be collected, and a Data Dictionary that describes each variable included in the Data Collection Worksheet. The Toolkit also has a Register Your Study feature that facilitates cross-study collaboration by allowing users to find other investigators using the same PhenX measures.

  13. A GIS Software Toolkit for Monitoring Areal Snow Cover and Producing Daily Hydrologic Forecasts using NASA Satellite Imagery Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts. This toolkit will be...

  14. When paradigms collide at the road rail interface: evaluation of a sociotechnical systems theory design toolkit for cognitive work analysis.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G

    2016-09-01

    The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes to incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings. The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.

  15. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  16. Pydpiper: A Flexible Toolkit for Constructing Novel Registration Pipelines

    Directory of Open Access Journals (Sweden)

    Miriam eFriedel

    2014-07-01

    Full Text Available Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available pipeline framework that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  17. TECA: A Parallel Toolkit for Extreme Climate Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.

  18. YAP. Yet another partial-wave-analysis toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Di Giglio, Paolo; Greenwald, Daniel; Rauch, Johannes [TUM, Munich (Germany)

    2016-07-01

    We present a new C++ library: YAP, the Yet Another Partial-wave-analysis toolkit. The library calculates amplitudes for multibody particle decays in several model frameworks. It is intended for the analysis of spin-0 heavy mesons, but is programmed with the flexibility to handle other decays. The library implements isobar decompositions, K-matrix formalism, and model-independent approaches for mass-dependent amplitudes; and both Wigner rotation and Zemach (for 3 particles) formalism for spin amplitudes. We introduce the software and give example use cases.

  19. A simulation toolkit for electroluminescence assessment in rare event experiments

    CERN Document Server

    Oliveira, C A B; Veenhof, R; Biagi, S; Monteiro, C M B; Santos, J M F dos; Ferreira, A L; Veloso, J F C A

    2011-01-01

    A good understanding of electroluminescence is a prerequisite when optimising double-phase noble gas detectors for Dark Matter searches and high-pressure xenon TPCs for neutrinoless double beta decay detection. A simulation toolkit for calculating the emission of light through electron impact on neon, argon, krypton and xenon has been developed using the Magboltz and Garfield programs. Calculated excitation and electroluminescence efficiencies, electroluminescence yield and associated statistical fluctuations are presented as a function of electric field. Good agreement with experiment and with Monte Carlo simulations has been obtained.

  20. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of System Verilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  1. A toolkit for encouraging activities in care homes.

    Science.gov (United States)

    Bishop, Karin

    Activity is vital for the physical and psychological wellbeing of care home residents. It should be an integral part of their daily routine but can be viewed as an additional burden for busy staff. Activity is defined as everything we "do", and even older people who are frail can still be active. Nurses need to consider how activity can be incorporated into residents' daily lives; the Living Well Through Activity in Care Homes toolkit, produced by the College of Occupational Therapists, aims to help staff provide meaningful activities for residents.

  2. Tips from the toolkit: 2--assessing organisational strengths.

    Science.gov (United States)

    Steer, Neville

    2010-03-01

    'SWOT' is a familiar term used in the development of business strategy. It is based on the identification of strengths, weaknesses, opportunities and threats as part of a strategic analysis approach. While there are a range of more sophisticated models for analysing and developing business strategy, it is a useful model for general practice as it is less time consuming than other approaches. The following article discusses some ways to apply this framework to assess organisational strengths (and weaknesses). It is based on The Royal Australian College of General Practitioners' "General practice management toolkit".

  3. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Caroline Draxl: NREL

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as being time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.

  4. A simulation toolkit for electroluminescence assessment in rare event experiments

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, C.A.B., E-mail: carlos.oliveira@ua.pt [I3N - Physics Department, University of Aveiro, 3810-193 Aveiro (Portugal); Schindler, H. [CERN, Geneva (Switzerland); Veenhof, R.J. [CERN RD-51 Collaboration (Switzerland); University of Wisconsin-Madison (United States); Biagi, S. [Physics Department, University of Liverpool, Liverpool (United Kingdom); Monteiro, C.M.B.; Santos, J.M.F. dos [GIAN, Physics Department, University of Coimbra, 3004-516 Coimbra (Portugal); Ferreira, A.L.; Veloso, J.F.C.A. [I3N - Physics Department, University of Aveiro, 3810-193 Aveiro (Portugal)

    2011-09-14

    A good understanding of electroluminescence is a prerequisite when optimising double-phase noble gas detectors for Dark Matter searches and high-pressure xenon TPCs for neutrinoless double beta decay detection. A simulation toolkit for calculating the emission of light through electron impact on neon, argon, krypton and xenon has been developed using the Magboltz and Garfield programs. Calculated excitation and electroluminescence efficiencies, electroluminescence yield and associated statistical fluctuations are presented as a function of electric field. Good agreement with experiment and with Monte Carlo simulations has been obtained.

  5. NMTPY: A Flexible Toolkit for Advanced Neural Machine Translation Systems

    Directory of Open Access Journals (Sweden)

    Caglayan Ozan

    2017-10-01

    Full Text Available In this paper, we present nmtpy, a flexible Python toolkit based on Theano for training Neural Machine Translation and other neural sequence-to-sequence architectures. nmtpy decouples the specification of a network from the training and inference utilities to simplify the addition of a new architecture and reduce the amount of boilerplate code to be written. nmtpy has been used for LIUM’s top-ranked submissions to WMT Multimodal Machine Translation and News Translation tasks in 2016 and 2017.

  6. A flexible open-source toolkit for lava flow simulations

    Science.gov (United States)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. The toolkit is not only open-source and written in Python, which allows users to adapt the code to their needs; it also allows users to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001) which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows the inclusion of a corrective factor in order for the lava to overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The
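
    As a self-contained illustration of the probabilistic steepest-slope idea that the VORIS-type component builds on (this is not the toolkit's code; the corrective height factor hc and the toy DEM are our own assumptions):

      import numpy as np

      def lava_path(dem, start, hc=0.0, max_steps=1000, rng=None):
          """Random walk weighted by downhill drop; hc lets the flow overtop small obstacles."""
          rng = rng or np.random.default_rng()
          r, c = start
          path = [(r, c)]
          for _ in range(max_steps):
              cells, drops = [], []
              for dr in (-1, 0, 1):
                  for dc in (-1, 0, 1):
                      rr, cc = r + dr, c + dc
                      if (dr, dc) != (0, 0) and 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                          drop = dem[r, c] + hc - dem[rr, cc]
                          if drop > 0:
                              cells.append((rr, cc))
                              drops.append(drop)
              if not cells:                       # local pit: the simulated flow stops
                  break
              p = np.array(drops) / sum(drops)    # steeper neighbours are more likely
              r, c = cells[rng.choice(len(cells), p=p)]
              path.append((r, c))
          return path

      rng = np.random.default_rng(1)
      dem = np.add.outer(np.linspace(50, 0, 60), np.zeros(60)) + rng.normal(0, 0.3, (60, 60))
      print(lava_path(dem, start=(0, 30), hc=0.5)[:10])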

  7. VnCoreNLP: A Vietnamese Natural Language Processing Toolkit

    OpenAIRE

    Vu, Thanh; Nguyen, Dat Quoc; Nguyen, Dai Quoc; Dras, Mark; Johnson, Mark

    2018-01-01

    We present an easy-to-use and fast toolkit, namely VnCoreNLP---a Java NLP annotation pipeline for Vietnamese. Our VnCoreNLP supports key natural language processing (NLP) tasks including word segmentation, part-of-speech (POS) tagging, named entity recognition (NER) and dependency parsing, and obtains state-of-the-art (SOTA) results for these tasks. We release VnCoreNLP to provide rich linguistic annotations to facilitate research work on Vietnamese NLP. Our VnCoreNLP is open-source under GPL...

  8. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    Science.gov (United States)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  9. Spread the word, not the germs: a toolkit for faith communities.

    Science.gov (United States)

    Reilly, Janet Resop; Hovarter, Rebecca; Mrochek, Tracy; Mittelstadt-Lock, Kay; Schmitz, Sue; Nett, Sue; Turner, Mary Jo; Moore, Ellen; Howden, Mary; Laabs, Cheryl; Behm, Linda

    2011-01-01

    A volunteer workgroup of public health personnel and parish nurses in Wisconsin collaborated to develop the Infection Control and Emergency Preparedness Toolkit for the Faith Community to help prepare congregations for health emergencies and prevent the spread of disease. This article reports a pilot study of the toolkit with 30 parishes/churches, focusing on the infection control portion of the materials.

  10. A software toolkit for web-based virtual environments based on a shared database

    NARCIS (Netherlands)

    van Schooten, B.W.; Nijholt, Antinus; van Dijk, Elisabeth M.A.G.; Isaias, P.T.; Karmakar, N.; Rodrigues, L.; Barbosa, P.

    2004-01-01

    We propose a software toolkit for developing complex web-based user interfaces, incorporating such things as multi-user facilities, virtual environments (VEs), and interface agents. The toolkit is based on a novel software architecture that combines ideas from multi-agent platforms and user

  11. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    Science.gov (United States)

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  12. Growing and Sustaining Parent Engagement: A Toolkit for Parents and Community Partners

    Science.gov (United States)

    Center for the Study of Social Policy, 2010

    2010-01-01

    The Toolkit is a quick and easy guide to help support and sustain parent engagement. It provides how to's for implementing three powerful strategies communities can use to maintain and grow parent engagement work that is already underway: Creating a Parent Engagement 1) Roadmap, 2) Checklist and 3) Support Network. This toolkit includes…

  13. Task Managers' ICT Toolkit : Good Practice for Planning, Delivering, and Sustaining ICT Products

    OpenAIRE

    World Bank

    2012-01-01

    The toolkit is made up of two parts. In part 1, a route map is aimed at raising awareness. It attempts to give direction by helping to categorize information and communication technologies (ICT) components in terms of complexity and walks through the different stages of preparation, implementation, and supervision. In part 2, the toolkit goes into greater detail. It is intended that this s...

  14. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Landfill Standards

    Science.gov (United States)

    2013-03-07

    ... environmental technologies that will outline U.S. approaches to a series of environmental problems and highlight... Toolkit will refer users in foreign markets to U.S. approaches to solving environmental problems and to U... International Trade Administration U.S. Environmental Solutions Toolkit--Landfill Standards AGENCY...

  15. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit.

    Science.gov (United States)

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-03-09

    Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
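
    A brief sketch of the kind of script the abstract describes, using the Pybel calls documented in the paper (the example molecules and file name are arbitrary):

      try:
          from openbabel import pybel   # module layout in OpenBabel 3.x
      except ImportError:
          import pybel                  # classic layout described in the paper

      aspirin = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")
      benzoic = pybel.readstring("smi", "OC(=O)c1ccccc1")

      print(aspirin.molwt)                          # simple descriptor access
      print(aspirin.calcfp() | benzoic.calcfp())    # '|' on fingerprints gives the Tanimoto coefficient
      print(aspirin.write("can").strip())           # canonical SMILES output

      # Looping over every molecule in a (hypothetical) SD file:
      # for mol in pybel.readfile("sdf", "library.sdf"):
      #     print(mol.title, mol.molwt)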

  16. Regulatory and Permitting Information Desktop (RAPID) Toolkit (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Young, K. R.; Levine, A.

    2014-09-01

    The Regulatory and Permitting Information Desktop (RAPID) Toolkit combines the former Geothermal Regulatory Roadmap, National Environmental Policy Act (NEPA) Database, and other resources into a Web-based tool that gives the regulatory and utility-scale geothermal developer communities rapid and easy access to permitting information. RAPID currently comprises five tools - Permitting Atlas, Regulatory Roadmap, Resource Library, NEPA Database, and Best Practices. A beta release of an additional tool, the Permitting Wizard, is scheduled for late 2014. Because of the huge amount of information involved, RAPID was developed in a wiki platform to allow industry and regulatory agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users. In 2014, the content was expanded to include regulatory requirements for utility-scale solar and bulk transmission development projects. Going forward, development of the RAPID Toolkit will focus on expanding the capabilities of current tools, developing additional tools, including additional technologies, and continuing to increase stakeholder involvement.

  17. Clinical Trial of a Home Safety Toolkit for Alzheimer's Disease

    Science.gov (United States)

    Trudeau, Scott A.; Rudolph, James L.; Trudeau, Paulette A.; Duffy, Mary E.; Berlowitz, Dan

    2013-01-01

    This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n = 60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles, and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n = 48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control. Home safety was significant at P ≤ 0.001, caregiver strain at P ≤ 0.001, and caregiver self-efficacy at P = 0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P ≤ 0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care. PMID:24195007

  18. Veterinary Immunology Committee Toolkit Workshop 2010: progress and plans.

    Science.gov (United States)

    Entrican, Gary; Lunney, Joan K

    2012-07-15

    The 3rd Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the 9th International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on 18th August 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialization and provision to the veterinary immunology research community. The emphasis was on open communication about current progress and future plans to avoid duplication of effort and to update priorities for reagent development. There were presentations on the major reagent development and networking projects such as the BBSRC/RERAD Immunological Toolbox (2004-2009), US Veterinary Immune Reagent Network (VIRN 2006-2010) that has just received renewal funding for 2010-2014, and EU Network for Animal Diseases Infectiology Research Facilities project (NADIR 2009-2013). There were also presentations and discussions on the use of reagents for assay development, particularly multiplexing, and how these new technologies will underpin basic research developments. Mechanisms for improved information exchange, especially though websites with VIC playing a central role, were identified. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. [Research on Three-dimensional Medical Image Reconstruction and Interaction Based on HTML5 and Visualization Toolkit].

    Science.gov (United States)

    Gao, Peng; Liu, Peng; Su, Hongsen; Qiao, Liang

    2015-04-01

    Integrating the Visualization Toolkit with the interaction, bidirectional communication, and graphics rendering capabilities provided by HTML5, we explored and tested the feasibility of remote medical image reconstruction and interaction purely in the Web. We proposed a server-centric method that does not require downloading large medical data sets to the client, avoiding concerns about network transmission load and the three-dimensional (3D) rendering capability of client hardware. The method integrates remote medical image reconstruction and interaction seamlessly into the Web and is applicable to lower-end computers and mobile devices. Finally, we tested the method over the Internet and achieved real-time performance. This Web-based 3D reconstruction and interaction method, which works across Internet terminals and performance-limited devices, may be useful for remote medical assistance.
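
    The server-centric approach described here renders on the server and ships images to an HTML5 client. A minimal sketch of the server-side piece is given below using VTK's Python bindings with off-screen rendering; the scene (a sphere) and the output file name are placeholders, and the HTML5/WebSocket transport to the browser is omitted.

    ```python
    # Sketch of server-side, off-screen rendering with VTK's Python bindings.
    # The rendered PNG would then be pushed to an HTML5 client (transport not shown).
    import vtk

    source = vtk.vtkSphereSource()              # placeholder for reconstructed 3D data
    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(source.GetOutputPort())
    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)
    window = vtk.vtkRenderWindow()
    window.SetOffScreenRendering(1)             # no display needed on the server
    window.AddRenderer(renderer)
    window.SetSize(512, 512)
    window.Render()

    to_image = vtk.vtkWindowToImageFilter()     # grab the framebuffer as an image
    to_image.SetInput(window)
    to_image.Update()
    writer = vtk.vtkPNGWriter()
    writer.SetFileName("frame.png")
    writer.SetInputConnection(to_image.GetOutputPort())
    writer.Write()
    ```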

  20. Field Ground Truthing Data Collector - a Mobile Toolkit for Image Analysis and Processing

    Science.gov (United States)

    Meng, X.

    2012-07-01

    Field Ground Truthing Data Collector is one of the four key components of the NASA funded ICCaRS project, being developed in Southeast Michigan. The ICCaRS ground truthing toolkit entertains comprehensive functions: 1) Field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and health conditions of ecosystems and environments in the vicinity of the flight field; 2) Server synchronization functions, such as, downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in field for near-real-time validation; and 3) Social network communication functions for direct technical assistance and pedagogical support, e.g., having video-conference calls in field with the supporting educators, scientists, and technologists, participating in Webinars, or engaging discussions with other-learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructures, data models, coupling methods between distributed geospatial data processing and field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained at the presentation. A pilot case study will be also demonstrated.

  1. FIELD GROUND TRUTHING DATA COLLECTOR – A MOBILE TOOLKIT FOR IMAGE ANALYSIS AND PROCESSING

    Directory of Open Access Journals (Sweden)

    X. Meng

    2012-07-01

    Full Text Available Field Ground Truthing Data Collector is one of the four key components of the NASA funded ICCaRS project, being developed in Southeast Michigan. The ICCaRS ground truthing toolkit entertains comprehensive functions: 1) Field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and health conditions of ecosystems and environments in the vicinity of the flight field; 2) Server synchronization functions, such as, downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in field for near-real-time validation; and 3) Social network communication functions for direct technical assistance and pedagogical support, e.g., having video-conference calls in field with the supporting educators, scientists, and technologists, participating in Webinars, or engaging discussions with other-learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructures, data models, coupling methods between distributed geospatial data processing and field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained at the presentation. A pilot case study will be also demonstrated.

  2. A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs

    Directory of Open Access Journals (Sweden)

    Brian R. Sturtevant

    2007-12-01

    Full Text Available To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between-scale) and horizontal (i.e., within-scale) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups

  3. Implementing a user-driven online quality improvement toolkit for cancer care.

    Science.gov (United States)

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  4. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  5. Demonstration of the Health Literacy Universal Precautions Toolkit: Lessons for Quality Improvement.

    Science.gov (United States)

    Mabachi, Natabhona M; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G; Albright, Karen; Weiss, Barry D; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements.

  6. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, B. M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCaa, J. [3TIER by Vaisala, Seattle, WA (United States)

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.
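
    As a purely illustrative sketch of how such gridded time-series data might be read once a subset has been exported to HDF5, the snippet below uses h5py; the file name, dataset name, and scale-factor attribute are hypothetical and are not taken from the record or from any documented WIND Toolkit interface.

    ```python
    # Hypothetical example of reading an exported HDF5 subset of wind resource data;
    # file, dataset, and attribute names are assumptions, not the WIND Toolkit API.
    import h5py
    import numpy as np

    with h5py.File("wtk_subset.h5", "r") as f:
        dset = f["windspeed_100m"]                 # hypothetical dataset: (time, site)
        scale = dset.attrs.get("scale_factor", 1.0)
        series = dset[:, 0] / scale                # time series for the first site
        print("mean 100 m wind speed [m/s]:", float(np.mean(series)))
    ```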

  7. An expanded nuclear phylogenomic PCR toolkit for Sapindales1

    Science.gov (United States)

    Collins, Elizabeth S.; Gostel, Morgan R.; Weeks, Andrea

    2016-01-01

    Premise of the study: We tested PCR amplification of 91 low-copy nuclear gene loci in taxa from Sapindales using primers developed for Bursera simaruba (Burseraceae). Methods and Results: Cross-amplification of these markers among 10 taxa tested was related to their phylogenetic distance from B. simaruba. On average, each Sapindalean taxon yielded product for 53 gene regions (range: 16–90). Arabidopsis thaliana (Brassicales), by contrast, yielded product for two. Single representatives of Anacardiaceae and Rutaceae yielded 34 and 26 products, respectively. Twenty-six primer pairs worked for all Burseraceae species tested if highly divergent Aucoumea klaineana is excluded, and eight of these amplified product in every Sapindalean taxon. Conclusions: Our study demonstrates that customized primers for Bursera can amplify product in a range of Sapindalean taxa. This collection of primer pairs, therefore, is a valuable addition to the toolkit for nuclear phylogenomic analyses of Sapindales and warrants further investigation. PMID:28101434

  8. An expanded nuclear phylogenomic PCR toolkit for Sapindales.

    Science.gov (United States)

    Collins, Elizabeth S; Gostel, Morgan R; Weeks, Andrea

    2016-12-01

    We tested PCR amplification of 91 low-copy nuclear gene loci in taxa from Sapindales using primers developed for Bursera simaruba (Burseraceae). Cross-amplification of these markers among 10 taxa tested was related to their phylogenetic distance from B. simaruba. On average, each Sapindalean taxon yielded product for 53 gene regions (range: 16-90). Arabidopsis thaliana (Brassicales), by contrast, yielded product for two. Single representatives of Anacardiaceae and Rutaceae yielded 34 and 26 products, respectively. Twenty-six primer pairs worked for all Burseraceae species tested if highly divergent Aucoumea klaineana is excluded, and eight of these amplified product in every Sapindalean taxon. Our study demonstrates that customized primers for Bursera can amplify product in a range of Sapindalean taxa. This collection of primer pairs, therefore, is a valuable addition to the toolkit for nuclear phylogenomic analyses of Sapindales and warrants further investigation.

  9. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    Full Text Available This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK) for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and the Toolbox-processing capabilities of NLTK are introduced, showing ways in which it can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
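
    The lexicon-to-HTML task mentioned above can be sketched roughly as follows with NLTK's toolbox corpus reader, using the Rotokas sample lexicon distributed with the NLTK data; the field markers (lx, ps, ge) follow common Toolbox conventions, and the exact output is only illustrative.

    ```python
    # Rough sketch: dump headword, part of speech and gloss from a Toolbox lexicon as HTML.
    # Uses the Rotokas sample lexicon shipped with the NLTK data distribution
    # (may require nltk.download("toolbox") beforehand).
    from nltk.corpus import toolbox

    lexicon = toolbox.xml("rotokas.dic")           # parse the Toolbox file into an ElementTree
    rows = []
    for record in lexicon.findall("record"):
        lx = record.findtext("lx") or ""           # \lx lexeme (headword)
        ps = record.findtext("ps") or ""           # \ps part of speech
        ge = record.findtext("ge") or ""           # \ge gloss (English)
        rows.append(f"<tr><td>{lx}</td><td>{ps}</td><td>{ge}</td></tr>")

    html = "<table>\n" + "\n".join(rows) + "\n</table>"
    print(html[:500])
    ```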

  10. The Bio-Community Perl toolkit for microbial ecology.

    Science.gov (United States)

    Angly, Florent E; Fields, Christopher J; Tyson, Gene W

    2014-07-01

    The development of bioinformatic solutions for microbial ecology in Perl is limited by the lack of modules to represent and manipulate microbial community profiles from amplicon and meta-omics studies. Here we introduce Bio-Community, an open-source, collaborative toolkit that extends BioPerl. Bio-Community interfaces with commonly used programs using various file formats, including BIOM, and provides operations such as rarefaction and taxonomic summaries. Bio-Community will help bioinformaticians to quickly piece together custom analysis pipelines and develop novel software. Availability and implementation: Bio-Community is cross-platform Perl code available from http://search.cpan.org/dist/Bio-Community under the Perl license. A readme file describes software installation and how to contribute. © The Author 2014. Published by Oxford University Press.

  11. PHISICS TOOLKIT: MULTI-REACTOR TRANSMUTATION ANALYSIS UTILITY - MRTAU

    Energy Technology Data Exchange (ETDEWEB)

    Andrea Alfonsi; Cristian Rabiti; Aaron S. Epiney; Yaqi Wang; Joshua Cogliati

    2012-04-01

    The principal idea of this paper is to present the new capabilities available in the PHISICS toolkit connected with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed with a modular structure in modern FORTRAN 95/2003. The code tracks the time evolution of the isotopic concentration of a given material, accounting for nuclear reactions occurring in the presence of a neutron flux as well as natural decay. MRTAU offers two different methods for performing the depletion calculation, letting the user choose the one best suited to the problem at hand. Both methodologies and some significant results are reported in this paper.
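
    For orientation only, the depletion problem MRTAU solves can be written as a linear system dN/dt = A N over the nuclide concentrations; the toy sketch below integrates a two-nuclide chain with a matrix exponential. This is a generic illustration, not MRTAU's own algorithm, and all numerical values are made up.

    ```python
    # Toy depletion/decay problem dN/dt = A*N solved with a matrix exponential.
    # Generic illustration only (not MRTAU's method); all values are invented.
    import numpy as np
    from scipy.linalg import expm

    lam1, lam2 = 1.0e-5, 3.0e-6          # decay constants [1/s]
    sigma1, phi = 1.0e-24, 1.0e14        # absorption cross section [cm^2], flux [n/(cm^2 s)]

    # Nuclide 1 is removed by decay + absorption and feeds nuclide 2, which decays away.
    r1 = lam1 + sigma1 * phi
    A = np.array([[-r1, 0.0],
                  [ r1, -lam2]])
    N0 = np.array([1.0e20, 0.0])         # initial number densities [atoms/cm^3]

    t = 30 * 24 * 3600.0                 # 30 days of irradiation
    print(expm(A * t) @ N0)              # concentrations after 30 days
    ```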

  12. The Standard European Vector Architecture (SEVA) plasmid toolkit.

    Science.gov (United States)

    Durante-Rodríguez, Gonzalo; de Lorenzo, Víctor; Martínez-García, Esteban

    2014-01-01

    The Standard European Vector Architecture (SEVA) toolkit is a simple and powerful resource for constructing optimal plasmid vectors based on a backbone and three interchangeable modules flanked by uncommon restriction sites. Functional modules encode several origins of replication, diverse antibiotic selection markers, and a variety of cargoes with different applications. The backbone and DNA modules have been minimized and edited for flaws in their sequence and/or functionality. A protocol for the utilization of the SEVA platform to construct transcriptional and translational fusions between a promoter under study (the arsenic-responsive Pars of Pseudomonas putida KT2440) and the reporter lacZ gene is described. The resulting plasmid collection was instrumental in measuring and comparing the β-galactosidase activity that reports gene expression (i.e., transcription and translation) in different genetic backgrounds.

  13. The Exoplanet Characterization ToolKit (ExoCTK)

    Science.gov (United States)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and the initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exists within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  14. HemI: a toolkit for illustrating heatmaps.

    Directory of Open Access Journals (Sweden)

    Wankun Deng

    Full Text Available Recent high-throughput techniques have generated a flood of biological data in all areas of biology. The transformation and visualization of multi-dimensional and numerical gene or protein expression data in a single heatmap can provide a concise but comprehensive presentation of molecular dynamics under different conditions. In this work, we developed an easy-to-use tool named HemI (Heat map Illustrator), which can visualize either gene or protein expression data in heatmaps. Additionally, the heatmaps can be recolored, rescaled, or rotated in a customized manner. HemI also provides multiple clustering strategies for analyzing the data. Publication-quality figures can be exported directly. We propose that HemI can be a useful toolkit for conveniently visualizing and manipulating heatmaps. The stand-alone packages of HemI were implemented in Java and can be accessed at http://hemi.biocuckoo.org/down.php.
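
    HemI itself is a stand-alone Java application, but the kind of output it produces (a clustered, color-mapped expression heatmap) can be approximated in a few lines of Python for comparison. The sketch below uses random data and the seaborn clustermap function only as an analogy; it is not part of HemI.

    ```python
    # Rough analogue of a clustered expression heatmap (illustration only, not HemI itself).
    import numpy as np
    import pandas as pd
    import seaborn as sns

    rng = np.random.default_rng(0)
    data = pd.DataFrame(rng.normal(size=(20, 6)),
                        index=[f"gene_{i}" for i in range(20)],
                        columns=[f"cond_{j}" for j in range(6)])

    # Hierarchical clustering on both axes plus a color-mapped matrix, similar in spirit
    # to the recoloring/rescaling/clustering options the abstract describes.
    grid = sns.clustermap(data, cmap="coolwarm", z_score=0)
    grid.savefig("heatmap.png", dpi=300)
    ```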

  15. A Modular Toolkit for Generating Pichia pastoris Secretion Libraries.

    Science.gov (United States)

    Obst, Ulrike; Lu, Timothy K; Sieber, Volker

    2017-06-16

    Yeasts are powerful eukaryotic hosts for the production of recombinant proteins due to their rapid growth to high cell densities and ease of genetic modification. For large-scale industrial production, secretion of a protein offers the advantage of simple and efficient downstream purification that avoids costly cell rupture, denaturation and refolding. The methylotrophic yeast Pichia pastoris (Komagataella phaffi) is a well-established expression host that has the ability to perform post-translational modifications and is generally regarded as safe (GRAS). Nevertheless, optimization of protein secretion in this host remains a challenge due to the multiple steps involved during secretion and a lack of genetic tools to tune this process. Here, we developed a toolkit of standardized regulatory elements specific for Pichia pastoris allowing the tuning of gene expression and choice of protein secretion tag. As protein secretion is a complex process, these parts are compatible with a hierarchical assembly method to enable the generation of large and diverse secretion libraries in order to explore a wide range of secretion constructs, achieve successful secretion, and better understand the regulatory factors of importance to specific proteins of interest. To assess the performance of these parts, we built and characterized the expression and secretion efficiency of 124 constructs that combined different regulatory elements with two fluorescent reporter proteins (RFP, yEGFP). Intracellular expression from our promoters was comparatively independent of whether RFP or yEGFP, and whether plasmid-based expression or genomically integrated expression, was used. In contrast, secretion efficiency significantly varied for different genes expressed using identical regulatory elements, with differences in secretion efficiency of >10-fold observed. These results highlight the importance of generating diverse secretion libraries when searching for optimal expression conditions, and

  16. svmPRAT: SVM-based Protein Residue Annotation Toolkit

    Directory of Open Access Journals (Sweden)

    Kauffman Christopher

    2009-12-01

    Full Text Available Abstract Background Over the last decade several prediction methods have been developed for determining the structural and functional properties of individual protein residues using sequence and sequence-derived information. Most of these methods are based on support vector machines as they provide accurate and generalizable prediction models. Results We present a general purpose protein residue annotation toolkit (svmPRAT) to allow biologists to formulate residue-wise prediction problems. svmPRAT formulates the annotation problem as a classification or regression problem using support vector machines. One of the key features of svmPRAT is its ease of use in incorporating any user-provided information in the form of feature matrices. For every residue svmPRAT captures local information around the residue to create fixed-length feature vectors. svmPRAT implements accurate and fast kernel functions, and also introduces a flexible window-based encoding scheme that accurately captures signals and patterns for training effective predictive models. Conclusions In this work we evaluate svmPRAT on several classification and regression problems including disorder prediction, residue-wise contact order estimation, DNA-binding site prediction, and local structure alphabet prediction. svmPRAT has also been used for the development of the state-of-the-art transmembrane helix prediction method TOPTMH and the secondary structure prediction method YASSPP. This toolkit provides practitioners with an efficient and easy-to-use tool for a wide variety of annotation problems. Availability: http://www.cs.gmu.edu/~mlbio/svmprat
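
    The window-based encoding idea described above (a fixed-length feature vector built from the residues around each position, fed to an SVM) can be sketched with scikit-learn on synthetic data. This is an illustration of the general scheme only, not svmPRAT's own code or file formats.

    ```python
    # Sketch of window-based residue encoding + SVM classification (synthetic data);
    # illustrates the general scheme, not svmPRAT's implementation.
    import numpy as np
    from sklearn.svm import SVC

    def window_features(per_residue, w=3):
        """Concatenate the features of residues i-w..i+w into one vector per residue."""
        n, d = per_residue.shape
        padded = np.vstack([np.zeros((w, d)), per_residue, np.zeros((w, d))])
        return np.array([padded[i:i + 2 * w + 1].ravel() for i in range(n)])

    rng = np.random.default_rng(1)
    profile = rng.normal(size=(200, 20))          # e.g. a 200-residue sequence profile
    labels = rng.integers(0, 2, size=200)         # per-residue annotations (random here)

    X = window_features(profile, w=3)
    clf = SVC(kernel="rbf").fit(X[:150], labels[:150])
    print("held-out accuracy:", clf.score(X[150:], labels[150:]))
    ```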

  17. Detection of events of public health importance under the international health regulations: a toolkit to improve reporting of unusual events by frontline healthcare workers.

    Science.gov (United States)

    MacDonald, Emily; Aavitsland, Preben; Bitar, Dounia; Borgen, Katrine

    2011-09-21

    The International Health Regulations (IHR (2005)) require countries to notify WHO of any event which may constitute a public health emergency of international concern. This notification relies on reports of events occurring at the local level reaching the national public health authorities. By June 2012 WHO member states are expected to have implemented the capacity to "detect events involving disease or death above expected levels for the particular time and place" on the local level and report essential information to the appropriate level of public health authority. Our objective was to develop tools to assist European countries improve the reporting of unusual events of public health significance from frontline healthcare workers to public health authorities. We investigated obstacles and incentives to event reporting through a systematic literature review and expert consultations with national public health officials from various European countries. Multi-day expert meetings and qualitative interviews were used to gather experiences and examples of public health event reporting. Feedback on specific components of the toolkit was collected from healthcare workers and public health officials throughout the design process. Evidence from 79 scientific publications, two multi-day expert meetings and seven qualitative interviews stressed the need to clarify concepts and expectations around event reporting in European countries between the frontline and public health authorities. An analytical framework based on three priority areas for improved event reporting (professional engagement, communication and infrastructure) was developed and guided the development of the various tools. We developed a toolkit adaptable to country-specific needs that includes a guidance document for IHR National Focal Points and nine tool templates targeted at clinicians and laboratory staff: five awareness campaign tools, three education and training tools, and an implementation plan. The

  18. A Teacher Tablet Toolkit to meet the challenges posed by 21st ...

    African Journals Online (AJOL)

    Adele @

    2015-11-25

    Nov 25, 2015 ... challenges inherent to the 21st-century rural technology-enhanced teaching and learning environment. The paper documents ... Keywords: classroom practice; gamification; mobile learning; teacher professional development; technology integration; toolkit ... Towards a disruptive pedagogy: Changing ...

  19. ACES Model Composition and Development Toolkit to Support NGATS Concepts Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation proposed in this effort is the development of a model composition toolkit that will enable NASA Airspace Concept Evaluation System (ACES) users to...

  20. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  1. Challenges and Opportunities in Using Automatic Differentiation with Object-Oriented Toolkits for Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B

    2001-04-17

    The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. The authors discuss their experiences in integrating AD with the PETSc, PVODE, and TAO toolkits and their plans for future research and development in this area.
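
    As a stand-alone reminder of what algorithmic differentiation provides (exact derivatives of numerical code, typically for use in optimization), the sketch below uses JAX; it is not tied to PETSc, PVODE, or TAO, and the objective function is arbitrary.

    ```python
    # Minimal algorithmic-differentiation example with JAX (not the PETSc/TAO integration):
    # the gradient of a Rosenbrock-style objective is obtained exactly, without finite differences.
    import jax
    import jax.numpy as jnp

    def objective(x):
        return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

    grad_objective = jax.grad(objective)   # derivative code generated automatically
    x0 = jnp.zeros(5)
    print(float(objective(x0)), grad_objective(x0))
    ```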

  2. Model Analyst’s Toolkit User Guide, Version 7.1.0

    Science.gov (United States)

    2015-08-01

    One might theorize that lowering interest rates increases the valuation of the stock market, a simple two-concept model. Models can be of virtually...

  3. A Cas9-based toolkit to program gene expression in Saccharomyces cerevisiae

    OpenAIRE

    Apel, Amanda Reider; d'Espaux, Leo; Wehrs, Maren; Sachs, Daniel; Li, Rachel A.; Tong, Gary J.; Garber, Megan; Nnadi, Oge; Zhuang, William; Hillson, Nathan J.; Keasling, Jay D; Mukhopadhyay, Aindrila

    2017-01-01

    © 2016 The Author(s). Despite the extensive use of Saccharomyces cerevisiae as a platform for synthetic biology, strain engineering remains slow and laborious. Here, we employ CRISPR/Cas9 technology to build a cloning-free toolkit that addresses commonly encountered obstacles in metabolic engineering, including chromosomal integration locus and promoter selection, as well as protein localization and solubility. The toolkit includes 23 Cas9-sgRNA plasmids, 37 promoters of various strengths and...

  4. TChem - A Software Toolkit for the Analysis of Complex Kinetic Models

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Knio, Omar [Johns Hopkins Univ., Baltimore, MD (United States)

    2011-05-01

    The TChem toolkit is a software library that enables numerical simulations using complex chemistry and facilitates the analysis of detailed kinetic models. The toolkit provides capabilities for evaluating thermodynamic properties based on NASA polynomials and for computing species production/consumption rates. It incorporates methods that can selectively modify reaction parameters for sensitivity analysis. The library contains several functions that provide analytically computed Jacobian matrices necessary for the efficient time advancement and analysis of detailed kinetic models.
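
    The NASA-polynomial evaluation mentioned here follows the standard form cp/R = a1 + a2 T + a3 T^2 + a4 T^3 + a5 T^4 over a given temperature range; the short sketch below evaluates it with made-up coefficients and is not TChem's API.

    ```python
    # Standard NASA 7-coefficient polynomial for cp/R (coefficients below are invented,
    # for illustration only; this is not TChem's interface).
    import numpy as np

    def cp_over_R(T, a):
        """cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4 for one temperature range."""
        return a[0] + a[1] * T + a[2] * T**2 + a[3] * T**3 + a[4] * T**4

    a = [3.5, 1.0e-4, -2.0e-8, 1.0e-12, 0.0, 0.0, 0.0]   # illustrative coefficients
    T = np.array([300.0, 1000.0, 1500.0])                # temperatures [K]
    print(cp_over_R(T, a))
    ```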

  5. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  6. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    Science.gov (United States)

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals and between 35% and 60% of new graduates change workplace during the first year. Critical to organizational success, first line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first line nursing managers. The toolkit includes the development of a framework including the conceptual building blocks of planning tools, manager interventions, retention and recruitment and professional practice models. The development of the toolkit involved conducting a review of the literature for best practices in nursing human resource planning, using a mixed method approach to data collection including a survey and extensive interviews of managers and completing a comprehensive scan of human resource practices in the participating organizations. This paper will provide an overview of the process used to develop the toolkit, a description of the toolkit contents and a reflection on the outcomes of the project.

  7. Graphics database creation and manipulation: HyperCard Graphics Database Toolkit and Apple Graphics Source

    Science.gov (United States)

    Herman, Jeffrey; Fry, David

    1990-08-01

    Because graphic files can be stored in a number of different file formats, it has traditionally been difficult to create a graphics database from which users can open, copy, and print graphic files, where each file in the database may be in one of several different formats. HyperCard Graphics Database Toolkit has been designed and written by Apple Computer to enable software developers to facilitate the creation of customized graphics databases. Using a database developed with the toolkit, users can open, copy, or print a graphic transparently, without having to know or understand the complexities of file formats. In addition, the toolkit includes a graphic user interface, graphic design, on-line help, and search algorithms that enable users to locate specific graphics quickly. Currently, the toolkit handles graphics in the formats MPNT, PICT, and EPSF, and is open to supporting other formats as well. Developers can use the toolkit to alter the code, the interface, and the graphic design in order to customize their database for the needs of their users. This paper discusses the structure of the toolkit and one implementation, Apple Graphics Source (AGS). AGS contains over 2,000 graphics used in Apple's books and manuals. AGS enables users to find existing graphics of Apple products and use them for presentations, new publications, papers, and software projects.

  8. The Montage Image Mosaic Toolkit As A Visualization Engine.

    Science.gov (United States)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has since 2003 been used to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts of large images and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: 1) creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created; 2) integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to Javascript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed; 3) creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453. JS
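
    The reprojection-to-a-common-grid step described above can be illustrated (outside of Montage itself) with the astropy-affiliated reproject package; the file names are placeholders, and this is only an analogy for the operation, not Montage's interface.

    ```python
    # Illustration of reprojecting one FITS image onto another image's WCS grid
    # (uses the `reproject` package as an analogy; not Montage's own interface).
    from astropy.io import fits
    from reproject import reproject_interp

    hdu_ref = fits.open("image_a.fits")[0]       # placeholder reference image
    hdu_in = fits.open("image_b.fits")[0]        # placeholder image to be regridded

    regridded, footprint = reproject_interp(hdu_in, hdu_ref.header)
    fits.writeto("image_b_on_a_grid.fits", regridded, hdu_ref.header, overwrite=True)
    ```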

  9. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    Science.gov (United States)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.

  10. Machine learning for a Toolkit for Image Mining

    Science.gov (United States)

    Delanoy, Richard L.

    1995-01-01

    A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes are discriminating between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, appears to be competitive in discrimination accuracy with other classification techniques.

  11. Toolkit for data reduction to tuples for the ATLAS experiment

    CERN Document Server

    Snyder, S; The ATLAS collaboration

    2012-01-01

    The final step in a HEP data-processing chain is usually to reduce the data to a `tuple' form which can be efficiently read by interactive analysis tools such as ROOT. Often, this is implemented independently by each group analyzing the data, leading to duplicated effort and needless divergence in the format of the reduced data. ATLAS has implemented a common toolkit for performing this processing step. By using tools from this package, physics analysis groups can produce tuples customized for a particular analysis but which are still consistent in format and vocabulary with those produced by other physics groups. The package is designed so that almost all the code is independent of the specific form used to store the tuple. The code that does depend on this is grouped into a set of small backend packages. While the ROOT backend is the most used, backends also exist for HDF5 and for specialized databases. By now, the majority of ATLAS analyses rely on this package, and it is an important contributor to the ab...

  12. Toolkit for data reduction to tuples for the ATLAS experiment

    Science.gov (United States)

    Snyder, Scott; Krasznahorkay, Attila

    2012-12-01

    The final step in a HEP data-processing chain is usually to reduce the data to a ‘tuple’ form which can be efficiently read by interactive analysis tools such as ROOT. Often, this is implemented independently by each group analyzing the data, leading to duplicated effort and needless divergence in the format of the reduced data. ATLAS has implemented a common toolkit for performing this processing step. By using tools from this package, physics analysis groups can produce tuples customized for a particular analysis but which are still consistent in format and vocabulary with those produced by other physics groups. The package is designed so that almost all the code is independent of the specific form used to store the tuple. The code that does depend on this is grouped into a set of small backend packages. While the ROOT backend is the most used, backends also exist for HDF5 and for specialized databases. By now, the majority of ATLAS analyses rely on this package, and it is an important contributor to the ability of ATLAS to rapidly analyze physics data.
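
    For readers unfamiliar with the 'tuple' format, the sketch below writes a small flat tuple that ROOT-based tools can read, using the uproot package with invented branch names; it only illustrates the end product of such a reduction step, not the ATLAS toolkit itself.

    ```python
    # Write a small flat tuple (one entry per event, simple branches) that ROOT can read.
    # Uses `uproot`; branch names and values are invented, purely to illustrate the format.
    import numpy as np
    import uproot

    events = {
        "pt":   np.array([41.2, 17.8, 63.5], dtype=np.float32),
        "eta":  np.array([-0.5, 1.2, 2.1], dtype=np.float32),
        "njet": np.array([2, 1, 4], dtype=np.int32),
    }

    with uproot.recreate("tuple.root") as f:
        f["events"] = events            # creates a TTree named "events" with three branches
    ```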

  13. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis scripts part. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware, which seamlessly integrates git repository management systems such as GitHub or GitLab with Docker and Jupyter, helping with a) sharing the results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but also the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch, and they could begin contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.

  14. A Gateway MultiSite recombination cloning toolkit.

    Directory of Open Access Journals (Sweden)

    Lena K Petersen

    Full Text Available The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two, three, and four fragment Gateway MultiSite recombination cloning whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org).

  15. VariVis: a visualisation toolkit for variation databases

    Directory of Open Access Journals (Sweden)

    Smith Timothy D

    2008-04-01

    Full Text Available Abstract Background With the completion of the Human Genome Project and recent advancements in mutation detection technologies, the volume of data available on genetic variations has risen considerably. These data are stored in online variation databases and provide important clues to the cause of diseases and potential side effects or resistance to drugs. However, the data presentation techniques employed by most of these databases make them difficult to use and understand. Results Here we present a visualisation toolkit that can be employed by online variation databases to generate graphical models of gene sequence with corresponding variations and their consequences. The VariVis software package can run on any web server capable of executing Perl CGI scripts and can interface with numerous Database Management Systems and "flat-file" data files. VariVis produces two easily understandable graphical depictions of any gene sequence and matches these with variant data. While developed with the goal of improving the utility of human variation databases, the VariVis package can be used in any variation database to enhance utilisation of, and access to, critical information.

  16. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    Energy Technology Data Exchange (ETDEWEB)

    Arguello, Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
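
    To make the sizing idea concrete, a toy linear program is sketched below with scipy: pick generation capacities at minimum capital cost subject to covering a peak load. All numbers are invented, and the real MDT sizing capability is a far richer mixed-integer formulation.

    ```python
    # Toy capacity-sizing LP (illustration only; the MDT sizing capability is a much
    # richer mixed-integer model). Choose PV and diesel capacity [kW] to cover a
    # 150 kW peak at minimum capital cost, crediting PV at 30% toward the peak.
    from scipy.optimize import linprog

    cost = [800.0, 500.0]                 # $/kW capital cost of PV, diesel (made up)
    A_ub = [[-0.3, -1.0]]                 # 0.3*PV + 1.0*diesel >= 150  =>  -0.3*PV - diesel <= -150
    b_ub = [-150.0]
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("capacities [kW]:", res.x, "capital cost [$]:", res.fun)
    ```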

  17. The NITE XML Toolkit: flexible annotation for multimodal language data.

    Science.gov (United States)

    Carletta, Jean; Evert, Stefan; Heid, Ulrich; Kilgour, Jonathan; Robertson, Judy; Voormann, Holger

    2003-08-01

    Multimodal corpora that show humans interacting via language are now relatively easy to collect. Current tools allow one either to apply sets of time-stamped codes to the data and consider their timing and sequencing or to describe some specific linguistic structure that is present in the data, built over the top of some form of transcription. To further our understanding of human communication, the research community needs code sets with both timings and structure, designed flexibly to address the research questions at hand. The NITE XML Toolkit offers library support that software developers can call upon when writing tools for such code sets and, thus, enables richer analyses than have previously been possible. It includes data handling, a query language containing both structural and temporal constructs, components that can be used to build graphical interfaces, sample programs that demonstrate how to use the libraries, a tool for running queries, and an experimental engine that builds interfaces on the basis of declarative specifications.

  18. A Gateway MultiSite recombination cloning toolkit.

    Science.gov (United States)

    Petersen, Lena K; Stowers, R Steven

    2011-01-01

    The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two, three, and four fragment Gateway MultiSite recombination cloning whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org).

  19. Toward a VPH/Physiome ToolKit.

    Science.gov (United States)

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative. Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive. In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow. Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  20. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    Science.gov (United States)

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. To describe characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use and assess whether such resources were beneficial for users. Deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  1. A Spiritual Care Toolkit: An evidence-based solution to meet spiritual needs.

    Science.gov (United States)

    Kincheloe, Donna D; Stallings Welden, Lois M; White, Ann

    2018-01-09

    To determine differences between baseline spiritual perspectives of nurses, patients and their families and examine the effectiveness of a spiritual care (SC) toolkit as an intervention to facilitate meeting spiritual needs of hospitalised patients and families. Provision of SC by nurses in the acute care environment is an issue of high priority for patients. Nurses report lack of time, comfort, training, cultural knowledge and mobilisation of resources as obstacles to SC delivery. Evidence points to positive patient outcomes and patient satisfaction, yet few studies include interventions to help nurses meet spiritual needs of patients and families. Descriptive and quasi-experimental design. Patients, family members (n = 132) and nurses (n = 54) were administered SC surveys while hospitalised on two acute care units of a Midwest hospital system in the United States. Population represented patients suffering acute, chronic and terminal illness. Data collected over a 13-week period examined relationships between the groups' spiritual perspectives and the effectiveness of an SC toolkit intervention. Significant differences between nurse-patient and nurse-family groups were found, whereas no significant differences existed between patient-family groups. A pretest-posttest revealed the SC toolkit aided in overcoming obstacles to nurses' SC delivery. Patients and their family members found the SC toolkit helpful. Findings suggest an evidence-based SC toolkit has the propensity to help nurses meet spiritual needs of hospitalised patients and families. However, successful implementation and sustainability require organisational support, funding for resources and SC training for staff. An SC toolkit supplied with culturally sensitive faith resources supporting what patients and families value, believe and practice can be easily customised and implemented by any healthcare organisation in the world. Further investigation of SC toolkit effectiveness using multiple sites is

  2. The development of an artificial organic networks toolkit for LabVIEW.

    Science.gov (United States)

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent use of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  3. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  4. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    Science.gov (United States)

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    Directory of Open Access Journals (Sweden)

    Morley Chris

    2008-03-01

    Background: Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results: Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion: Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
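
    As a concrete illustration of the workflow described above, a minimal Pybel session might look like the sketch below. The SMILES strings and file name are arbitrary examples, and the import path is an assumption that depends on the OpenBabel version installed (OpenBabel 3 exposes the module as openbabel.pybel; OpenBabel 2 used a top-level pybel module).

    ```python
    # Minimal Pybel sketch: read structures, compute a descriptor, compare fingerprints.
    # SMILES strings and the file name below are arbitrary examples.
    from openbabel import pybel   # on older OpenBabel 2 installs: import pybel

    ethanol = pybel.readstring("smi", "CCO")                   # parse a SMILES string
    aspirin = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")

    print(ethanol.molwt)                                       # simple descriptor
    print(aspirin.write("can").strip())                        # canonical SMILES output

    # Fingerprints support Tanimoto similarity via the | operator
    fp1, fp2 = ethanol.calcfp(), aspirin.calcfp()
    print(fp1 | fp2)

    # Iterating over a multi-molecule file uses a Python generator:
    # for mol in pybel.readfile("sdf", "library.sdf"):
    #     print(mol.title, mol.molwt)
    ```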

  6. Multimethod evaluation of the VA's peer-to-peer Toolkit for patient-centered medical home implementation.

    Science.gov (United States)

    Luck, Jeff; Bowman, Candice; York, Laura; Midboe, Amanda; Taylor, Thomas; Gale, Randall; Asch, Steven

    2014-07-01

    Effective implementation of the patient-centered medical home (PCMH) in primary care practices requires training and other resources, such as online toolkits, to share strategies and materials. The Veterans Health Administration (VA) developed an online Toolkit of user-sourced tools to support teams implementing its Patient Aligned Care Team (PACT) medical home model. To present findings from an evaluation of the PACT Toolkit, including use, variation across facilities, effect of social marketing, and factors influencing use. The Toolkit is an online repository of ready-to-use tools created by VA clinic staff that physicians, nurses, and other team members may share, download, and adopt in order to more effectively implement PCMH principles and improve local performance on VA metrics. Multimethod evaluation using: (1) website usage analytics, (2) an online survey of the PACT community of practice's use of the Toolkit, and (3) key informant interviews. Survey respondents were PACT team members and coaches (n = 544) at 136 VA facilities. Interview respondents were Toolkit users and non-users (n = 32). For survey data, multivariable logistic models were used to predict Toolkit awareness and use. Interviews and open-text survey comments were coded using a "common themes" framework. The Consolidated Framework for Implementation Research (CFIR) guided data collection and analyses. The Toolkit was used by 6,745 staff in the first 19 months of availability. Among members of the target audience, 80 % had heard of the Toolkit, and of those, 70 % had visited the website. Tools had been implemented at 65 % of facilities. Qualitative findings revealed a range of user perspectives from enthusiastic support to lack of sufficient time to browse the Toolkit. An online Toolkit to support PCMH implementation was used at VA facilities nationwide. Other complex health care organizations may benefit from adopting similar online peer-to-peer resource libraries.

  7. Evaluating the Impacts of Media Assistance: Problems and Principles

    Directory of Open Access Journals (Sweden)

    Jessica Noske-Turner

    2015-01-01

    While some form of evaluation has always been a requirement of development projects, in the media assistance field this has predominantly been limited to very basic modes of counting outputs, such as the number of journalists trained or the number of articles produced on a topic. Few media assistance evaluations manage to provide sound evidence of impacts on governance and social change. So far, most responses to the problem of media assistance impact evaluation collate evaluation methodologies and methods into toolkits. This paper suggests that the problem of impact evaluation of media assistance should be understood as more than a simple issue of methods, and outlines three underlying tensions and challenges that stifle implementation of effective practices in media assistance evaluation. First, there are serious conceptual ambiguities that affect evaluation design. Second, bureaucratic systems and imperatives often drive evaluation practices, which reduces their utility and richness. Third, the search for the ultimate method or toolkit of methods for media assistance evaluation tends to overlook the complex epistemological and political undercurrents in the evaluation discipline, which can lead to methods being used without consideration of the ontological implications. Only if these contextual factors are known and understood can effective evaluations be designed that meet all stakeholders' needs.

  8. Cal-Adapt: California's Climate Data Resource and Interactive Toolkit

    Science.gov (United States)

    Thomas, N.; Mukhtyar, S.; Wilhelm, S.; Galey, B.; Lehmer, E.

    2016-12-01

    Cal-Adapt is a web-based application that provides an interactive toolkit and information clearinghouse to help agencies, communities, local planners, resource managers, and the public understand climate change risks and impacts at the local level. The website offers interactive, visually compelling, and useful data visualization tools that show how climate change might affect California using downscaled continental climate data. Cal-Adapt is supporting California's Fourth Climate Change Assessment by providing access to the wealth of modeled and observed data and adaptation-related information produced by California's scientific community. The site has been developed by UC Berkeley's Geospatial Innovation Facility (GIF) in collaboration with the California Energy Commission's (CEC) Research Program. The Cal-Adapt website allows decision makers, scientists and residents of California to turn research results and climate projections into effective adaptation decisions and policies. Since its release to the public in June 2011, Cal-Adapt has been visited by more than 94,000 unique visitors from over 180 countries, all 50 U.S. states, and 689 California localities. We will present several key visualizations that have been employed by Cal-Adapt's users to support their efforts to understand local impacts of climate change, indicate the breadth of data available, and delineate specific use cases. Recently, CEC and GIF have been developing and releasing Cal-Adapt 2.0, which includes updates and enhancements that are increasing its ease of use, information value, visualization tools, and data accessibility. We showcase how Cal-Adapt is evolving in response to feedback from a variety of sources to present finer-resolution downscaled data, and offer an open API that allows other organizations to access Cal-Adapt climate data and build domain-specific visualization and planning tools. Through a combination of locally relevant information, visualization tools, and access to

  9. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    Science.gov (United States)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and
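
    To make the "analysis driver" idea concrete, the sketch below shows the general shape of such a driver in Python: Dakota hands the driver a parameters file, the driver runs the model, and the driver writes a results file back. This is not the CSDMS Dakota interface itself; the file formats are deliberately simplified and the model function is only a stand-in for HydroTrend.

    ```python
    # Schematic analysis driver (simplified formats; not the CDI API).
    # Usage: python driver.py <params_file> <results_file>
    import sys

    def run_model(temperature, precipitation):
        # Stand-in for a real model run (e.g., invoking HydroTrend);
        # returns a fake suspended-sediment load for illustration only.
        return 10.0 + 2.5 * temperature + 0.8 * precipitation

    def main(params_path, results_path):
        # Parse simplified "name value" pairs written by the framework.
        params = {}
        with open(params_path) as f:
            for line in f:
                parts = line.split()
                if len(parts) == 2:
                    name, value = parts
                    params[name] = float(value)
        qs = run_model(params["T"], params["P"])
        with open(results_path, "w") as f:
            f.write(f"{qs} Qs\n")

    if __name__ == "__main__":
        main(sys.argv[1], sys.argv[2])
    ```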

  10. BioWarehouse: a bioinformatics database warehouse toolkit

    Directory of Open Access Journals (Sweden)

    Stringer-Calvert David WJ

    2006-03-01

    Background: This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results: We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL), but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion: BioWarehouse embodies significant progress on the
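
    The example query in the abstract (enzyme activities with no associated sequence) shows why a single warehouse schema is useful: once the sources are loaded into one relational schema, such questions become a single SQL statement. The sketch below reproduces the idea with a toy, hypothetical schema in SQLite; the table and column names are illustrative simplifications, not the actual BioWarehouse schema, which targets MySQL and Oracle.

    ```python
    # Toy warehouse-style query: find EC numbers with no sequence in any loaded source.
    # Schema and rows are hypothetical; SQLite stands in for MySQL/Oracle.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE enzyme   (ec_number TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE sequence (id INTEGER PRIMARY KEY, ec_number TEXT, source_db TEXT);

    INSERT INTO enzyme   VALUES ('1.1.1.1', 'alcohol dehydrogenase'),
                                ('2.7.1.1', 'hexokinase');
    INSERT INTO sequence VALUES (1, '1.1.1.1', 'UniProt');
    """)

    rows = conn.execute("""
        SELECT e.ec_number, e.name
        FROM enzyme e
        LEFT JOIN sequence s ON s.ec_number = e.ec_number
        WHERE s.id IS NULL
    """).fetchall()
    print(rows)   # -> [('2.7.1.1', 'hexokinase')]  (toy data only)
    ```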

  11. BioWarehouse: a bioinformatics database warehouse toolkit

    Science.gov (United States)

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background: This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results: We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion: BioWarehouse embodies significant progress on the database integration problem for

  12. Water Security Toolkit User Manual Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. WST v.1.2 is distributed under the Revised BSD License and leverages a variety of third-party software packages (including EPANET, EPANET-MSX, Coopr, Boost, and others) that carry their own licensing terms.

  13. MSAT—A new toolkit for the analysis of elastic and seismic anisotropy

    Science.gov (United States)

    Walker, Andrew M.; Wookey, James

    2012-12-01

    The design and content of MSAT, a new Matlab toolkit for the study and analysis of seismic and elastic anisotropy, is described. Along with a brief introduction to the basic theory of anisotropic elasticity and a guide to the functions provided by the toolkit, three example applications are discussed. First, the toolkit is used to analyse the effect of pressure on the elasticity of the monoclinic upper mantle mineral diopside. Second, the degree to which a model of elasticity in the lowermost mantle can be approximated by transverse isotropy is examined. Finally, the backazimuthal variation in effective shear-wave splitting caused by two anisotropic layers, where the lower layer is dipping, is calculated. MSAT can be freely reused for any purpose and the implementation of these and other examples are distributed with the source code.
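
    MSAT itself is a Matlab toolkit, so the sketch below is not MSAT code; it is a minimal NumPy rendering of the Christoffel calculation that underlies phase-velocity and splitting analyses of an elastic stiffness tensor. The stiffness matrix used here is an isotropic toy example rather than one of the mineral elasticity datasets discussed in the paper.

    ```python
    # Phase velocities from a 6x6 Voigt stiffness matrix via the Christoffel equation.
    # Isotropic toy example: lambda = 60 GPa, mu = 30 GPa, rho = 3000 kg/m^3.
    import numpy as np

    def voigt_to_tensor(C_voigt):
        """Expand a 6x6 Voigt stiffness matrix to the full 3x3x3x3 tensor."""
        vmap = {(0, 0): 0, (1, 1): 1, (2, 2): 2,
                (1, 2): 3, (2, 1): 3, (0, 2): 4,
                (2, 0): 4, (0, 1): 5, (1, 0): 5}
        C = np.zeros((3, 3, 3, 3))
        for i in range(3):
            for j in range(3):
                for k in range(3):
                    for l in range(3):
                        C[i, j, k, l] = C_voigt[vmap[(i, j)], vmap[(k, l)]]
        return C

    def phase_velocities(C_voigt_gpa, rho, n):
        """Return the three phase velocities (km/s) for propagation direction n."""
        n = np.asarray(n, dtype=float)
        n /= np.linalg.norm(n)
        C = voigt_to_tensor(C_voigt_gpa) * 1e9          # GPa -> Pa
        gamma = np.einsum("ijkl,j,l->ik", C, n, n)      # Christoffel matrix
        eigvals = np.linalg.eigvalsh(gamma / rho)       # rho * v^2 eigenvalues
        return np.sqrt(eigvals)[::-1] / 1e3             # fastest (P) first

    lam, mu = 60.0, 30.0
    C = np.array([[lam + 2*mu, lam, lam, 0, 0, 0],
                  [lam, lam + 2*mu, lam, 0, 0, 0],
                  [lam, lam, lam + 2*mu, 0, 0, 0],
                  [0, 0, 0, mu, 0, 0],
                  [0, 0, 0, 0, mu, 0],
                  [0, 0, 0, 0, 0, mu]])
    print(phase_velocities(C, 3000.0, [1, 0, 0]))       # ~[6.32, 3.16, 3.16] km/s
    ```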

  14. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  16. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  17. Development of a Human Physiologically Based Pharmacokinetic (PBPK Toolkit for Environmental Pollutants

    Directory of Open Access Journals (Sweden)

    Patricia Ruiz

    2011-10-01

    Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that can provide useful information for the protection of the public.
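
    As a generic illustration of the kind of model such a toolkit recodes, the sketch below integrates a one-compartment kinetic model with constant intake and first-order elimination. The parameters are arbitrary illustrative values, not ATSDR model parameters; a full PBPK model would add multiple tissue compartments with blood-flow-limited exchange and chemical-specific partition coefficients.

    ```python
    # One-compartment kinetic sketch (illustrative parameters, forward Euler).
    def simulate(dose_rate_mg_per_h, k_elim_per_h, volume_l, hours, dt=0.01):
        """Return (times, concentrations) for a constant exposure."""
        amount = 0.0                      # mg of chemical in the compartment
        times, concs = [], []
        for step in range(int(hours / dt) + 1):
            times.append(step * dt)
            concs.append(amount / volume_l)                      # mg/L
            # dA/dt = intake - k_elim * A
            amount += (dose_rate_mg_per_h - k_elim_per_h * amount) * dt
        return times, concs

    t, c = simulate(dose_rate_mg_per_h=1.0, k_elim_per_h=0.2, volume_l=42.0, hours=24)
    print(f"internal concentration after 24 h: {c[-1]:.3f} mg/L")
    # Steady state approaches dose_rate / (k_elim * volume) = 0.119 mg/L
    ```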

  18. Using features of local densities, statistics and HMM toolkit (HTK for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model (HMM) Toolkit (HTK) without explicit segmentation. The first phase is preprocessing, where the data are introduced into the system after quality enhancement. Then, a set of features (local densities and statistical features) is extracted using the technique of sliding windows. Subsequently, the resulting feature vectors are injected into the Hidden Markov Model Toolkit (HTK). The simple database “Arabic-Numbers” and IFN/ENIT are used to evaluate the performance of this system. Keywords: Hidden Markov Model (HMM) Toolkit (HTK), sliding windows
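
    The sliding-window step can be sketched as follows. This is not the authors' code: the window width, shift, and feature set are illustrative assumptions, and in a full system the resulting vectors would be written to HTK feature files for HMM training and recognition.

    ```python
    # Schematic sliding-window feature extraction over a binarized text-line image.
    import numpy as np

    def sliding_window_features(binary_line, width=8, shift=3, bands=4):
        """binary_line: 2D array (height x line_width) of 0/1 ink pixels."""
        height, line_width = binary_line.shape
        features = []
        for x in range(0, max(line_width - width, 1), shift):
            win = binary_line[:, x:x + width]
            density = win.mean()                              # overall ink density
            band_density = [b.mean() for b in np.array_split(win, bands, axis=0)]
            ink_rows = np.nonzero(win.sum(axis=1))[0]         # rows containing ink
            center = ink_rows.mean() / height if ink_rows.size else 0.0
            features.append([density, center] + band_density)
        return np.array(features)                             # (n_windows, 2 + bands)

    # Toy example: random noise stands in for a preprocessed text-line image.
    line = (np.random.rand(48, 300) > 0.85).astype(int)
    print(sliding_window_features(line).shape)
    ```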

  19. DUL Radio: A light-weight, wireless toolkit for sketching in hardware

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2011-01-01

    In this paper we present the first version of DUL Radio, a small, wireless toolkit for sketching sensor-based interaction. It is a concrete attempt to develop a platform that balances ease of use (learning, setup, initialization), size, speed, flexibility and cost, aimed at wearable and ultra-mobile prototyping where fast reaction is needed (e.g. in controlling sound). The target audiences include designers, students, artists etc. with minimal programming and hardware skills. This presentation covers our motivations for creating the toolkit, specifications, test results, comparison to related products...

  20. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Directory of Open Access Journals (Sweden)

    Jared Adolf-Bryfogle

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  1. A Cas9-based toolkit to program gene expression in Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Apel, Amanda Reider; d'Espaux, Leo; Wehrs, Maren

    2017-01-01

    Despite the extensive use of Saccharomyces cerevisiae as a platform for synthetic biology, strain engineering remains slow and laborious. Here, we employ CRISPR/Cas9 technology to build a cloning-free toolkit that addresses commonly encountered obstacles in metabolic engineering, including chromosomal integration locus and promoter selection, as well as protein localization and solubility. The toolkit includes 23 Cas9-sgRNA plasmids, 37 promoters of various strengths and temporal expression profiles, and 10 protein-localization, degradation and solubility tags. We facilitated the use...

  2. A Toolkit for Forward/Inverse Problems in Electrocardiography within the SCIRun Problem Solving Environment

    Science.gov (United States)

    Burton, Brett M; Tate, Jess D; Erem, Burak; Swenson, Darrell J; Wang, Dafang F; Steffen, Michael; Brooks, Dana H; van Dam, Peter M; Macleod, Rob S

    2012-01-01

    Computational modeling in electrocardiography often requires the examination of cardiac forward and inverse problems in order to non-invasively analyze physiological events that are otherwise inaccessible or unethical to explore. The study of these models can be performed in the open-source SCIRun problem solving environment developed at the Center for Integrative Biomedical Computing (CIBC). A new toolkit within SCIRun provides researchers with essential frameworks for constructing and manipulating electrocardiographic forward and inverse models in a highly efficient and interactive way. The toolkit contains sample networks, tutorials and documentation which direct users through SCIRun-specific approaches in the assembly and execution of these specific problems. PMID:22254301
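
    As a purely numerical illustration of the forward/inverse pair the toolkit addresses (not SCIRun code), the sketch below uses a toy linear forward map from epicardial to torso potentials and recovers the source with zero-order Tikhonov regularization, the standard remedy for the ill-posedness of the inverse problem. In a real study the transfer matrix would come from a patient-specific torso model and would be far more ill-conditioned than this random example.

    ```python
    # Toy forward/inverse electrocardiography with Tikhonov regularization.
    import numpy as np

    rng = np.random.default_rng(0)
    n_heart, n_torso = 20, 60

    A = rng.normal(size=(n_torso, n_heart))                      # toy forward transfer matrix
    phi_heart = np.sin(np.linspace(0, np.pi, n_heart))           # "true" epicardial source
    phi_torso = A @ phi_heart + 0.01 * rng.normal(size=n_torso)  # noisy torso measurements

    lam = 0.1                                                    # regularization parameter
    # Zero-order Tikhonov: argmin ||A x - b||^2 + lam^2 ||x||^2
    phi_est = np.linalg.solve(A.T @ A + lam**2 * np.eye(n_heart), A.T @ phi_torso)

    print(np.linalg.norm(phi_est - phi_heart) / np.linalg.norm(phi_heart))  # relative error
    ```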

  3. BurnMan: A Lower Mantle Mineral Physics Toolkit

    Science.gov (United States)

    Unterborn, C. T.; Rose, I.; Heister, T.; Cottaar, S.

    2013-12-01

    BurnMan - A Lower Mantle Mineral Physics Toolkit. In preparation, 2013. Cottaar, S., Heister, T., Rose, I., and Unterborn, C.: BurnMan Technical Reference, www.burnman.org, 2013.

  4. A Toolkit for Dermal Risk Assessment: Toxicological Approach for Hazard Characterization

    NARCIS (Netherlands)

    Schuhmacher-Wolz, U.; Kalberlah, F.; Oppl, R.; Hemmen, J.J. van

    2003-01-01

    The toxicological background for hazard assessment using a simple-to-use toolkit for assessment and management of health risks from occupational dermal exposure is presented. Hazard assessment is intended to answer the following questions: (i) is the substance under consideration capable of damaging

  5. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30s format is well-suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website; the duration of their session on the website; and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than on Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access and use of the U.S. Climate Resilience Toolkit.

  6. Making Schools the Model for Healthier Environments Toolkit: What It Is

    Science.gov (United States)

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  7. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Medical Waste

    Science.gov (United States)

    2013-03-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit--Medical Waste AGENCY: International... of medical waste. The Department of Commerce continues to develop the web-based U.S. Environmental... address, contact information, and medical waste management category of interest from the following list...

  8. Development of an Online Toolkit for Measuring Commercial Building Energy Efficiency Performance -- Scoping Study

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Na

    2013-03-13

    This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.

  9. Aspen in action: an Aspen Institute pilot tests a toolkit for transformation in public libraries

    National Research Council Canada - National Science Library

    Witteveen, April

    2016-01-01

    ... to take out to the community," Millsap tells LJ. As libraries engaged with the report, it became clear that many wanted more hands-on guidance about how to take recommendations from Rising to the Challenge and turn them into practical, achievable goals. In response, Aspen developed a new toolkit featuring 12 chapters of "ACTivities" covering topics su...

  10. Respiratory Protection Toolkit: Providing Guidance Without Changing Requirements-Can We Make an Impact?

    Science.gov (United States)

    Bien, Elizabeth Ann; Gillespie, Gordon Lee; Betcher, Cynthia Ann; Thrasher, Terri L; Mingerink, Donna R

    2016-12-01

    International travel and infectious respiratory illnesses worldwide place health care workers (HCWs) at increasing risk of respiratory exposures. To ensure the highest quality safety initiatives, one health care system used a quality improvement model of Plan-Do-Study-Act and guidance from Occupational Safety and Health Administration's (OSHA) May 2015 Hospital Respiratory Protection Program (RPP) Toolkit to assess a current program. The toolkit aided in identification of opportunities for improvement within their well-designed RPP. One opportunity was requiring respirator use during aerosol-generating procedures for specific infectious illnesses. Observation data demonstrated opportunities to mitigate controllable risks including strap placement, user seal check, and reuse of disposable N95 filtering facepiece respirators. Subsequent interdisciplinary collaboration resulted in other ideas to decrease risks and increase protection from potentially infectious respiratory illnesses. The toolkit's comprehensive document to evaluate the program showed that while the OSHA standards have not changed, the addition of the toolkit can better protect HCWs. © 2016 The Author(s).

  11. 78 FR 14774 - U.S. Environmental Solutions Toolkit-Universal Waste

    Science.gov (United States)

    2013-03-07

    ...: (a) Mercury Recycling Technology (b) E-Waste Recycling Technology (c) CRT Recycling Technology (d... International Trade Administration U.S. Environmental Solutions Toolkit--Universal Waste AGENCY: International... of universal waste. The Department of Commerce continues to develop the web-based U.S. Environmental...

  12. Organizational Context Matters: A Research Toolkit for Conducting Standardized Case Studies of Integrated Care Initiatives

    Directory of Open Access Journals (Sweden)

    Jenna M. Evans

    2017-06-01

    Introduction: The variable success of integrated care initiatives has led experts to recommend tailoring design and implementation to the organizational context. Yet, organizational contexts are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. We thus lack knowledge of when and specifically how organizational contexts matter. To facilitate the accumulation of evidence, we developed a research toolkit for conducting case studies using standardized measures of the (inter-)organizational context for integrating care. Theory and Methods: We used a multi-method approach to develop the research toolkit: (1) development and validation of the Context and Capabilities for Integrating Care (CCIC) Framework, (2) identification, assessment, and selection of survey instruments, (3) development of document review methods, (4) development of interview guide resources, and (5) pilot testing of the document review guidelines, consolidated survey, and interview guide. Results: The toolkit provides a framework and measurement tools that examine 18 organizational and inter-organizational factors that affect the implementation and success of integrated care initiatives. Discussion and Conclusion: The toolkit can be used to characterize and compare organizational contexts across cases and enable comparison of results across studies. This information can enhance our understanding of the influence of organizational contexts, support the transfer of best practices, and help explain why some integrated care initiatives succeed and some fail.

  13. The Knowledge Translation Toolkit: Bridging the Know–Do Gap: A ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-06-06

    Jun 6, 2011 ... The book covers an array of crucial KT enablers — from context mapping to evaluative thinking — supported by practical examples, implementation guides, and references. Drawing from the experience of specialists in relevant disciplines around the world, The Knowledge Translation Toolkit aims to ...

  14. How To Create a Community Guide to Your School District's Budget. School Finance Toolkit.

    Science.gov (United States)

    Hassel, Bryan C.

    This toolkit helps community-based organizations create a community guide to the school budget, demystifying school finance for citizens and engaging them in the process of using the school budget as a tool for school improvement. It explains the major steps organizations have used in their own initiatives, offering advice and examples of tools.…

  15. The Places Toolkit for the Impact Assessment of Science Communication Initiatives and Policies

    DEFF Research Database (Denmark)

    Ravn, Tine; Mejlgaard, Niels

    2012-01-01

    This document has been created to serve as an instrument for measuring the impact of initiatives and policies within the area of science communication and scientific culture in general (SCIP: Science Communication Initiatives and Policies). The toolkit is part of the European project PLACES (Platform of Local Authorities and Communicators Engaged in Science).

  16. University of Central Florida and the American Association of State Colleges and Universities: Blended Learning Toolkit

    Science.gov (United States)

    EDUCAUSE, 2014

    2014-01-01

    The Blended Learning Toolkit supports the course redesign approach, and interest in its openly available clearinghouse of online tools, strategies, curricula, and other materials to support the adoption of blended learning continues to grow. When the resource originally launched in July 2011, 20 AASCU [American Association of State Colleges and…

  17. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    Science.gov (United States)

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. The MOLGENIS toolkit : rapid prototyping of biosoftware at the push of a button

    NARCIS (Netherlands)

    Swertz, Morris A.; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K.; Kanterakis, Alexandros; Roos, Erik T.; Lops, Joris; Thorisson, Gudmundur A.; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J.; de Brock, Engbert O.; Jansen, Ritsert C.; Parkinson, Helen

    2010-01-01

    Background: There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly

  19. School Turnaround Leaders: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    Science.gov (United States)

    Public Impact, 2008

    2008-01-01

    This toolkit includes the following separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides a list of competencies that would…

  20. School Turnaround Teachers: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    Science.gov (United States)

    Public Impact, 2008

    2008-01-01

    This toolkit includes these separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides turnaround teacher competencies that are the…

  1. The Sense-It App: A Smartphone Sensor Toolkit for Citizen Inquiry Learning

    Science.gov (United States)

    Sharples, Mike; Aristeidou, Maria; Villasclaras-Fernández, Eloy; Herodotou, Christothea; Scanlon, Eileen

    2017-01-01

    The authors describe the design and formative evaluation of a sensor toolkit for Android smartphones and tablets that supports inquiry-based science learning. The Sense-it app enables a user to access all the motion, environmental and position sensors available on a device, linking these to a website for shared crowd-sourced investigations. The…

  2. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  3. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    Science.gov (United States)

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children…

  4. ISO/IEEE 11073 PHD message generation toolkit to standardize healthcare device.

    Science.gov (United States)

    Lim, Joon-Ho; Park, Chanyong; Park, Soo-Jun; Lee, Kyu-Chul

    2011-01-01

    As the senior population increases, various healthcare devices and services are being developed, such as fall-detection devices and home hypertension management services. However, to vitalize the market for healthcare devices and services, standardization for interoperability between devices and services must come first. To achieve this standardization goal, the IEEE 11073 Personal Health Device (PHD) group has standardized many healthcare devices, but until now there have been few devices compatible with the PHD standard. One of the main reasons is that it is not easy for device manufacturers to implement a standard communication module by analyzing standard documents of over 600 pages. In this paper, we propose a standard message generation toolkit to easily standardize existing non-standard healthcare devices. The proposed toolkit generates standard PHD messages from the inputted device information, and the generated messages are adapted to the device with the standard state machine file. For the experiments, we developed a reference H/W and tested the proposed toolkit with three healthcare devices: a blood pressure monitor, a weighing scale, and a glucose meter. The proposed toolkit has the advantage that even if the user does not know the standard in detail, the user can easily standardize non-standard healthcare devices.

  5. Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications

    NARCIS (Netherlands)

    Gonsalves, Atish; Ternier, Stefaan; De Vries, Fred; Specht, Marcus

    2012-01-01

    Gonsalves, A., Ternier, S., De Vries, F., & Specht, M. (2012). Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications. In M. Specht, M. Sharples, & J. Multisilta (Eds.), Proceedings of 11th World Conference on Mobile and Contextual Learning (mLearn 2012) (pp.

  6. Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications

    NARCIS (Netherlands)

    Gonsalves, Atish; Ternier, Stefaan; De Vries, Fred; Specht, Marcus

    2013-01-01

    Gonsalves, A., Ternier, S., De Vries, F., & Specht, M. (2012, 16-18 October). Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications. Presentation given at the 11th World Conference on Mobile and Contextual Learning (mLearn 2012), Helsinki, Finland.

  7. Development of the Mississippi communities for healthy living nutrition education toolkit

    Science.gov (United States)

    The objective of our study was to develop a nutrition education toolkit for communities in the Lower Mississippi Delta (LMD) with content that is current, evidence-based, culturally relevant, and user friendly. The Mississippi Communities for Healthy Living (MCHL), an evidence-based nutrition educa...

  8. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including the 35-ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta- and event-data in several formats, and relevant general utilities. Examples of algorithms that experiments have contributed to date are: photon propagation; particle identification; hit finding, track finding and fitting; electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  9. Assessing the effectiveness of the Pesticides and Farmworker Health Toolkit: a curriculum for enhancing farmworkers' understanding of pesticide safety concepts.

    Science.gov (United States)

    LePrevost, Catherine E; Storm, Julia F; Asuaje, Cesar R; Arellano, Consuelo; Cope, W Gregory

    2014-01-01

    Among agricultural workers, migrant and seasonal farmworkers have been recognized as a special risk population because these laborers encounter cultural challenges and linguistic barriers while attempting to maintain their safety and health within their working environments. The crop-specific Pesticides and Farmworker Health Toolkit (Toolkit) is a pesticide safety and health curriculum designed to communicate to farmworkers pesticide hazards commonly found in their working environments and to address Worker Protection Standard (WPS) pesticide training criteria for agricultural workers. The goal of this preliminary study was to test evaluation items for measuring knowledge increases among farmworkers and to assess the effectiveness of the Toolkit in improving farmworkers' knowledge of key WPS and risk communication concepts when the Toolkit lesson was delivered by trained trainers in the field. After receiving training on the curriculum, four participating trainers provided lessons using the Toolkit as part of their regular training responsibilities and orally administered a pre- and post-lesson evaluation instrument to 20 farmworker volunteers who were generally representative of the national farmworker population. Farmworker knowledge of pesticide safety messages increased significantly after the lesson. This study suggests that the Pesticides and Farmworker Health Toolkit is an effective, research-based pesticide safety and health intervention for the at-risk farmworker population and identifies a testing format appropriate for evaluating the Toolkit and other similar interventions for farmworkers in the field.

  10. Effects of a Short Video-Based Resident-as-Teacher Training Toolkit on Resident Teaching.

    Science.gov (United States)

    Ricciotti, Hope A; Freret, Taylor S; Aluko, Ashley; McKeon, Bri Anne; Haviland, Miriam J; Newman, Lori R

    2017-10-01

    To pilot a short video-based resident-as-teacher training toolkit and assess its effect on resident teaching skills in clinical settings. A video-based resident-as-teacher training toolkit was previously developed by educational experts at Beth Israel Deaconess Medical Center, Harvard Medical School. Residents were recruited from two academic hospitals, watched two videos from the toolkit ("Clinical Teaching Skills" and "Effective Clinical Supervision"), and completed an accompanying self-study guide. A novel assessment instrument for evaluating the effect of the toolkit on teaching was created through a modified Delphi process. Before and after the intervention, residents were observed leading a clinical teaching encounter and scored using the 15-item assessment instrument. The primary outcome of interest was the change in number of skills exhibited, which was assessed using the Wilcoxon signed-rank test. Twenty-eight residents from two academic hospitals were enrolled, and 20 (71%) completed all phases of the study. More than one third of residents who volunteered to participate reported no prior formal teacher training. After completing two training modules, residents demonstrated a significant increase in the median number of teaching skills exhibited in a clinical teaching encounter, from 7.5 (interquartile range 6.5-9.5) to 10.0 (interquartile range 9.0-11.5). Among the individual skills assessed, there were significant improvements in asking for the learner's perspective (P=.01), providing feedback (P=.005), and encouraging questions (P=.046). Using a resident-as-teacher video-based toolkit was associated with improvements in teaching skills in residents from multiple specialties.

  11. Dental Assistants

    Science.gov (United States)

    ... State & Area Data Explore resources for employment and wages by state and area for dental assistants. Similar Occupations Compare the job duties, education, job growth, and pay of dental assistants with ...

  12. Assistive Technology

    Science.gov (United States)

    ... Assistive technology (AT) is any service or tool that helps ... be difficult or impossible. For older adults, such technology may be a walker to improve mobility or ...

  13. Assistive Technologies

    Science.gov (United States)

    Auat Cheein, Fernando A., Ed.

    2012-01-01

    This book offers the reader new achievements within the Assistive Technology field made by worldwide experts, covering aspects such as assistive technology focused on teaching and education, mobility, communication and social interactivity, among others. Each chapter included in this book covers one particular aspect of Assistive Technology that…

  14. Housing Assistance

    Directory of Open Access Journals (Sweden)

    Emma Baker

    2013-07-01

    Full Text Available In Australia, an increasing number of households face problems of access to suitable housing in the private market. In response, the Federal and State Governments share responsibility for providing housing assistance to these, mainly low-income, households. A broad range of policy instruments are used to provide and maintain housing assistance across all housing tenures, for example, assisting entry into homeownership, providing affordability assistance in the private rental market, and the provision of socially owned and managed housing options. Underlying each of these interventions is the premise that secure, affordable, and appropriate housing provides not only shelter but also a number of nonshelter benefits to individuals and their households. Although the nonshelter outcomes of housing are well acknowledged in Australia, the understanding of the nonshelter outcomes of housing assistance is less clear. This paper explores nonshelter outcomes of three of the major forms of housing assistance provided by Australian governments—low-income mortgage assistance, social housing, and private rent assistance. It is based upon analysis of a survey of 1,353 low-income recipients of housing assistance, and specifically measures the formulation of health and well-being, financial stress, and housing satisfaction outcomes across these three assistance types. We find clear evidence that health, finance, and housing satisfaction outcomes are associated with quite different factors for individuals in these three major housing assistance types.

  15. A GIS Software Toolkit for Converting NASA HDF-EOS Data Products to GIS and Other Geospatial Formats Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for easily converting NASA HDF-EOS data into formats that can be readily used within a Geographic...

  16. ViewBS: a powerful toolkit for visualization of high-throughput bisulfite sequencing data.

    Science.gov (United States)

    Huang, Xiaosan; Zhang, Shaoling; Li, Kongqing; Thimmapuram, Jyothi; Xie, Shaojun

    2017-10-26

    High-throughput bisulfite sequencing (BS-seq) is an important technology for generating single-base DNA methylomes in both plants and animals. To accelerate the analysis of BS-seq data, toolkits for visualization are required. ViewBS, an open-source toolkit, can extract and visualize DNA methylome data easily and flexibly. By using Tabix, ViewBS can visualize BS-seq data for large datasets quickly. ViewBS can generate publication-quality figures, such as meta-plots, heat maps and violin-boxplots, which can help users to answer biological questions. We illustrate its application using BS-seq data from Arabidopsis thaliana. ViewBS is freely available at: https://github.com/xie186/ViewBS. xie186@purdue.edu. Supplementary data are available at Bioinformatics online.
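
    The core of the meta-plot computation that ViewBS automates is a coverage-weighted average of methylation over relative positions across many regions. The following is a minimal, generic Python sketch of that binning step (synthetic data and illustrative names; it is not ViewBS code):

        import numpy as np

        # Hypothetical input: for each cytosine inside a gene body, its relative
        # position (0..1 along the gene), methylated read count, and total coverage.
        rng = np.random.default_rng(0)
        frac = rng.uniform(0.0, 1.0, size=5000)          # relative position in gene body
        total = rng.integers(5, 50, size=5000)           # read coverage per cytosine
        meth = rng.binomial(total, 0.3 * (1 - frac))     # toy methylation signal

        n_bins = 20
        bins = np.minimum((frac * n_bins).astype(int), n_bins - 1)

        # Weighted methylation level per bin: sum(methylated reads) / sum(total reads)
        meth_sum = np.bincount(bins, weights=meth, minlength=n_bins)
        total_sum = np.bincount(bins, weights=total, minlength=n_bins)
        meta_profile = meth_sum / total_sum

        for i, level in enumerate(meta_profile):
            print(f"bin {i:2d}: weighted methylation = {level:.3f}")

    Plotting meta_profile against bin position gives the familiar gene-body meta-plot; ViewBS performs the equivalent aggregation directly from Tabix-indexed methylation calls.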

  17. The semantic measures library and toolkit: fast computation of semantic similarity and relatedness using biomedical ontologies.

    Science.gov (United States)

    Harispe, Sébastien; Ranwez, Sylvie; Janaqi, Stefan; Montmain, Jacky

    2014-03-01

    The semantic measures library and toolkit are robust open-source and easy to use software solutions dedicated to semantic measures. They can be used for large-scale computations and analyses of semantic similarities between terms/concepts defined in terminologies and ontologies. The comparison of entities (e.g. genes) annotated by concepts is also supported. A large collection of measures is available. Not limited to a specific application context, the library and the toolkit can be used with various controlled vocabularies and ontology specifications (e.g. Open Biomedical Ontology, Resource Description Framework). The project targets both designers and practitioners of semantic measures providing a JAVA library, as well as a command-line tool that can be used on personal computers or computer clusters. Downloads, documentation, tutorials, evaluation and support are available at http://www.semantic-measures-library.org.
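
    The library itself is written in Java; purely to illustrate the kind of computation it performs, the sketch below computes Resnik and Lin similarity on a toy ontology in Python, using information content derived from made-up annotation counts (all term names and numbers are invented for the example):

        import math

        # Toy ontology: child -> parents (a tiny DAG standing in for GO or another ontology).
        parents = {
            "calcium_ion_binding": ["metal_ion_binding"],
            "metal_ion_binding": ["ion_binding"],
            "ion_binding": ["binding"],
            "protein_binding": ["binding"],
            "binding": [],
        }
        # Made-up cumulative annotation counts used to estimate term probabilities.
        counts = {"calcium_ion_binding": 5, "metal_ion_binding": 20,
                  "ion_binding": 40, "protein_binding": 60, "binding": 100}

        def ancestors(term):
            """Return the term plus all of its ancestors in the DAG."""
            seen, stack = {term}, [term]
            while stack:
                for p in parents[stack.pop()]:
                    if p not in seen:
                        seen.add(p)
                        stack.append(p)
            return seen

        root_count = counts["binding"]
        ic = {t: -math.log(c / root_count) for t, c in counts.items()}  # information content

        def resnik(t1, t2):
            common = ancestors(t1) & ancestors(t2)
            return max(ic[t] for t in common)      # IC of the most informative common ancestor

        def lin(t1, t2):
            return 2 * resnik(t1, t2) / (ic[t1] + ic[t2])

        print(resnik("calcium_ion_binding", "protein_binding"))   # 0.0: only the root is shared
        print(lin("calcium_ion_binding", "metal_ion_binding"))    # closer terms score higher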

  18. Experiences in the Gridification of the Geant4 Toolkit in the WLCG/EGEE Environment

    CERN Document Server

    Mendez-Lorenzo, R; Ribon, A

    2007-01-01

    The general patterns observed in supporting the Geant4 application in the EGEE infrastructure are discussed. Regression testing of Geant4 public releases is the focus of this paper. Geant4 is a toolkit for the Monte Carlo simulation of the interaction of particles with matter, used in a wide range of research fields, including high energy and nuclear physics as well as medical, accelerator and space physics studies. The support required for the release regression testing of the Geant4 toolkit, including the setting up of a new, official Virtual Organization in the EGEE, is explained. Recent developments of automatic regression testing suites and the benefits of the optimization layer above the standard Grid infrastructure are presented.

  19. Creation of an SWMM Toolkit for Its Application in Urban Drainage Networks Optimization

    Directory of Open Access Journals (Sweden)

    F. Javier Martínez-Solano

    2016-06-01

    Full Text Available The Storm Water Management Model (SWMM) is a dynamic simulation engine for flow in sewer systems developed by the USEPA. It has been successfully used for analyzing and designing both storm water and waste water systems. However, despite including some interfacing functions, these functions are insufficient for certain simulations. This paper describes some new functions that have been added to the existing ones to form a library of functions (Toolkit). The Toolkit presented here allows the direct modification of network data during simulation without the need to access the input file. To support the use of this library, a testing protocol was performed in order to evaluate both calculation time and accuracy of results. Finally, a case study is presented. In this application, the library is used for the design of a sewerage network by means of a genetic algorithm based on successive iterations.
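
    The functions described make it possible to drive SWMM from an optimization loop without rewriting the input file on each iteration. The sketch below shows that pattern schematically in Python; set_conduit_diameter, run_simulation and get_node_flooding are hypothetical stand-ins for the toolkit's actual API, and the cost and flooding numbers are invented:

        import random

        # Hypothetical bindings to a SWMM-like toolkit; a real library would call the
        # compiled engine rather than these Python stubs.
        def set_conduit_diameter(network, conduit_id, diameter_m):
            network["diameters"][conduit_id] = diameter_m

        def run_simulation(network):
            pass  # the engine would route the design storm through the network here

        def get_node_flooding(network):
            # Stand-in: pretend flooding decreases as total pipe capacity grows.
            return max(0.0, 20.0 - 8.0 * sum(network["diameters"].values()))

        COST_PER_M = {0.3: 80.0, 0.4: 110.0, 0.5: 150.0, 0.6: 200.0}   # made-up unit costs

        def evaluate(network, diameters):
            for cid, d in diameters.items():
                set_conduit_diameter(network, cid, d)
            run_simulation(network)
            cost = sum(COST_PER_M[d] for d in diameters.values())
            return cost + 1000.0 * get_node_flooding(network)          # heavy flooding penalty

        network = {"diameters": {f"C{i}": 0.3 for i in range(5)}}
        best = None
        for _ in range(200):                                           # crude random search in place of a GA
            candidate = {cid: random.choice(list(COST_PER_M)) for cid in network["diameters"]}
            score = evaluate(network, candidate)
            if best is None or score < best[0]:
                best = (score, candidate)
        print("best penalized cost:", best[0], "design:", best[1])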

  20. Measuring the Environmental Dimensions of Human Migration: The Demographer’s Toolkit

    Science.gov (United States)

    Hunter, Lori M.; Gray, Clark L.

    2014-01-01

    In recent years, the empirical literature linking environmental factors and human migration has grown rapidly and gained increasing visibility among scholars and the policy community. Still, this body of research uses a wide range of methodological approaches for assessing environment-migration relationships. Without comparable data and measures across a range of contexts, it is impossible to make generalizations that would facilitate the development of future migration scenarios. Demographic researchers have a large methodological toolkit for measuring migration as well as modeling its drivers. This toolkit includes population censuses, household surveys, survival analysis and multi-level modeling. This paper’s purpose is to introduce climate change researchers to demographic data and methods and to review exemplary studies of the environmental dimensions of human migration. Our intention is to foster interdisciplinary understanding and scholarship, and to promote high quality research on environment and migration that will lead toward broader knowledge of this association. PMID:25177108
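
    As a concrete illustration of one item in that toolkit, the sketch below computes a Kaplan-Meier curve for time to first migration from (possibly censored) event-history data; the data are synthetic and the code, a compact tie-naive form of the estimator, is not tied to any particular survey:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 500
        latent_time = rng.exponential(scale=8.0, size=n)     # years until first migration (unobserved truth)
        follow_up = rng.uniform(1.0, 10.0, size=n)           # years each respondent is observed
        observed = np.minimum(latent_time, follow_up)        # observed duration
        event = latent_time <= follow_up                     # True = migration observed, False = censored

        # Kaplan-Meier estimate of S(t) = P(no migration by time t)
        order = np.argsort(observed)
        survival, at_risk, curve = 1.0, n, []
        for t, d in zip(observed[order], event[order]):
            if d:                                            # an observed migration at time t
                survival *= (at_risk - 1) / at_risk
                curve.append((t, survival))
            at_risk -= 1                                     # censored or not, this person leaves the risk set

        for horizon in (1.0, 2.0, 5.0, 9.0):
            s = next((s for t, s in reversed(curve) if t <= horizon), 1.0)
            print(f"P(no migration within {horizon:.0f} years) ~ {s:.3f}")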

  1. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use at of the Trick Simulation Toolkit, a simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally it describes the Trick NASA/Open source project on Github.

  2. TMVA (Toolkit for Multivariate Analysis) new architectures design and implementation.

    CERN Document Server

    Zapata Mesa, Omar Andres

    2016-01-01

    The Toolkit for Multivariate Analysis (TMVA) is a package in ROOT that provides machine learning algorithms for the classification and regression of events in detectors. In TMVA, we are developing new high-level algorithms to support multivariate analysis, such as cross-validation, hyperparameter optimization and variable importance. Almost all of these algorithms are computationally expensive and designed to process large amounts of data, so it is very important to exploit parallel computing technologies to reduce processing times.
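
    To make the cross-validation and hyperparameter-optimization ideas concrete outside of ROOT/TMVA itself, here is a short scikit-learn sketch on synthetic event-like data (illustrative only; it does not use the TMVA API):

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import GridSearchCV

        rng = np.random.default_rng(6)
        X = rng.normal(size=(2000, 8))                                   # stand-in event features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

        search = GridSearchCV(
            GradientBoostingClassifier(random_state=0),
            param_grid={"n_estimators": [50, 100], "max_depth": [2, 3]},
            cv=5,
            n_jobs=-1,                                                   # folds evaluated in parallel
        )
        search.fit(X, y)
        print("best parameters:", search.best_params_)
        print(f"best cross-validated accuracy: {search.best_score_:.3f}")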

  3. Mantis a framework and toolkit for Geant4-based simulation in CMS

    CERN Document Server

    Stavrianakou, Maya

    2002-01-01

    Mantis is a framework and toolkit for Geant4-based interactive or batch simulation using the CMS OO architecture implemented in the COBRA project. Mantis provides the infrastructure for the selection, implementation, configuration and tuning of all essential simulation elements: detector geometry, sensitive detectors and hits, generators and physics, magnetic field, visualisation, run and event management, and user monitoring actions. Persistency and other important services are available using the standard COBRA infrastructure and are transparent to user applications.

  4. The UK Earth System Models Marine Biogeochemical Evaluation Toolkit, BGC-val

    Science.gov (United States)

    de Mora, Lee

    2017-04-01

    The Biogeochemical Validation toolkit, BGC-val, is a model- and grid-independent Python-based marine model evaluation framework that automates much of the validation of the marine component of an Earth System Model. BGC-val was initially developed to be a flexible and extensible system to evaluate the spin-up of the marine UK Earth System Model (UKESM). However, its grid independence and flexibility mean that it is straightforward to adapt the BGC-val framework to evaluate other marine models. In addition to the marine component of the UKESM, this toolkit has been adapted to compare multiple models, including models from the CMIP5 and iMarNet inter-comparison projects. The BGC-val toolkit produces multiple levels of analysis which are presented in a simple-to-use interactive HTML5 document. Level 1 contains time series analyses, showing the development over time of many important biogeochemical and physical ocean metrics, such as global primary production or the Drake Passage current. The second level of BGC-val is an in-depth spatial analysis of a single point in time. This is a series of point-to-point comparisons of model and data in various regions, such as a comparison of surface nitrate in the model against data from the World Ocean Atlas. The third-level analyses are specialised ad hoc packages that go in depth on a specific question, such as the development of oxygen minimum zones in the Equatorial Pacific. In addition to the three levels, the HTML5 document opens with a Level 0 table showing a summary of the status of the model run. The beta version of this toolkit is available via the Plymouth Marine Laboratory GitLab server and uses the BSD 3-clause license.
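
    A minimal sketch of the kind of Level 2 point-to-point comparison described above, assuming the model field and the observational climatology have already been regridded onto a common grid (the arrays here are synthetic stand-ins for, e.g., surface nitrate from the model and from the World Ocean Atlas):

        import numpy as np

        rng = np.random.default_rng(2)
        obs = rng.gamma(shape=2.0, scale=3.0, size=(180, 360))        # "observed" surface nitrate field
        model = obs * 0.9 + rng.normal(0.0, 1.0, size=obs.shape)      # a model field with bias and noise
        mask = np.isfinite(obs) & np.isfinite(model)                  # points with valid data in both

        m, o = model[mask].ravel(), obs[mask].ravel()
        bias = np.mean(m - o)
        rmse = np.sqrt(np.mean((m - o) ** 2))
        corr = np.corrcoef(m, o)[0, 1]
        print(f"bias = {bias:.3f}, RMSE = {rmse:.3f}, r = {corr:.3f}")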

  5. Developing an alcohol policy assessment toolkit: application in the western Pacific

    OpenAIRE

    Carragher, Natacha; Byrnes, Joshua; Doran, Christopher M; Shakeshaft, Anthony

    2014-01-01

    Abstract Objective To demonstrate the development and feasibility of a tool to assess the adequacy of national policies aimed at reducing alcohol consumption and related problems. Methods We developed a quantitative tool – the Toolkit for Evaluating Alcohol policy Stringency and Enforcement (TEASE-16) – to assess the level of stringency and enforcement of 16 alcohol control policies. TEASE-16 was applied to policy data from nine study areas in the western Pacific: Australia, China excluding H...

  6. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data

    OpenAIRE

    Althammer, Sonja Daniela; González-Vallinas Rostes, Juan, 1983-; Ballaré, Cecilia Julia; Beato, Miguel; Eyras Jiménez, Eduardo

    2011-01-01

    Motivation: High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. Results: We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or b...

  7. ChemDataExtractor: A Toolkit for Automated Extraction of Chemical Information from the Scientific Literature.

    OpenAIRE

    Swain, Matthew C; Cole, Jacqueline Manina

    2016-01-01

    The emergence of "big data" initiatives has led to the need for tools that can automatically extract valuable chemical information from large volumes of unstructured data, such as the scientific literature. Since chemical information can be present in figures, tables, and textual paragraphs, successful information extraction often depends on the ability to interpret all of these domains simultaneously. We present a complete toolkit for the automated extraction of chemical entities and their a...

  8. Vi-XFST: a visual interface for Xerox finite-state toolkit

    OpenAIRE

    Yılmaz, Yasin

    2003-01-01

    This thesis presents a management model and integrated development environment software for finite-state network projects using the Xerox Finite-State Toolkit (XFST). XFST is a popular command-line tool for constructing finite-state networks, used in natural language processing research. However, XFST lacks the more sophisticated management features needed during the development phase of large projects with hundreds of finite-state definitions. In this thesis, we introduce a new approach to XFST...

  9. WAVOS: a MATLAB toolkit for wavelet analysis and visualization of oscillatory systems

    Directory of Open Access Journals (Sweden)

    Harang Richard

    2012-03-01

    Full Text Available Abstract Background Wavelets have proven to be a powerful technique for the analysis of periodic data, such as those that arise in the analysis of circadian oscillators. While many implementations of both continuous and discrete wavelet transforms are available, we are aware of no software that has been designed with the nontechnical end-user in mind. By developing a toolkit that makes these analyses accessible to end users without significant programming experience, we hope to promote the more widespread use of wavelet analysis. Findings We have developed the WAVOS toolkit for wavelet analysis and visualization of oscillatory systems. WAVOS features both the continuous (Morlet) and discrete (Daubechies) wavelet transforms, with a simple, user-friendly graphical user interface within MATLAB. The interface allows for data to be imported from a number of standard file formats, visualized, processed and analyzed, and exported without use of the command line. Our work has been motivated by the challenges of circadian data, thus default settings appropriate to the analysis of such data have been pre-selected in order to minimize the need for fine-tuning. The toolkit is flexible enough to deal with a wide range of oscillatory signals, however, and may be used in more general contexts. Conclusions We have presented WAVOS: a comprehensive wavelet-based MATLAB toolkit that allows for easy visualization, exploration, and analysis of oscillatory data. WAVOS includes both the Morlet continuous wavelet transform and the Daubechies discrete wavelet transform. We have illustrated the use of WAVOS, and demonstrated its utility for the analysis of circadian data on both bioluminescence and wheel-running data. WAVOS is freely available at http://sourceforge.net/projects/wavos/files/
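
    WAVOS itself is a MATLAB GUI; the sketch below (plain Python/NumPy, unrelated to the WAVOS code base) shows the core idea behind its continuous-wavelet view: convolving a signal with complex Morlet wavelets of different scales to find the dominant period of a noisy circadian rhythm. The wavelet support is truncated at three scale widths to keep the sketch short.

        import numpy as np

        dt = 0.5                                     # sampling interval, hours
        t = np.arange(0, 10 * 24, dt)                # ten days of data
        trace = np.cos(2 * np.pi * t / 24.0) + 0.3 * np.random.default_rng(3).normal(size=t.size)

        def morlet(scale, w0=6.0):
            """Complex Morlet wavelet sampled on the signal's time step, truncated at 3 scales."""
            tau = np.arange(-3 * scale, 3 * scale + dt, dt)
            x = tau / scale
            return np.exp(1j * w0 * x) * np.exp(-x ** 2 / 2) / np.sqrt(scale)

        periods = np.arange(16.0, 33.0, 1.0)                    # candidate periods in hours
        scales = periods * 6.0 / (2 * np.pi)                    # scale whose centre frequency is 1/period (w0=6)
        power = np.array([np.abs(np.convolve(trace, morlet(s), mode="same")) ** 2 for s in scales])

        dominant = periods[power.mean(axis=1).argmax()]
        print(f"dominant period ~ {dominant:.0f} h")            # expect ~24 h for this synthetic signal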

  10. PlantGSEA: a gene set enrichment analysis toolkit for plant community

    OpenAIRE

    Yi, Xin; Du, Zhou; Su, Zhen

    2013-01-01

    Gene Set Enrichment Analysis (GSEA) is a powerful method for interpreting biological meaning of a list of genes by computing the overlaps with various previously defined gene sets. As one of the most widely used annotations for defining gene sets, Gene Ontology (GO) system has been used in many enrichment analysis tools. EasyGO and agriGO, two GO enrichment analysis toolkits developed by our laboratory, have gained extensive usage and citations since their releases because of their effective ...
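
    Under the hood, enrichment of a gene list against a predefined gene set is commonly scored with a hypergeometric test; the generic Python sketch below (not PlantGSEA code; all counts are invented) shows the calculation:

        from scipy.stats import hypergeom

        population = 20000        # genes in the genome / background
        set_size = 150            # genes annotated to the GO term or gene set
        list_size = 300           # genes in the user's list (e.g. differentially expressed)
        overlap = 12              # list genes that carry the annotation

        # P(X >= overlap) when drawing list_size genes without replacement
        p_value = hypergeom.sf(overlap - 1, population, set_size, list_size)
        fold_enrichment = (overlap / list_size) / (set_size / population)
        print(f"fold enrichment = {fold_enrichment:.1f}, p = {p_value:.2e}")

    Tools of this kind repeat the test for every gene set and then apply a multiple-testing correction across the resulting p values.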

  11. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia.

    Science.gov (United States)

    Haines, Seena L; Summa, Maria A; Peeters, Michael J; Dy-Boarman, Eliza A; Boyle, Jaclyn A; Clifford, Kalin M; Willson, Megan N

    2017-09-01

    The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. Eighteen institutions provided materials; five provided materials describing didactic coursework; over fifteen provided materials for an academia-focused Advanced Pharmacy Practice Experience (APPE), while one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. Pharmacy faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents in the academic pillars of teaching, scholarship and service is critical for the future success of the academy. Published by Elsevier Inc.

  12. FATES: a flexible analysis toolkit for the exploration of single-particle mass spectrometer data

    Science.gov (United States)

    Sultana, Camille M.; Cornwell, Gavin C.; Rodriguez, Paul; Prather, Kimberly A.

    2017-04-01

    Single-particle mass spectrometer (SPMS) analysis of aerosols has become increasingly popular since its invention in the 1990s. Today many iterations of commercial and lab-built SPMSs are in use worldwide. However, supporting analysis toolkits for these powerful instruments are outdated, have limited functionality, or are versions that are not available to the scientific community at large. In an effort to advance this field and allow better communication and collaboration between scientists, we have developed FATES (Flexible Analysis Toolkit for the Exploration of SPMS data), a MATLAB toolkit easily extensible to an array of SPMS designs and data formats. FATES was developed to minimize the computational demands of working with large data sets while still allowing easy maintenance, modification, and utilization by novice programmers. FATES permits scientists to explore, without constraint, complex SPMS data with simple scripts in a language popular for scientific numerical analysis. In addition FATES contains an array of data visualization graphic user interfaces (GUIs) which can aid both novice and expert users in calibration of raw data; exploration of the dependence of mass spectral characteristics on size, time, and peak intensity; and investigations of clustered data sets.

  13. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    Science.gov (United States)

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
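
    To make the 'machine-learning scoring function' idea concrete without depending on ODDT's own API, here is a generic scikit-learn sketch in the spirit of RF-Score: a random forest regressor trained on protein-ligand contact descriptors to predict binding affinity (both the descriptors and the affinities are synthetic placeholders):

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(4)
        # Placeholder descriptors: e.g. counts of protein-ligand atom-type contacts within a cutoff.
        X = rng.poisson(lam=5.0, size=(1000, 36)).astype(float)
        # Synthetic "experimental" affinities loosely tied to the descriptors.
        y = X[:, :6].sum(axis=1) * 0.1 + rng.normal(0.0, 0.5, size=1000)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_train, y_train)
        print(f"held-out R^2 = {model.score(X_test, y_test):.2f}")

    In a real pipeline the descriptor matrix would come from docked or crystallographic protein-ligand complexes rather than random numbers.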

  14. Midwives in medical student and resident education and the development of the medical education caucus toolkit.

    Science.gov (United States)

    Radoff, Kari; Nacht, Amy; McConaughey, Edie; Salstrom, Jan; Schelling, Karen; Seger, Suzanne

    2015-01-01

    Midwives have been involved formally and informally in the training of medical students and residents for many years. Recent reductions in resident work hours, emphasis on collaborative practice, and a focus on midwives as key members of the maternity care model have increased the involvement of midwives in medical education. Midwives work in academic settings as educators to teach the midwifery model of care, collaboration, teamwork, and professionalism to medical students and residents. In 2009, members of the American College of Nurse-Midwives formed the Medical Education Caucus (MECA) to discuss the needs of midwives teaching medical students and residents; the group has held a workshop annually over the last 4 years. In 2014, MECA workshop facilitators developed a toolkit to support and formalize the role of midwives involved in medical student and resident education. The MECA toolkit provides a roadmap for midwives beginning involvement and continuing or expanding the role of midwives in medical education. This article describes the history of midwives in medical education, the development and growth of MECA, and the resulting toolkit created to support and formalize the role of midwives as educators in medical student and resident education, as well as common challenges for the midwife in academic medicine. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health. © 2015 by the American College of Nurse-Midwives.

  15. An adaptive toolkit for image quality evaluation in system performance test of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, Guozhi; Petrov, Dimitar; Marshall, Nicholas; Bosmans, Hilde

    2017-03-01

    Digital breast tomosynthesis (DBT) is a relatively new diagnostic imaging modality for women. Currently, various models of DBT systems are available on the market and the number of installations is rapidly increasing. EUREF, the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services, has proposed a preliminary Guideline - protocol for the quality control of the physical and technical aspects of digital breast tomosynthesis systems, with an ultimate aim of providing limiting values guaranteeing proper performance for different applications of DBT. In this work, we introduce an adaptive toolkit developed in accordance with this guideline to facilitate the process of image quality evaluation in DBT performance test. This toolkit implements robust algorithms to quantify various technical parameters of DBT images and provides a convenient user interface in practice. Each test is built into a separate module with configurations set corresponding to the European guideline, which can be easily adapted to different settings and extended with additional tests. This toolkit largely improves the efficiency for image quality evaluation of DBT. It is also going to evolve with the development of protocols in quality control of DBT systems.

  16. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or ftp interfaces. Empirical models are mostly fortran based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
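
    The access pattern described (probe local directories first, then a remote database or FTP server, and cache the result) is easy to mimic; the function and path names in the sketch below are illustrative and are not DaViTpy's actual interface:

        import os
        import pickle

        CACHE_DIR = os.path.expanduser("~/.spacedata_cache")    # illustrative local cache location

        def fetch_remote(instrument, day):
            # Stand-in for a NoSQL/FTP fetch; a real implementation would download the file here.
            return {"instrument": instrument, "day": day, "values": [1.0, 2.0, 3.0]}

        def load_data(instrument, day):
            """Return data for one instrument/day, preferring the local cache."""
            os.makedirs(CACHE_DIR, exist_ok=True)
            path = os.path.join(CACHE_DIR, f"{instrument}_{day}.pkl")
            if os.path.exists(path):                            # 1. local directory
                with open(path, "rb") as fh:
                    return pickle.load(fh)
            data = fetch_remote(instrument, day)                # 2./3. remote database or FTP
            with open(path, "wb") as fh:                        # cache for next time
                pickle.dump(data, fh)
            return data

        print(load_data("omni", "2013-03-17")["values"])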

  17. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this

  18. The Identification of Potential Resilient Estuary-based Enterprises to Encourage Economic Empowerment in South Africa: a Toolkit Approach

    Directory of Open Access Journals (Sweden)

    Rebecca Bowd

    2012-09-01

    Full Text Available It has been argued that ecosystem services can be used as the foundation to provide economic opportunities to empower the disadvantaged. The Ecosystem Services Framework (ESF) approach for poverty alleviation, which balances resource conservation and human resource use, has received much attention in the literature. However, few projects have successfully achieved both conservation and economic objectives. This is partly due to there being a hiatus between theory and practice, owing to the absence of tools that help make the transition from conceptual frameworks and theory to the practical integration of ecosystem services into decision making. To address this hiatus, an existing conceptual framework for analyzing the robustness of social-ecological systems was translated into a practical toolkit to help understand the complexity of social-ecological systems (SES). The toolkit can be used by a diversity of stakeholders as a decision-making aid for assessing ecosystem services supply and demand and associated enterprise opportunities. The toolkit is participatory and combines a generic "top-down" scientific approach with a case-specific "bottom-up" approach. It promotes a shared understanding of the utilization of ecosystem services, which is the foundation of identifying resilient enterprises. The toolkit comprises four steps: (i) ecosystem services supply and demand assessment; (ii) roles identification; (iii) enterprise opportunity identification; and (iv) enterprise risk assessment, and was tested at two estuary study sites. Implementation of the toolkit requires the populating of preprogrammed Excel worksheets through the holding of workshops that are attended by stakeholders associated with the ecosystems. It was concluded that for an enterprise to be resilient, it must be resilient at an external SES level, which the toolkit addresses, and at an internal business functioning level, e.g., social dynamics among personnel, skills, and literacy

  19. The Development and Evaluation of an Online Healthcare Toolkit for Autistic Adults and their Primary Care Providers.

    Science.gov (United States)

    Nicolaidis, Christina; Raymaker, Dora; McDonald, Katherine; Kapp, Steven; Weiner, Michael; Ashkenazy, Elesia; Gerrity, Martha; Kripke, Clarissa; Platt, Laura; Baggs, Amelia

    2016-10-01

    The healthcare system is ill-equipped to meet the needs of adults on the autism spectrum. Our goal was to use a community-based participatory research (CBPR) approach to develop and evaluate tools to facilitate the primary healthcare of autistic adults. Toolkit development included cognitive interviewing and test-retest reliability studies. Evaluation consisted of a mixed-methods, single-arm pre/post-intervention comparison. Participants were 259 autistic adults and 51 primary care providers (PCPs) residing in the United States. The AASPIRE Healthcare toolkit includes the Autism Healthcare Accommodations Tool (AHAT)-a tool that allows patients to create a personalized accommodations report for their PCP-and general healthcare- and autism-related information, worksheets, checklists, and resources for patients and healthcare providers. Outcome measures included satisfaction with patient-provider communication, healthcare self-efficacy, barriers to healthcare, and satisfaction with the toolkit's usability and utility, along with responses to open-ended questions. Preliminary testing of the AHAT demonstrated strong content validity and adequate test-retest stability. Almost all patient participants (>94 %) felt that the AHAT and the toolkit were easy to use, important, and useful. In pre/post-intervention comparisons, the mean number of barriers decreased (from 4.07 to 2.82) and satisfaction with patient-provider communication improved (from 30.9 to 32.6, p = 0.03). Patients stated that the toolkit helped clarify their needs, enabled them to self-advocate and prepare for visits more effectively, and positively influenced provider behavior. Most of the PCPs surveyed read the AHAT (97 %), rated it as moderately or very useful (82 %), and would recommend it to other patients (87 %). The CBPR process resulted in a reliable healthcare accommodation tool and a highly accessible healthcare toolkit. Patients and providers indicated that the tools positively impacted healthcare interactions. The toolkit has the potential to reduce barriers to

  20. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Directory of Open Access Journals (Sweden)

    K Anderson

    Full Text Available This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  1. Evolving the US Climate Resilience Toolkit to Support a Climate-Smart Nation

    Science.gov (United States)

    Tilmes, C.; Niepold, F., III; Fox, J. F.; Herring, D.; Dahlman, L. E.; Hall, N.; Gardiner, N.

    2015-12-01

    Communities, businesses, resource managers, and decision-makers at all levels of government need information to understand and ameliorate climate-related risks. Likewise, climate information can expose latent opportunities. Moving from climate science to social and economic decisions raises complex questions about how to communicate the causes and impacts of climate variability and change; how to characterize and quantify vulnerabilities, risks, and opportunities faced by communities and businesses; and how to make and implement "win-win" adaptation plans at local, regional, and national scales. A broad coalition of federal agencies launched the U.S. Climate Resilience Toolkit (toolkit.climate.gov) in November 2014 to help our nation build resilience to climate-related extreme events. The site's primary audience is planners and decision makers in business, resource management, and government (at all levels) who seek science-based climate information and tools to help them in their near- and long-term planning. The Executive Office of the President assembled a task force of dozens of subject experts from across the 13 agencies of the U.S. Global Change Research Program to guide the site's development. The site's ongoing evolution is driven by feedback from the target audience. For example, based on feedback, climate projections will soon play a more prominent role in the site's "Climate Explorer" tool and case studies. The site's five-step adaptation planning process is being improved to better facilitate people getting started and to provide clear benchmarks for evaluating progress along the way. In this session, we will share lessons learned from a series of user engagements around the nation and evidence that the Toolkit couples climate information with actionable decision-making processes in ways that are helping Americans build resilience to climate-related stressors.

  2. The Medical Imaging Interaction Toolkit: challenges and advances: 10 years of open-source development.

    Science.gov (United States)

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  3. Exploring Landscape Change in Mountain Environments With the Mountain Legacy Online Image Analysis Toolkit

    Directory of Open Access Journals (Sweden)

    Mary Ellen Sanseverino

    2016-11-01

    Full Text Available Since 1996, Mountain Legacy Project (MLP) researchers have been exploring change in Canada's mountain environments through the use of systematic repeat photography. With access to upwards of 120,000 systematic glass plate negatives from Canada's mountain west, the MLP field teams seek to stand where historic surveyors stood and accurately reshoot these images. The resulting image pairs are analyzed, catalogued, and made available for further research into landscape changes. In this article we suggest that repeat photography would fit well within the Future Earth research agenda. We go on to introduce the Image Analysis Toolkit (IAT), which provides interactive comparative image visualization and analytics for a wide variety of ecological, geological, fluvial, and human phenomena. The toolkit is based on insights from recent research on repeat photography and features the following: user-controlled ability to compare, overlay, classify, scale, fade, draw, and annotate images; production of comparative statistics on user-defined categories (e.g., percentage of ice cover change in each image pair); and different ways to visualize change in the image pairs. The examples presented here utilize MLP image pairs, but the toolkit is designed to be used by anyone with their own comparative images as well as those in the MLP collection. All images and software are under Creative Commons copyright and are open access for noncommercial use via the Mountain Legacy Explorer website. The IAT is at the beginning of its software life cycle and will continue to develop features required by those who use repeat photography to discover change in mountain environments.

  4. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Science.gov (United States)

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).
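
    The final step described above, turning a captured frame plus its stored GPS metadata into a GIS-ready GeoTIFF, can be sketched with the GDAL Python bindings roughly as follows; the image array, corner coordinates and pixel size are placeholders that a real workflow would derive from the app's metadata and the camera model:

        import numpy as np
        from osgeo import gdal, osr

        # Placeholder image and footprint; in practice these come from the photo and app metadata.
        image = np.random.default_rng(5).integers(0, 255, size=(600, 800), dtype=np.uint8)
        west, north = -3.535, 50.735          # upper-left corner (lon, lat), illustrative values
        px_size = 1e-5                        # degrees per pixel, illustrative

        driver = gdal.GetDriverByName("GTiff")
        ds = driver.Create("frame_georef.tif", image.shape[1], image.shape[0], 1, gdal.GDT_Byte)
        ds.SetGeoTransform((west, px_size, 0.0, north, 0.0, -px_size))   # affine georeferencing

        srs = osr.SpatialReference()
        srs.ImportFromEPSG(4326)              # WGS84, as reported by the phone's GPS
        ds.SetProjection(srs.ExportToWkt())

        ds.GetRasterBand(1).WriteArray(image)
        ds.FlushCache()
        ds = None                             # close the dataset so the file is written to disk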

  5. OpenDBDDAS Toolkit: Secure MapReduce and Hadoop-like Systems

    KAUST Repository

    Fabiano, Enrico

    2015-06-01

    The OpenDBDDAS Toolkit is a software framework to provide support for more easily creating and expanding dynamic big data-driven application systems (DBDDAS) that are common in environmental systems, many engineering applications, disaster management, traffic management, and manufacturing. In this paper, we describe key features needed to implement a secure MapReduce and Hadoop-like system for high performance clusters that guarantees a certain level of privacy of data from other concurrent users of the system. We also provide examples of a secure MapReduce prototype and compare it to another high performance MapReduce, MR-MPI.

  6. Microgrid Design Toolkit (MDT) User Guide Software v1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM).

  7. Development of the eHealth Literacy Assessment Toolkit, eHLA.

    Science.gov (United States)

    Furstrand, Dorthe; Kayser, Lars

    2015-01-01

    In a world with rising focus on the use of eHealth, the match between the competences of the individual and the demands of eHealth systems becomes increasingly important, thus making assessment of eHealth literacy as a measure of user competences a vital element. We propose the eHealth Literacy Assessment toolkit, eHLA, evaluating the user by seven scales: computer familiarity, confidence, incentive and performance as well as functional health literacy, health literacy self-assessment and health literacy performance, as a first step toward development of technology that accommodates the literacy level of the user.

  8. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.

    Science.gov (United States)

    Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen

    2010-12-21

    There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases
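
    The 'model-driven' idea (one declarative model expanded by templates into database and interface code) can be illustrated in a few lines of Python. This is a conceptual sketch only; MOLGENIS's actual generator suite consumes XML models and emits SQL, Java, R and HTML:

        # A tiny declarative model: entity name -> field name -> SQL type.
        model = {
            "Sample": {"id": "INTEGER PRIMARY KEY", "donor": "TEXT", "tissue": "TEXT"},
            "Measurement": {"id": "INTEGER PRIMARY KEY", "sample_id": "INTEGER", "value": "REAL"},
        }

        def generate_sql(model):
            """Expand the model into CREATE TABLE statements (one 'generator' among many)."""
            statements = []
            for entity, fields in model.items():
                cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in fields.items())
                statements.append(f"CREATE TABLE {entity} (\n  {cols}\n);")
            return "\n\n".join(statements)

        def generate_html_form(model, entity):
            """A second generator: a bare-bones input form for one entity."""
            inputs = "\n".join(f'  <label>{name} <input name="{name}"></label>'
                               for name in model[entity])
            return f"<form action='/{entity.lower()}' method='post'>\n{inputs}\n</form>"

        print(generate_sql(model))
        print(generate_html_form(model, "Sample"))

    Changing the model and re-running the generators regenerates both the schema and the form, which is the essence of the regeneration workflow the abstract describes.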

  9. Application of the SHARP Toolkit to Sodium-Cooled Fast Reactor Challenge Problems

    Energy Technology Data Exchange (ETDEWEB)

    Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Yu, Y. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Kim, T. K. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division

    2017-09-30

    The Simulation-based High-efficiency Advanced Reactor Prototyping (SHARP) toolkit is under development by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign of the U.S. Department of Energy, Office of Nuclear Energy. To better understand and exploit the benefits of advanced modeling and simulation, the NEAMS Campaign initiated the “Sodium-Cooled Fast Reactor (SFR) Challenge Problems” task, which includes the assessment of hot channel factors (HCFs) and the demonstration of zooming capability using the SHARP toolkit. If both challenge problems are resolved through advanced modeling and simulation using the SHARP toolkit, the economic competitiveness of an SFR can be significantly improved. The efforts in the first year of this project focused on the development of computational models, meshes, and coupling procedures for multi-physics calculations using the neutronics (PROTEUS) and thermal-hydraulic (Nek5000) components of the SHARP toolkit, as well as demonstration of the HCF calculation capability for the 100 MWe Advanced Fast Reactor (AFR-100) design. Testing the feasibility of the SHARP zooming capability is planned in FY 2018. The HCFs developed for the earlier SFRs (FFTF, CRBR, and EBR-II) were reviewed, and a subset of these were identified as potential candidates for reduction or elimination through high-fidelity simulations. A one-way offline coupling method was used to evaluate the HCFs, where the neutronics solver PROTEUS computes the power profile based on an assumed temperature, and the computational fluid dynamics solver Nek5000 evaluates the peak temperatures using the neutronics power profile. If the initial temperature profile used in the neutronics calculation is reasonably accurate, the one-way offline method is valid because the neutronics power profile depends only weakly on small temperature variations. In order to get more precise results, the proper temperature profile for initial neutronics calculations was obtained from the
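
    The one-way offline coupling described can be summarized schematically: a neutronics solve on an assumed temperature field produces a power profile, which then drives a thermal-hydraulics solve to obtain peak temperatures. The Python sketch below uses placeholder physics functions; it does not call PROTEUS or Nek5000:

        import numpy as np

        def neutronics_power(temperature_profile):
            # Placeholder for a PROTEUS-like solve: power depends only weakly on temperature.
            base = np.cos(np.linspace(-1.2, 1.2, temperature_profile.size)) + 1.1
            return base * (1.0 - 1e-4 * (temperature_profile - 700.0))

        def thermal_solve(power_profile, inlet_temp=628.0):
            # Placeholder for a Nek5000-like solve: coolant heats up along the channel.
            return inlet_temp + np.cumsum(power_profile) * 5.0

        # One-way offline coupling: assume a temperature profile, solve neutronics once,
        # then evaluate peak temperatures with the thermal solver.
        assumed_T = np.full(20, 700.0)                # K, initial guess along the channel
        power = neutronics_power(assumed_T)
        temperature = thermal_solve(power)
        print(f"peak coolant temperature ~ {temperature.max():.0f} K")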

  10. Beyond cost-benefit: developing a complete toolkit for adaptation decisions

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Rachel; Chambwera, Muyeye

    2010-06-15

    Cost-benefit analysis has important uses – and crucial blind spots. It represents only one of several economic tools that can be used to assess options for adapting to climate change in developing countries. The Nairobi Work Programme would best serve governments by considering not just cost-benefit approaches, but the entire range of tools. By developing a 'toolkit' that helps users choose from a variety of evaluation methods, we can support adaptation decisions that promote equity, put local people in control and allow for dynamic responses to climate change as it unfolds.

  11. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in mass spectrometry imaging (MSI) data sets, providing processing tools for the analysis of large arrayed MSI sample sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...

  12. The development of a standard training toolkit for research studies that recruit pregnant women in labour.

    Science.gov (United States)

    Kenyon, Sara; Sears, Jackie; Reay, Hannah

    2013-10-30

    Recruitment of pregnant women in labour to clinical trials poses particular challenges. Interpretation of regulation lacks consistency or clarity and variation occurs as to the training required by clinicians to safely contribute to the conduct of intrapartum studies. The Royal College of Obstetricians and Gynaecologists Intrapartum Clinical Study Group initiated the development of a pragmatic, proportionate and standardised toolkit for training clinical staff that complies with both regulatory and clinician requirements and has been peer-reviewed. This approach may be useful to researchers in acute care settings that necessitate the integration of research, routine clinical practice and compliance with regulation.

  13. PS1-29: Resources to Facilitate Multi-site Collaboration: the PRIMER Research Toolkit

    Science.gov (United States)

    Greene, Sarah; Thompson, Ella; Baldwin, Laura-Mae; Neale, Anne Victoria; Dolor, Rowena

    2010-01-01

    repository: www.ResearchToolkit.org, which comprises over 120 distinct resources. Conclusions: We are disseminating the ResearchToolkit website via academic and media channels, and identifying options for making it a sustainable resource. Given the dynamic nature of the research enterprise, maintaining the accuracy of a web-based resource is challenging. Still, the positive response to the toolkit suggests that there is high interest in sustaining it. We will demonstrate the Toolkit as part of this conference.

  14. How to create an interface between UrQMD and Geant4 toolkit

    CERN Document Server

    Abdel-Waged, Khaled; Uzhinskii, V.V.

    2012-01-01

    An interface between the UrQMD-1.3cr model (version 1.3 for cosmic air showers) and the Geant4 transport toolkit has been developed. Compared to the current Geant4 (hybrid) hadronic models, this provides the ability to simulate at the microscopic level hadron, nucleus, and anti-nucleus interactions with matter from 0 to 1 TeV with a single transport code. This document provides installation requirements and instructions, as well as class and member function descriptions of the software.

  15. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    Science.gov (United States)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  16. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button

    Science.gov (United States)

    2010-01-01

    Background There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. Methods The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS’ generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This ‘model-driven’ method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. Results In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist’s satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical

  17. Development of the eHealth Literacy Assessment Toolkit, eHLA

    DEFF Research Database (Denmark)

    Lauritzen, Dorthe Furstrand; Kayser, Lars

    2015-01-01

    In a world with rising focus on the use of eHealth, the match between the competences of the individual and the demands of eHealth systems becomes increasingly important, thus making assessment of eHealth literacy as a measure of user competences a vital element. We propose the eHealth Literacy Assessment toolkit, eHLA, which evaluates the user by seven scales: computer familiarity, confidence, incentive and performance, as well as functional health literacy, health literacy self-assessment and health literacy performance, as a first step toward the development of technology that accommodates the literacy...

  18. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
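
    The parameter-variation workflow described in this record can be illustrated, in the abstract, with a short Python sketch: run the same simulation for several values of one model parameter and summarize the spread of an observable. The simulate function and the parameter name cross_section_scale are invented stand-ins, not part of the actual toolkit or of Geant4.

      import statistics

      # Conceptual sketch only: vary one model parameter over a set of values,
      # rerun a (here, dummy) simulation for each variant, and summarize the
      # spread of a physics observable.

      def scan_parameter(simulate, nominal, values, n_events=1000):
          results = {}
          for v in values:
              params = dict(nominal, cross_section_scale=v)  # assumed parameter name
              results[v] = simulate(params, n_events)        # observable of interest
          spread = max(results.values()) - min(results.values())
          return results, spread


      if __name__ == "__main__":
          import random

          def simulate(params, n_events):
              random.seed(0)
              # toy observable: mean "interaction depth" scaling with the parameter
              return statistics.mean(random.expovariate(params["cross_section_scale"])
                                     for _ in range(n_events))

          res, spread = scan_parameter(simulate, {"cross_section_scale": 1.0},
                                       [0.9, 1.0, 1.1])
          print(res, "spread:", spread)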

  19. Implementing and Optimizing of Entire System Toolkit of VLIW DSP Processors for Embedded Sensor-Based Systems

    Directory of Open Access Journals (Sweden)

    Xu Yang

    2015-01-01

    Full Text Available VLIW DSPs can greatly enhance instruction-level parallelism, providing the capacity to meet the performance and energy-efficiency requirements of sensor-based systems. However, exploiting VLIW DSPs in the sensor-based domain imposes a heavy challenge on software toolkit design. In this paper, we present our methods and experiences in developing a system toolkit flow for a VLIW DSP designed specifically for sensor-based systems. Our system toolkit includes a compiler, assembler, linker, debugger, and simulator. We present experimental results for the compiler framework, which incorporates several state-of-the-art optimization techniques for this VLIW DSP. The results indicate that our framework can substantially improve performance and reduce energy consumption relative to code generated without it.

  20. A MultiSite Gateway Toolkit for Rapid Cloning of Vertebrate Expression Constructs with Diverse Research Applications.

    Science.gov (United States)

    Fowler, Daniel K; Stewart, Scott; Seredick, Steve; Eisen, Judith S; Stankunas, Kryn; Washbourne, Philip

    2016-01-01

    Recombination-based cloning is a quick and efficient way to generate expression vectors. Recent advancements have provided powerful recombinant DNA methods for molecular manipulations. Here, we describe a novel collection of three-fragment MultiSite Gateway cloning system-compatible vectors providing expanded molecular tools for vertebrate research. The components of this toolkit encompass a broad range of uses such as fluorescent imaging, dual gene expression, RNA interference, tandem affinity purification, chemically-inducible dimerization and lentiviral production. We demonstrate examples highlighting the utility of this toolkit for producing multi-component vertebrate expression vectors with diverse primary research applications. The vectors presented here are compatible with other Gateway toolkits and collections, facilitating the rapid generation of a broad range of innovative DNA constructs for biological research.

  1. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Energy Technology Data Exchange (ETDEWEB)

    Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Yarba, Julia [Fermilab; Kelsey, Michael [SLAC; Wright, Dennis H. [SLAC

    2016-11-10

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  2. "Weaving balance into life": Development and cultural adaptation of a cancer symptom management toolkit for Southwest American Indians.

    Science.gov (United States)

    Hodge, Felicia Schanche; Itty, Tracy Line; Cadogan, Mary P; Martinez, Fernando

    2012-06-01

    Self-management of cancer symptoms has the potential to decrease the suffering of cancer survivors while improving their health and quality of life. For many racial/ethnic groups, culturally appropriate self-management instruction is not readily available. This paper reports on the first symptom management toolkit developed for American Indian cancer survivors. Part of a larger research study, a three-phase project tested a cancer symptom self-management toolkit to be responsive to the unique learning and communication needs of American Indians in the Southwest U.S.A. American Indian cancer survivors and family members participated in 13 focus groups to identify cultural concepts of cancer and illness beliefs, communication styles, barriers, and recommendations for self-management techniques. Sessions were audiotaped and transcriptions were coded using grounded theory. Participants expressed a need for an overview of cancer, tips on management of common symptoms, resources in their communities, and suggestions for how to communicate with providers and others. The "Weaving balance into life" toolkit is comprised of a self-help guide, resource directory, and video. Preferred presentation style and content for the toolkit were pilot tested. American Indian survivors favor educational materials that provide information on symptom management and are tailored to their culture and beliefs. Suggestions for adapting the toolkit materials for other American Indian populations are made. Many cancer survivors lack effective self-management techniques for symptoms, such as pain, fatigue, and depression. The toolkit promotes self-management strategies for survivors and provides family members/caregivers tangible ways to offer support.

  3. “Weaving Balance into Life”: Development and cultural adaptation of a cancer symptom management toolkit for Southwest American Indians

    Science.gov (United States)

    Itty, Tracy Line; Cadogan, Mary P.; Martinez, Fernando

    2012-01-01

    Introduction Self-management of cancer symptoms has the potential to decrease the suffering of cancer survivors while improving their health and quality of life. For many racial/ethnic groups, culturally appropriate self-management instruction is not readily available. This paper reports on the first symptom management toolkit developed for American Indian cancer survivors. Methods Part of a larger research study, a three-phase project tested a cancer symptom self-management toolkit to be responsive to the unique learning and communication needs of American Indians in the Southwest USA. American Indian cancer survivors and family members participated in 13 focus groups to identify cultural concepts of cancer and illness beliefs, communication styles, barriers, and recommendations for self-management techniques. Sessions were audiotaped and transcriptions were coded using Grounded Theory. Results Participants expressed a need for an overview of cancer, tips on management of common symptoms, resources in their communities, and suggestions for how to communicate with providers and others. The “Weaving Balance into Life” toolkit is comprised of a self-help guide, resource directory, and video. Preferred presentation style and content for the toolkit were pilot tested. Discussion/conclusions American Indian survivors favor educational materials that provide information on symptom management and are tailored to their culture and beliefs. Suggestions for adapting the toolkit materials for other American Indian populations are made. Implications for cancer survivors Many cancer survivors lack effective self-management techniques for symptoms, such as pain, fatigue, and depression. The toolkit promotes self-management strategies for survivors and provides family members/caregivers tangible ways to offer support. PMID:22160662

  4. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit

  5. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    Science.gov (United States)

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that

  6. Clinical Trial of a Home Safety Toolkit for Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Kathy J. Horvath

    2013-01-01

    Full Text Available This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n=60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles, and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n=48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control. Home safety was significant at P≤0.001, caregiver strain at P≤0.001, and caregiver self-efficacy at P=0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P≤0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care.

  7. A Toolkit For Storage Qos Provisioning For Data-Intensive Applications

    Directory of Open Access Journals (Sweden)

    Renata Słota

    2012-01-01

    Full Text Available This paper describes a programming toolkit developed in the PL-Grid project, named QStorMan, which supports storage QoS provisioning for data-intensive applications in distributed environments. QStorMan exploits knowledge-oriented methods for matching storage resources to non-functional requirements, which are defined for a data-intensive application. In order to support various usage scenarios, QStorMan provides two interfaces: programming libraries and a web portal. The interfaces allow users to define the requirements either directly in the application source code or through an intuitive graphical interface. The first approach provides finer granularity, e.g., each portion of data processed by an application can define a different set of requirements. The second approach is aimed at supporting legacy applications whose source code cannot be modified. The toolkit has been evaluated using synthetic benchmarks and the production infrastructure of PL-Grid, in particular its storage infrastructure, which utilizes the Lustre file system.

  8. Adding Impacts and Mitigation Measures to OpenEI's RAPID Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, Erin

    2017-05-01

    The Open Energy Information platform hosts the Regulatory and Permitting Information Desktop (RAPID) Toolkit to provide renewable energy permitting information on federal and state regulatory processes. One of the RAPID Toolkit's functions is to help streamline the geothermal permitting processes outlined in the National Environmental Policy Act (NEPA). This is particularly important in the geothermal energy sector, since each development phase requires separate land analysis to acquire exploration, well field drilling, and power plant construction permits. Using the Environmental Assessment documents included in RAPID's NEPA Database, the RAPID team identified 37 resource categories that a geothermal project may impact. Examples include impacts to geology and minerals, nearby endangered species, or water quality standards. To provide federal regulators, project developers, consultants, and the public with typical impacts and mitigation measures for geothermal projects, the RAPID team has provided overview webpages for each of these 37 resource categories, with a sidebar query that references related NEPA documents in the NEPA Database. This project is an expansion of a previous project that analyzed the time to complete NEPA environmental review for various geothermal activities. That NEPA review focused not only on geothermal projects within Bureau of Land Management and U.S. Forest Service managed lands, but also on projects funded by the Department of Energy. Timeline barriers identified were extensive public comments and involvement; content overlap in NEPA documents; and discovery of impacted resources such as endangered species or cultural sites.

  9. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A

    2011-07-01

    In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations in which the sequencing and timing of events are varied according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how a power uprate affects the system recovery measures needed to avoid core damage after the PWR loses all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed-particle hydrodynamics code: NEUTRINO.
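
    A minimal Python sketch of the probabilistic-over-mechanistic idea follows: sample an uncertain recovery time from an assumed distribution, evaluate a toy heat-up model for each sample, and estimate the probability of exceeding a temperature limit. The distribution, the heat-up law, and the limit are placeholders chosen for illustration; the real workflow drives RELAP-7 through RAVEN.

      import random

      # Toy stand-in for the mechanistic model (not a RELAP-7 calculation):
      # temperature rises while AC power is lost.
      def peak_clad_temperature(recovery_time_h, power_fraction):
          return 600.0 + 180.0 * power_fraction * recovery_time_h


      def estimate_exceedance(n_samples=10000, limit_c=1204.0,
                              power_fraction=1.0, seed=42):
          rng = random.Random(seed)
          failures = 0
          for _ in range(n_samples):
              recovery_time = rng.lognormvariate(1.0, 0.5)  # assumed distribution, hours
              if peak_clad_temperature(recovery_time, power_fraction) > limit_c:
                  failures += 1
          return failures / n_samples


      if __name__ == "__main__":
          # Compare nominal power with a 20% uprate, as in the test case above.
          print("nominal:", estimate_exceedance(power_fraction=1.0))
          print("uprate :", estimate_exceedance(power_fraction=1.2))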

  10. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    Science.gov (United States)

    Wagh, Aditi; Wilensky, Uri

    2017-10-01

    Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: Environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To make this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit with exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities including the time students took to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.

  11. REST: a toolkit for resting-state functional magnetic resonance imaging data processing.

    Science.gov (United States)

    Song, Xiao-Wei; Dong, Zhang-Ye; Long, Xiang-Yu; Li, Su-Fang; Zuo, Xi-Nian; Zhu, Chao-Zhe; He, Yong; Yan, Chao-Gan; Zang, Yu-Feng

    2011-01-01

    Resting-state fMRI (RS-fMRI) has been drawing more and more attention in recent years. However, a publicly available, systematically integrated and easy-to-use tool for RS-fMRI data processing is still lacking. We developed a toolkit for the analysis of RS-fMRI data, namely the RESting-state fMRI data analysis Toolkit (REST). REST was developed in MATLAB with graphical user interface (GUI). After data preprocessing with SPM or AFNI, a few analytic methods can be performed in REST, including functional connectivity analysis based on linear correlation, regional homogeneity, amplitude of low frequency fluctuation (ALFF), and fractional ALFF. A few additional functions were implemented in REST, including a DICOM sorter, linear trend removal, bandpass filtering, time course extraction, regression of covariates, image calculator, statistical analysis, and slice viewer (for result visualization, multiple comparison correction, etc.). REST is an open-source package and is freely available at http://www.restfmri.net.
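
    For readers unfamiliar with the measures mentioned above, the following NumPy sketch computes ALFF and fALFF for a single voxel time series (the summed spectral amplitude in the 0.01-0.08 Hz band, and its ratio to the full-band amplitude). It is a conceptual illustration only; REST itself is a MATLAB toolbox.

      import numpy as np

      # Conceptual illustration of ALFF/fALFF for one voxel time series.
      def alff_falff(timeseries, tr, low=0.01, high=0.08):
          ts = np.asarray(timeseries, dtype=float)
          ts = ts - ts.mean()                      # remove mean before the FFT
          freqs = np.fft.rfftfreq(ts.size, d=tr)   # frequency axis for repetition time TR
          amp = np.abs(np.fft.rfft(ts)) / ts.size  # amplitude spectrum
          band = (freqs >= low) & (freqs <= high)
          alff = amp[band].sum()
          falff = alff / amp[1:].sum()             # exclude the DC bin
          return alff, falff


      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          tr = 2.0  # seconds
          t = np.arange(240) * tr
          # synthetic voxel: slow 0.03 Hz fluctuation plus noise
          voxel = np.sin(2 * np.pi * 0.03 * t) + 0.5 * rng.standard_normal(t.size)
          print(alff_falff(voxel, tr))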

  12. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, we present the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. These results establish OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.
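
    The core "arrayed analysis" step that OMAAT automates, reducing each spot in a regular grid to a summary intensity, can be illustrated with a short NumPy sketch. The function below is hypothetical and does not use the OMAAT API; the spot centres and window size are toy values.

      import numpy as np

      # Conceptual illustration only: per-spot mean intensity over a square
      # window around each (row, col) spot centre of an arrayed MSI image.
      def spot_means(intensity_image, centers, radius=3):
          img = np.asarray(intensity_image, dtype=float)
          means = []
          for r, c in centers:
              r0, r1 = max(r - radius, 0), min(r + radius + 1, img.shape[0])
              c0, c1 = max(c - radius, 0), min(c + radius + 1, img.shape[1])
              means.append(img[r0:r1, c0:c1].mean())
          return np.array(means)


      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          image = rng.random((96, 96))
          # 4 x 4 grid of spot centres at a fixed pitch (toy stand-in for 384 spots)
          grid = [(12 + 24 * i, 12 + 24 * j) for i in range(4) for j in range(4)]
          print(spot_means(image, grid).round(3))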

  13. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
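
    The kind of integration the toolkit targets, combining probabilistic event frequencies with deterministic consequence estimates into a single risk figure, can be sketched in a few lines of Python. All scenario frequencies, ignition probabilities, and the harm model below are illustrative placeholders, not validated hydrogen data.

      # Minimal sketch of coupling a probabilistic frequency model with a
      # deterministic consequence model; every number here is a placeholder.

      LEAK_SCENARIOS = [
          # (label, annual release frequency, immediate-ignition probability)
          ("small leak",  1e-2, 0.01),
          ("medium leak", 1e-3, 0.05),
          ("rupture",     1e-4, 0.20),
      ]


      def consequence(label):
          """Stand-in deterministic model: harm index per ignited release."""
          return {"small leak": 0.1, "medium leak": 1.0, "rupture": 10.0}[label]


      def annual_risk(scenarios=LEAK_SCENARIOS):
          # Risk = sum over scenarios of frequency x ignition probability x consequence.
          return sum(freq * p_ign * consequence(label)
                     for label, freq, p_ign in scenarios)


      if __name__ == "__main__":
          print(f"integrated annual risk index: {annual_risk():.2e}")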

  14. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  15. KAT: a K-mer analysis toolkit to quality control NGS datasets and genome assemblies.

    Science.gov (United States)

    Mapleson, Daniel; Garcia Accinelli, Gonzalo; Kettleborough, George; Wright, Jonathan; Clavijo, Bernardo J

    2017-02-15

    De novo assembly of whole genome shotgun (WGS) next-generation sequencing (NGS) data benefits from high-quality input with high coverage. However, in practice, determining the quality and quantity of useful reads quickly and in a reference-free manner is not trivial. Gaining a better understanding of the WGS data, and how that data is utilized by assemblers, provides useful insights that can inform the assembly process and result in better assemblies. We present the K-mer Analysis Toolkit (KAT): a multi-purpose software toolkit for reference-free quality control (QC) of WGS reads and de novo genome assemblies, primarily via their k-mer frequencies and GC composition. KAT enables users to assess levels of errors, bias and contamination at various stages of the assembly process. In this paper we highlight KAT's ability to provide valuable insights into assembly composition and quality of genome assemblies through pairwise comparison of k-mers present in both input reads and the assemblies. KAT is available under the GPLv3 license at: https://github.com/TGAC/KAT . bernardo.clavijo@earlham.ac.uk. Supplementary data are available at Bioinformatics online.
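
    The k-mer spectrum that KAT is built around can be illustrated with a toy Python version: count k-mer occurrences across reads and histogram their multiplicities. KAT itself is a compiled tool with its own command set; the sketch below only demonstrates the statistic.

      from collections import Counter

      # Conceptual sketch of a k-mer spectrum: spectrum[m] is the number of
      # distinct k-mers that occur exactly m times across the reads.
      def kmer_spectrum(reads, k=21):
          counts = Counter()
          for read in reads:
              read = read.upper()
              for i in range(len(read) - k + 1):
                  kmer = read[i:i + k]
                  if "N" not in kmer:          # skip ambiguous bases
                      counts[kmer] += 1
          return Counter(counts.values())


      if __name__ == "__main__":
          reads = ["ACGTACGTACGTACGTACGTACGTA",
                   "CGTACGTACGTACGTACGTACGTAC",
                   "TTTTACGTACGTACGTACGTACGTT"]
          for multiplicity, n_kmers in sorted(kmer_spectrum(reads, k=21).items()):
              print(f"{n_kmers} distinct 21-mers occur {multiplicity}x")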

  16. OpenSkies: a commercial 3D distributed visualization and simulation toolkit

    Science.gov (United States)

    Cobb, Paul N.; Jacobus, Charles J.; Haanpaa, Douglas

    2000-05-01

    A growing need for more advanced training capabilities and the proliferation of government standards into the commercial market has inspired Cybernet to create an advanced, distributed 3D Simulation Toolkit. This system, called OpenSkies, is a truly open, realistic distributed system for 3D visualization and simulation. One of the main strengths of OpenSkies is its capability for data collection and analysis. Cybernet's Data Collection and Analysis Environment is closely integrated with OpenSkies to produce a unique, quantitative, performance-based measurement system. This system provides the capability for training students and operators on any complex equipment or system that can be created in a simulated world. OpenSkies is based on the military standard HLA networking architecture. This architecture allows thousands of users to interact in the same world across the Internet. Cybernet's OpenSkies simulation system brings the power and versatility of the OpenGL programming API to the simulation and gaming worlds. On top of this, Cybernet has developed an open architecture that allows the developer to produce almost any kind of new technique in their simulation. Overall, these capabilities deliver a versatile and comprehensive toolkit for simulation and distributed visualization.

  17. A Java/CGI approach to developing a geographic virtual reality toolkit on the Internet

    Science.gov (United States)

    Huang, Bo; Lin, Hui

    2002-02-01

    A Java/common gateway interface (CGI) approach is employed to design a toolkit for interactively building up virtual environments from existing geographical information system (GIS) databases. This approach takes advantage of both the Java and CGI approaches, providing a flexible and user friendly interface, while making better use of the power of the server. It is also beneficial to the balancing of workloads on both client and server sides. A prototype, called GeoVR, has been implemented by extending ArcView Internet Map Server on its client side by the Java language and correspondingly, on its server-side by the Avenue language. The GeoVR server is responsible for generating 3D scenes in terms of parameter values sent from the Java client. The 3D scenes are then transformed to VRML models, and delivered to the WWW browser for display and navigation. This toolkit, allowing users to interact with 2D GIS data on the Internet and create perspective views from these data on the fly, illustrates how to extend an existing Internet GIS into a more powerful virtual GIS.

  18. phylo-node: A molecular phylogenetic toolkit using Node.js.

    Directory of Open Access Journals (Sweden)

    Damien M O'Halloran

    Full Text Available Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, a stable and scalable toolkit built on Node.js that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  19. phylo-node: A molecular phylogenetic toolkit using Node.js.

    Science.gov (United States)

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis. To address this problem, I have developed phylo-node, a stable and scalable toolkit built on Node.js that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  20. Iterative user centered design for development of a patient-centered fall prevention toolkit.

    Science.gov (United States)

    Katsulis, Zachary; Ergai, Awatef; Leung, Wai Yin; Schenkel, Laura; Rai, Amisha; Adelman, Jason; Benneyan, James; Bates, David W; Dykes, Patricia C

    2016-09-01

    Due to the large number of falls that occur in hospital settings, inpatient fall prevention is a topic of great interest to patients and health care providers. The use of electronic decision support that tailors fall prevention strategy to patient-specific risk factors, known as Fall T.I.P.S (Tailoring Interventions for Patient Safety), has proven to be an effective approach for decreasing hospital falls. A paper version of the Fall T.I.P.S toolkit was developed primarily for hospitals that do not have the resources to implement the electronic solution; however, more work is needed to optimize the effectiveness of the paper version of this tool. We examined the use of human factors techniques in the redesign of the existing paper fall prevention tool with the goal of increasing ease of use and decreasing inpatient falls. The inclusion of patients and clinical staff in the redesign of the existing tool was done to increase adoption of the tool and fall prevention best practices. The redesigned paper Fall T.I.P.S toolkit showcased a built-in clinical decision support system and increased ease of use over the existing version. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. The hard parts of the Cambrian Explosion: a palaeobiological approach to testing the 'biomineralization toolkit' hypothesis

    Science.gov (United States)

    Murdock, Duncan

    2017-04-01

    The Cambrian Explosion was the most dramatic event in the history of animal life on Earth, yet the processes underlying it are poorly understood, not least the role of biomineralization in driving this fundamental evolutionary episode. One explanation for this event, based on observations of the developmental and molecular biology of modern organisms, is that all animals inherited a common 'toolkit' of genes, independently co-opted to similar tasks, including building skeletons. This initially imprecise 'toolkit' was subsequently honed by the acquisition of more and more complex gene regulatory networks. This predicts that animal skeletons, and by inference their organic frameworks, should exhibit a higher degree of morphological plasticity at their origin than later in their evolutionary history - a prediction that is virtually untested. Here I set out a new approach to testing this prediction, by quantifying the phenotypic variation displayed in fossil remains of some of the earliest animal skeletons, over multiple scales from microscopic variations in their component biominerals to how the skeletons themselves are put together. This approach will provide direct evidence to test the importance of the genetic basis of the skeleton in the origin of animals, making a significant contribution to our understanding of this crucial event in the history of life on Earth, including the evolution of our own ancestors.

  2. The Zoltan and Isorropia Parallel Toolkits for Combinatorial Scientific Computing: Partitioning, Ordering and Coloring

    Directory of Open Access Journals (Sweden)

    Erik G. Boman

    2012-01-01

    Full Text Available Partitioning and load balancing are important problems in scientific computing that can be modeled as combinatorial problems using graphs or hypergraphs. The Zoltan toolkit was developed primarily for partitioning and load balancing to support dynamic parallel applications, but has expanded to support other problems in combinatorial scientific computing, including matrix ordering and graph coloring. Zoltan is based on abstract user interfaces and uses callback functions. To simplify the use and integration of Zoltan with other matrix-based frameworks, such as the ones in Trilinos, we developed Isorropia as a Trilinos package, which supports most of Zoltan's features via a matrix-based interface. In addition to providing an easy-to-use matrix-based interface to Zoltan, Isorropia also serves as a platform for additional matrix algorithms. In this paper, we give an overview of the Zoltan and Isorropia toolkits, their design, capabilities and use. We also show how Zoltan and Isorropia enable large-scale, parallel scientific simulations, and describe current and future development in the next-generation package Zoltan2.

  3. J-TEXT-EPICS: An EPICS toolkit attempted to improve productivity

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Wei [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); College of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zhang, Ming, E-mail: zhangming@hust.edu.cn [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); College of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zhang, Jing; Zhuang, Ge [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); College of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2013-11-15

    Highlights: • Tokamak control applications can be developed in a very short period with J-TEXT-EPICS. • J-TEXT-EPICS enables users to build control applications with device-oriented functions. • J-TEXT-EPICS is fully compatible with the EPICS Channel Access protocol. • J-TEXT-EPICS can be easily extended by plug-ins and drivers. -- Abstract: The Joint Texas Experimental Tokamak (J-TEXT) team has developed a new software toolkit for building Experimental Physics and Industrial Control System (EPICS) control applications, called J-TEXT-EPICS. It aims to improve the development efficiency of control applications. With its device-oriented features, it can be used to set or obtain the configuration or status of a device as well as to invoke methods on a device. With its modularized design, its functions can be easily extended. J-TEXT-EPICS is completely compatible with the original EPICS Channel Access protocol and can be integrated into existing EPICS control systems smoothly. It is fully implemented in C#, and thus benefits from the abundant resources of the .NET Framework. The J-TEXT control system is built with this toolkit. This paper presents the design and implementation of J-TEXT-EPICS as well as its application in the J-TEXT control system.
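
    J-TEXT-EPICS itself is written in C#, but the Channel Access operations it remains compatible with can be illustrated from Python using the widely used, unrelated pyepics client. The process variable names below are hypothetical examples.

      import epics  # pyepics: a common Python client for EPICS Channel Access

      # Illustration of plain Channel Access reads, writes, and monitoring;
      # this is not the J-TEXT-EPICS API, and the PV names are made up.

      current = epics.caget("JTEXT:PF:CURRENT")                   # read a PV
      epics.caput("JTEXT:PF:CURRENT:SETPOINT", 12.5, wait=True)   # write a PV

      # Subscribe to value changes on a PV object.
      pv = epics.PV("JTEXT:PF:CURRENT")

      def on_change(pvname=None, value=None, **kwargs):
          print(f"{pvname} changed to {value}")

      pv.add_callback(on_change)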

  4. Transposome: a toolkit for annotation of transposable element families from unassembled sequence reads.

    Science.gov (United States)

    Staton, S Evan; Burke, John M

    2015-06-01

    Transposable elements (TEs) can be found in virtually all eukaryotic genomes and have the potential to produce evolutionary novelty. Despite the broad taxonomic distribution of TEs, the evolutionary history of these sequences is largely unknown for many taxa due to a lack of genomic resources and identification methods. Given that most TE annotation methods are designed to work on genome assemblies, we sought to develop a method to provide a fine-grained classification of TEs from DNA sequence reads. Here, we present a toolkit for the efficient annotation of TE families from low-coverage whole-genome shotgun (WGS) data, enabling the rapid identification of TEs in a large number of taxa. We compared our software, Transposome, with other approaches for annotating repeats from WGS data, and we show that it offers significant improvements in run time and produces more precise estimates of genomic repeat abundance. Transposome may also be used as a general toolkit for working with Next Generation Sequencing (NGS) data, and for constructing custom genome analysis pipelines. The source code for Transposome is freely available (http://sestaton.github.io/Transposome), implemented in Perl and is supported on Linux. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. REST: a toolkit for resting-state functional magnetic resonance imaging data processing.

    Directory of Open Access Journals (Sweden)

    Xiao-Wei Song

    Full Text Available Resting-state fMRI (RS-fMRI) has been drawing more and more attention in recent years. However, a publicly available, systematically integrated and easy-to-use tool for RS-fMRI data processing is still lacking. We developed a toolkit for the analysis of RS-fMRI data, namely the RESting-state fMRI data analysis Toolkit (REST). REST was developed in MATLAB with graphical user interface (GUI). After data preprocessing with SPM or AFNI, a few analytic methods can be performed in REST, including functional connectivity analysis based on linear correlation, regional homogeneity, amplitude of low frequency fluctuation (ALFF), and fractional ALFF. A few additional functions were implemented in REST, including a DICOM sorter, linear trend removal, bandpass filtering, time course extraction, regression of covariates, image calculator, statistical analysis, and slice viewer (for result visualization, multiple comparison correction, etc.). REST is an open-source package and is freely available at http://www.restfmri.net.

  6. A fast - Monte Carlo toolkit on GPU for treatment plan dose recalculation in proton therapy

    Science.gov (United States)

    Senzacqua, M.; Schiavi, A.; Patera, V.; Pioli, S.; Battistoni, G.; Ciocca, M.; Mairani, A.; Magro, G.; Molinelli, S.

    2017-10-01

    In the context of particle therapy, a crucial role is played by Treatment Planning Systems (TPSs), tools aimed at computing and optimizing the treatment plan. Nowadays one of the major issues related to TPSs in particle therapy is the large CPU time needed. We developed a software toolkit (FRED) for reducing dose recalculation time by exploiting Graphics Processing Unit (GPU) hardware. Thanks to their high parallelization capability, GPUs significantly reduce the computation time, by up to a factor of 100 with respect to standard software running on a CPU. The transport of proton beams in the patient is accurately described through Monte Carlo methods. The physical processes reproduced are multiple Coulomb scattering, energy straggling, and nuclear interactions of protons with the main nuclei composing biological tissues. The FRED toolkit does not rely on a water-equivalent translation of tissues, but exploits the Computed Tomography anatomical information by reconstructing and simulating the atomic composition of each crossed tissue. FRED can be used as an efficient tool for dose recalculation on the day of the treatment: it can provide, in about one minute on standard hardware, the dose map obtained by combining the treatment plan, computed earlier by the TPS, with the current patient anatomic arrangement.
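
    A toy, CPU-only caricature of the per-proton transport loop that such a GPU engine parallelizes is sketched below: step a proton through a 1-D water-like voxel grid with a crude stopping-power law and Gaussian straggling, and score the deposited energy. The constants are placeholders and do not reflect FRED's actual physics models.

      import random

      # Toy 1-D proton transport: energy loss follows a crude k/E law with
      # Gaussian straggling, and deposited energy is scored per voxel.
      def transport_proton(energy_mev, n_voxels, step_cm=0.1, rng=None):
          rng = rng or random.Random(0)
          dose = [0.0] * n_voxels
          z = 0.0
          while energy_mev > 1.0 and z < n_voxels * step_cm:
              mean_loss = (700.0 / energy_mev) * step_cm      # MeV per step (toy law)
              loss = min(energy_mev,
                         max(0.0, rng.gauss(mean_loss, 0.05 * mean_loss)))
              dose[int(z / step_cm)] += loss                  # score energy in this voxel
              energy_mev -= loss
              z += step_cm
          return dose


      if __name__ == "__main__":
          rng = random.Random(1)
          total = [0.0] * 300                                  # 30 cm depth, 1 mm voxels
          for _ in range(2000):                                # a GPU engine runs millions
              for i, d in enumerate(transport_proton(150.0, 300, rng=rng)):
                  total[i] += d
          peak = max(range(len(total)), key=total.__getitem__)
          print(f"Bragg-peak-like dose maximum near {peak * 0.1:.1f} cm depth")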

  7. eVITAL: A Preliminary Taxonomy and Electronic Toolkit of Health-Related Habits and Lifestyle

    Directory of Open Access Journals (Sweden)

    Luis Salvador-Carulla

    2012-01-01

    Full Text Available Objectives. To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. Methods. From 2003–2009, a working group (n=6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n=29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. Results. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. Conclusions. We present the first taxonomy of HrH, which may aid the development of the new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model.

  8. eVITAL: a preliminary taxonomy and electronic toolkit of health-related habits and lifestyle.

    Science.gov (United States)

    Salvador-Carulla, Luis; Walsh, Carolyn Olson; Alonso, Federico; Gómez, Rafael; de Teresa, Carlos; Cabo-Soler, José Ricardo; Cano, Antonio; Ruiz, Mencía

    2012-01-01

    To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. From 2003-2009, a working group (n = 6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n = 29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. We present the first taxonomy of HrH, which may aid the development of the new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model.

  9. Toolkit for determination of dose-response relations, validation of radiobiological parameters and treatment plan optimization based on radiobiological measures.

    Science.gov (United States)

    Mavroidis, Panayiotis; Tzikas, Athanasios; Papanikolaou, Nikos; Lind, Bengt K

    2010-10-01

    evaluation are compared against dosimetric criteria. The presented toolkit appears to be very convenient and efficient for clinical implementation of radiobiological modeling. It can also be used for the development of a clinical data and health information database for assisting the performance of epidemiological studies and the collaboration between different institutions within research and clinical frameworks.

  10. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    Science.gov (United States)

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community, with the purpose being to provide targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. Copyright © 2016. Published by Elsevier Ltd.

  11. Starting from Scratch: Greening Your Game Day--The Collegiate Football Sustainable Materials Management Toolkit. Version 1.0

    Science.gov (United States)

    Association for the Advancement of Sustainability in Higher Education, 2011

    2011-01-01

    The "Collegiate Football Sustainable Materials Management Toolkit" was researched by student interns in the Virginia Tech Office of Energy & Sustainability, developed in collaboration with the US EPA (US Environmental Protection Agency) and a national panel of technical experts from universities across the nation, and driven forward…

  12. PsyToolkit: A Novel Web-Based Method for Running Online Questionnaires and Reaction-Time Experiments

    Science.gov (United States)

    Stoet, Gijsbert

    2017-01-01

    This article reviews PsyToolkit, a free web-based service designed for setting up, running, and analyzing online questionnaires and reaction-time (RT) experiments. It comes with extensive documentation, videos, lessons, and libraries of free-to-use psychological scales and RT experiments. It provides an elaborate interactive environment to use (or…

  13. The ProteoRed MIAPE web toolkit: a user-friendly framework to connect and share proteomics standards.

    Science.gov (United States)

    Medina-Aunon, J Alberto; Martínez-Bartolomé, Salvador; López-García, Miguel A; Salazar, Emilio; Navajas, Rosana; Jones, Andrew R; Paradela, Alberto; Albar, Juan P

    2011-10-01

    The development of the HUPO-PSI's (Proteomics Standards Initiative) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or embed them within proteomics workflows. In this article, we describe a new web-based software suite (The ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomic data standards. First, it can verify that the reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Second, the toolkit can convert several XML-based data standards directly into human readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export from MIAPE reports into XML files for computational processing, data sharing, or public database submission. The toolkit is thus the first application capable of automatically linking the PSI's MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. This toolkit is freely available at http://www.proteored.org/MIAPE/.

  14. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Energy Technology Data Exchange (ETDEWEB)

    Genser, Krzysztof [Fermilab]; Hatcher, Robert [Fermilab]; Kelsey, Michael [SLAC]; Perdue, Gabriel [Fermilab]; Wenzel, Hans [Fermilab]; Wright, Dennis H. [SLAC]; Yarba, Julia [Fermilab]

    2017-02-17

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  15. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
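
    The parameter-variation workflow described in these two records can be illustrated with a generic sketch that is not part of the toolkit itself: vary one model parameter over a few values, rerun a simulation, and compare summary statistics of the resulting observable. The simulate() stub and its cross_section_scale parameter below are hypothetical placeholders standing in for a Geant4 physics-model configuration.

```python
# Sketch of a parameter-variation study in the spirit of the toolkit described
# above. The simulate() function and its parameter are hypothetical stand-ins
# for a Geant4 physics-model configuration; only the bookkeeping pattern is shown.
import random
import statistics

def simulate(cross_section_scale, n_events=1000, seed=0):
    """Hypothetical simulation stub: returns a list of per-event observables."""
    rng = random.Random(seed)
    return [rng.gauss(mu=10.0 * cross_section_scale, sigma=2.0) for _ in range(n_events)]

def scan_parameter(values):
    """Run one simulation variant per parameter value and collect summaries."""
    results = {}
    for v in values:
        observables = simulate(cross_section_scale=v)
        results[v] = (statistics.mean(observables), statistics.stdev(observables))
    return results

if __name__ == "__main__":
    # Vary the (hypothetical) cross-section scale by +/-10% around its nominal value.
    for value, (mean, std) in scan_parameter([0.9, 1.0, 1.1]).items():
        print(f"scale={value:.1f}  mean={mean:.2f}  std={std:.2f}")
```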

  16. Preparing for the Flu (Including 2009 H1N1 Flu): A Communication Toolkit for Schools (Grades K-12)

    Science.gov (United States)

    Centers for Disease Control and Prevention, 2010

    2010-01-01

    The purpose of "Preparing for the Flu: A Communication Toolkit for Schools" is to provide basic information and communication resources to help school administrators implement recommendations from CDC's (Centers for Disease Control and Prevention) Guidance for State and Local Public Health Officials and School Administrators for School (K-12)…

  17. Methodology for the development of a taxonomy and toolkit to evaluate health-related habits and lifestyle (eVITAL)

    Directory of Open Access Journals (Sweden)

    Walsh Carolyn O

    2010-03-01

    Full Text Available Abstract Background Chronic diseases cause an ever-increasing percentage of morbidity and mortality, but many have modifiable risk factors. Many behaviors that predispose or protect an individual to chronic disease are interrelated, and therefore are best approached using an integrated model of health and the longevity paradigm, using years lived without disability as the endpoint. Findings This study used a 4-phase mixed qualitative design to create a taxonomy and related online toolkit for the evaluation of health-related habits. Core members of a working group conducted a literature review and created a framing document that defined relevant constructs. This document was revised, first by a working group and then by a series of multidisciplinary expert groups. The working group and expert panels also designed a systematic evaluation of health behaviors and risks, which was computerized and evaluated for feasibility. A demonstration study of the toolkit was performed in 11 healthy volunteers. Discussion In this protocol, we used forms of the community intelligence approach, including frame analysis, feasibility, and demonstration, to develop a clinical taxonomy and an online toolkit with standardized procedures for screening and evaluation of multiple domains of health, with a focus on longevity and the goal of integrating the toolkit into routine clinical practice. Trial Registration IMSERSO registry 200700012672

  18. New toolkit to measure quality of person-centered care: development and pilot evaluation with nursing home communities.

    Science.gov (United States)

    Van Haitsma, Kimberly; Crespy, Scott; Humes, Sarah; Elliot, Amy; Mihelic, Adrienne; Scott, Carol; Curyto, Kim; Spector, Abby; Eshraghi, Karen; Duntzee, Christina; Heid, Allison Reamy; Abbott, Katherine

    2014-09-01

    Increasingly, nursing home (NH) providers are adopting a person-centered care (PCC) philosophy; yet, they currently lack methods to measure their progress toward this goal. Few PCC tools meet criteria for ease of use and feasibility in NHs. The purpose of this article is to report on the development of the concept and measurement of preference congruence among NH residents (phase 1), its refinement into a set of quality indicators by Advancing Excellence in America's Nursing Homes (phase 2), and its pilot evaluation in a sample of 12 early adopting NHs prior to national rollout (phase 3). The recommended toolkit for providers to use to measure PCC consists of (1) interview materials for 16 personal care and activity preferences from Minimum Data Set 3.0, plus follow-up questions that ask residents how satisfied they are with fulfillment of important preferences; and (2) an easy to use Excel spreadsheet that calculates graphic displays of quality measures of preference congruence and care conference attendance for an individual, household or NH. Twelve NHs interviewed residents (N = 146) using the toolkit; 10 also completed a follow-up survey and 9 took part in an interview evaluating their experience. NH staff gave strong positive ratings to the toolkit. All would recommend it to other NHs. Staff reported that the toolkit helped them identify opportunities to improve PCC (100%), and found that the Excel tool was comprehensive (100%), easy to use (90%), and provided high quality information (100%). Providers anticipated using the toolkit to strengthen staff training as well as to enhance care planning, programming and quality improvement. The no-cost PCC toolkit provides a new means to measure the quality of PCC delivery. As of February 2014, over 700 nursing homes have selected the Advancing Excellence in America's Nursing Homes PCC goal as a focus for quality improvement. The toolkit enables providers to incorporate quality improvement by moving beyond anecdote
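
    The preference-congruence quality measure at the heart of the toolkit can be sketched in a few lines: for each resident, report the share of preferences rated as important whose fulfillment the resident is satisfied with. The record layout and the satisfaction threshold below are illustrative assumptions, not the toolkit's spreadsheet logic.

```python
# Sketch of a preference-congruence quality measure: the fraction of a resident's
# "important" preferences whose fulfillment is rated as satisfied. The record
# layout and 1-4 rating scale are illustrative assumptions, not the toolkit's format.
def preference_congruence(responses, satisfied_threshold=3):
    """responses: list of dicts with 'important' (bool) and 'satisfaction' (1-4)."""
    important = [r for r in responses if r["important"]]
    if not important:
        return None  # no important preferences reported
    met = sum(1 for r in important if r["satisfaction"] >= satisfied_threshold)
    return met / len(important)

resident = [
    {"preference": "choose own bedtime", "important": True, "satisfaction": 4},
    {"preference": "listen to music", "important": True, "satisfaction": 2},
    {"preference": "go outside", "important": False, "satisfaction": 3},
]
print(f"Preference congruence: {preference_congruence(resident):.0%}")  # 50%
```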

  19. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    Science.gov (United States)

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
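
    For orientation, the standard linear-quadratic biologically effective dose for a phase delivering n fractions of dose d per fraction is n·d·(1 + d/(α/β)), and a multi-phase BED is obtained by summing the phase contributions voxel by voxel. The sketch below implements that generic formula only; it is not the MATLAB toolkit's "true" or "approximate" multi-phase formulations.

```python
# Voxel-wise multi-phase BED under the standard linear-quadratic model:
#   BED = sum over phases of n_i * d_i * (1 + d_i / (alpha/beta))
# where d_i is the per-fraction dose in the voxel for phase i. Generic sketch only.
import numpy as np

def multiphase_bed(dose_per_fraction_phases, fractions_per_phase, alpha_beta):
    """dose_per_fraction_phases: list of 3D arrays (Gy/fraction), one per phase."""
    bed = np.zeros_like(dose_per_fraction_phases[0])
    for d, n in zip(dose_per_fraction_phases, fractions_per_phase):
        bed += n * d * (1.0 + d / alpha_beta)
    return bed

# Example: a primary phase (25 x 2 Gy) and a boost phase (5 x 2.5 Gy) on a toy grid.
primary = np.full((4, 4, 4), 2.0)
boost = np.full((4, 4, 4), 2.5)
bed = multiphase_bed([primary, boost], fractions_per_phase=[25, 5], alpha_beta=10.0)
print(bed[0, 0, 0])  # 75.625 Gy_10 in this uniform toy example
```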

  20. PyEvolve: a toolkit for statistical modelling of molecular evolution.

    Science.gov (United States)

    Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A

    2004-01-05

    Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences - ignoring the biological significance of sequence differences. A suite of sophisticated likelihood based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpG's, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. The toolkit can be used

  1. PyEvolve: a toolkit for statistical modelling of molecular evolution

    Directory of Open Access Journals (Sweden)

    Wakefield Matthew J

    2004-01-01

    Full Text Available Abstract Background Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences – ignoring the biological significance of sequence differences. A suite of sophisticated likelihood based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Results Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpG's, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter rich models with a large data set, reducing the time required for optimisation from ~10 days to ~6 hours. Conclusion PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the
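
    To make the likelihood machinery concrete without relying on PyEvolve's own API, the sketch below computes a pairwise maximum-likelihood distance under the simple Jukes-Cantor substitution model; PyEvolve generalizes the same idea to richer substitution models and full phylogenies. The sequences are toy data.

```python
# Pairwise maximum-likelihood distance under the Jukes-Cantor (JC69) model.
# Conceptual sketch of the likelihood calculations such toolkits generalize;
# this is not PyEvolve code.
import math
from scipy.optimize import minimize_scalar

def jc69_log_likelihood(distance, n_same, n_diff):
    """Log-likelihood of observing n_same identical and n_diff differing sites."""
    p_same = 0.25 + 0.75 * math.exp(-4.0 * distance / 3.0)
    p_diff = (1.0 - p_same) / 3.0  # probability of one specific different base
    return n_same * math.log(0.25 * p_same) + n_diff * math.log(0.25 * p_diff)

def ml_distance(seq_a, seq_b):
    """Optimize the evolutionary distance for two aligned sequences."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a in "ACGT" and b in "ACGT"]
    n_same = sum(1 for a, b in pairs if a == b)
    n_diff = len(pairs) - n_same
    result = minimize_scalar(lambda d: -jc69_log_likelihood(d, n_same, n_diff),
                             bounds=(1e-6, 10.0), method="bounded")
    return result.x

print(round(ml_distance("ACGTACGTACGTACGTACGT", "ACGTACGAACGTTCGTACGT"), 3))
```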

  2. Dual-polarization phase shift processing with the Python ARM Radar Toolkit

    Science.gov (United States)

    Collis, S. M.; Lang, T. J.; Mühlbauer, K.; Helmus, J.; North, K.

    2016-12-01

    Weather radars that measure backscatter returns at two orthogonal polarizations can give unique insight into storm macro- and microphysics. Phase shift between the two polarizations caused by anisotropy in the liquid water path can be used as a constraint in rainfall rate and drop size distribution retrievals, and has the added benefit of being robust to attenuation and radar calibration. The measurement is complicated, however, by the impact of phase shift on backscatter in the presence of large drops and when the pulse volume is not filled uniformly by scatterers (known as partial beam filling). This has led to a signal processing challenge of separating the underlying desired signal from the transient signal, a challenge that has attracted many diverse solutions. To this end, the Python-ARM Radar Toolkit (Py-ART) [1] becomes increasingly important. By providing an open architecture for implementation of retrieval techniques, Py-ART has attracted three very different approaches to the phase processing problem: a fully variational technique, a finite impulse response filter technique [2], and a technique based on linear programming [3]. These either exist within the toolkit or in another open source package that uses the Py-ART architecture. This presentation will provide an overview of differential phase and specific differential phase observed at C- and S-band frequencies, the signal processing behind the three aforementioned techniques, and some examples of their application. The goal of this presentation is to highlight the importance of open source architectures such as Py-ART for geophysical retrievals. [1] Helmus, J.J. & Collis, S.M., (2016). The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language. JORS. 4(1), p.e25. DOI: http://doi.org/10.5334/jors.119 [2] Timothy J. Lang, David A. Ahijevych, Stephen W. Nesbitt, Richard E. Carbone, Steven A. Rutledge, and Robert Cifelli, 2007: Radar
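
    Whichever of the three Py-ART retrievals is chosen, the underlying operation is the same: smooth the measured differential phase along each ray and take half of its range derivative to obtain the specific differential phase (KDP). The generic numpy sketch below illustrates that operation only; it is not one of the Py-ART implementations.

```python
# Generic differential-phase processing sketch: smooth PhiDP along a ray with a
# moving-average (simple FIR) window, then estimate specific differential phase
# KDP = 0.5 * d(PhiDP)/dr. The Py-ART retrievals do this with far more care.
import numpy as np

def smooth_phidp(phidp_deg, window=11):
    """Moving-average smoothing of differential phase along one ray."""
    kernel = np.ones(window) / window
    return np.convolve(phidp_deg, kernel, mode="same")

def estimate_kdp(phidp_deg, gate_spacing_km):
    """KDP (deg/km) as half the range derivative of the smoothed PhiDP profile."""
    smoothed = smooth_phidp(phidp_deg)
    return 0.5 * np.gradient(smoothed, gate_spacing_km)

# Toy ray: a linear 40-degree phase accumulation over 50 km plus measurement noise.
ranges_km = np.arange(0.0, 50.0, 0.25)
phidp = 0.8 * ranges_km + np.random.default_rng(0).normal(0.0, 2.0, ranges_km.size)
kdp = estimate_kdp(phidp, gate_spacing_km=0.25)
print(f"median KDP: {np.median(kdp):.2f} deg/km (expected ~0.40)")
```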

  3. Implementation of a Brief Treatment Counseling Toolkit in Federally Qualified Healthcare Centers: Patient and Clinician Utilization and Satisfaction.

    Science.gov (United States)

    Brooks, Adam C; Chambers, Jaclyn E; Lauby, Jennifer; Byrne, Elizabeth; Carpenedo, Carolyn M; Benishek, Lois A; Medvin, Rachel; Metzger, David S; Kirby, Kimberly C

    2016-01-01

    The need to integrate behavioral health care within medical settings is widely recognized, and integrative care approaches are associated with improved outcomes for a range of disorders. As substance use treatment integration efforts expand within primary care settings, training behavioral health providers in evidence-based brief treatment models that are cost-effective and easily fit within the medical flow is essential. Guided by principles drawn from Diffusion of Innovations theory (Rogers, 2003) and the Consolidated Framework of Implementation Research (Damschroder et al., 2009), we adapted elements of Motivational Enhancement Therapy, cognitive-behavioral therapy, and 12-step facilitation into a brief counseling toolkit. The toolkit is a menu driven assortment of 35 separate structured clinical interventions that each include client takeaway resources to reinforce brief clinical contacts. We then implemented this toolkit in the context of a randomized clinical trial in three Federally Qualified Healthcare Centers. Behavioral Health Consultants (BHCs) used a pre-screening model wherein 10,935 patients received a brief initial screener, and 2011 received more in-depth substance use screening. Six hundred patients were assigned to either a single session brief intervention or an expanded brief treatment encompassing up to five additional sessions. We conducted structured interviews with patients, medical providers, and BHCs to obtain feedback on toolkit implementation. On average, patients assigned to brief treatment attended 3.29 sessions. Fifty eight percent of patients reported using most or all of the educational materials provided to them. Patients assigned to brief treatment reported that the BHC sessions were somewhat more helpful than did patients assigned to a single session brief intervention (p=.072). BHCs generally reported that the addition of the toolkit was helpful to their work in delivering screening and brief treatment. This work is significant

  4. Granger causality analysis implementation on MATLAB: a graphic user interface toolkit for fMRI data processing.

    Science.gov (United States)

    Zang, Zhen-Xiang; Yan, Chao-Gan; Dong, Zhang-Ye; Huang, Jian; Zang, Yu-Feng

    2012-01-30

    Many functional magnetic resonance imaging (fMRI) studies have indicated that Granger causality analysis (GCA) is a suitable method to reveal causal effects among brain regions. Based on another MATLAB GUI toolkit, Resting State fMRI Data Analysis Toolkit (REST), we implemented GCA on MATLAB as a graphical user interface (GUI) toolkit. This toolkit, namely REST-GCA, could output both the residual-based F and the signed-path coefficient. REST-GCA also integrates a program that could transform the distribution of residual-based F to an approximately normal distribution and then permit parametric statistical inference at the group level. Using REST-GCA, we tested the causal effect of the right frontal-insular cortex (rFIC) on each voxel in the whole brain, and vice versa, each voxel in the whole brain on the rFIC, in a voxel-wise way in a resting-state fMRI dataset from 30 healthy college students. Using the Jarque-Bera goodness-of-fit test and the Lilliefors goodness-of-fit test, we found that the transformation from F to F' and the further standardization from F' to Z score substantially improved the normality. The results of one sample t-tests on Z score showed bi-directional positive causal effect between rFIC and the dorsal anterior cingulate cortex (dACC). One sample t-tests on the signed-path coefficients showed positive causal effect from rFIC to dACC but negative from dACC to rFIC. All these results indicate that REST-GCA may be a useful toolkit for causal analysis of fMRI data. Copyright © 2011 Elsevier B.V. All rights reserved.
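
    The residual-based F statistic referred to above comes from comparing a restricted autoregression of the target signal on its own past with a full model that also includes the past of the candidate source. A minimal bivariate sketch of that computation (not the REST-GCA MATLAB code) follows.

```python
# Bivariate Granger causality sketch: does x help predict y beyond y's own history?
# Residual-based statistic: F = ((RSS_restricted - RSS_full) / p) / (RSS_full / (n - 2p - 1)).
# Minimal numpy illustration of the idea; not the REST-GCA MATLAB implementation.
import numpy as np

def lag_matrix(series, p):
    """Columns hold the series lagged by 1..p, row-aligned with the targets series[p:]."""
    return np.column_stack([series[p - k: len(series) - k] for k in range(1, p + 1)])

def granger_f(y, x, p=2):
    """F statistic for the hypothesis that x Granger-causes y, using p lags."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    target = y[p:]
    n = target.size
    restricted = np.column_stack([np.ones(n), lag_matrix(y, p)])
    full = np.column_stack([restricted, lag_matrix(x, p)])
    rss = lambda design: np.sum((target - design @ np.linalg.lstsq(design, target, rcond=None)[0]) ** 2)
    return ((rss(restricted) - rss(full)) / p) / (rss(full) / (n - 2 * p - 1))

# Toy data where x drives y with a one-sample delay.
rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.5)
print(f"F(x -> y) = {granger_f(y, x):.1f}   F(y -> x) = {granger_f(x, y):.1f}")
```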

  5. GEANT 4: an Object-Oriented toolkit for simulation in HEP

    CERN Multimedia

    Kent, P; Sirotenko, V; Komogorov, M; Pavliouk, A; Greeniaus, G L; Kayal, P I; Routenburg, P; Tanaka, S; Duellmann, D; Innocente, V; Paoli, S; Ranjard, F; Riccardi, F; Ruggier, M; Shiers, J; Egli, S; Kimura, A; Urban, P; Prior, S; Walkden, A; Forti, A; Magni, S; Strahl, K; Kokoulin, R; Braune, K; Volcker, C; Ullrich, T; Takahata, M; Nieminen, P; Ballocchi, G; Mora De Freitas, P; Verderi, M; Rybine, A; Langeveld, W; Nagamatsu, M; Hamatsu, R; Katayama, N; Chuma, J; Felawka, L; Gumplinger, P; Axen, D

    2002-01-01

    The GEANT4 software has been developed by a world-wide collaboration of about 100 scientists from over 40 institutions and laboratories participating in more than 10 experiments in Europe, Russia, Japan, Canada, and the United States. The GEANT4 detector simulation toolkit has been designed for the next generation of High Energy Physics (HEP) experiments, with primary requirements from the LHC, CP-violation, and heavy-ion experiments. In addition, GEANT4 also meets the requirements from the space and medical communities, thanks to very low energy extensions developed in a joint project with the European Space Agency (ESA). GEANT4 has exploited advanced software engineering techniques (for example PSS-05) and Object-Oriented technology to improve the validation process of the physics results, and at the same time to make possible distributed software design and development within the world-wide collaboration. Fifteen specialised working groups have been responsible for fields as diver...

  6. On the Design of a Parallel Object-Oriented Data Mining Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Kamath, C.; Cantu-Paz, E.

    2000-05-17

    As data mining techniques are applied to ever larger data sets, it is becoming clear that parallel processors will play an important role in reducing the turn around time for data analysis. In this paper, we describe the design of a parallel object-oriented toolkit for mining scientific data sets. After a brief discussion of our design goals, we describe our overall system design that uses data mining to find useful information in raw data in an iterative and interactive manner. Using decision trees as an example, we illustrate how the need to support flexibility and extensibility can make the parallel implementation of our algorithms very challenging. As this is work in progress, we also describe the solution approaches we are considering to address these challenges.

  7. Family meetings made simpler: a toolkit for the intensive care unit.

    Science.gov (United States)

    Nelson, Judith E; Walker, Amy S; Luhrs, Carol A; Cortez, Therese B; Pronovost, Peter J

    2009-12-01

    Although a growing body of evidence has associated the intensive care unit (ICU) family meeting with important, favorable outcomes for critically ill patients, their families, and health care systems, these meetings often fail to occur in a timely, effective, and reliable way. In this article, we describe 3 specific tools that we have developed as prototypes to promote more successful implementation of family meetings in the ICU: (1) a family meeting planner, (2) a meeting guide for families, and (3) a family meeting documentation template. We describe the essential features of these tools and ways that they might be adapted to meet the local needs of individual ICUs and to maximize acceptability and use. We also discuss the role of such tools in structuring a performance improvement initiative. Just as simple tools have helped reduce bloodstream infections, our hope is that the toolkit presented here will help critical care teams to meet the important communication needs of ICU families.

  8. A Flooding Induced Station Blackout Analysis for a Pressurized Water Reactor Using the RISMC Toolkit

    Directory of Open Access Journals (Sweden)

    Diego Mandelli

    2015-01-01

    Full Text Available In this paper we evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., component/system activation) and to perform statistical analyses. In our case, the simulation of the flooding is performed by using an advanced smoothed particle hydrodynamics code called NEUTRINO. The obtained results allow the user to investigate and quantify the impact of timing and sequencing of events on system safety. In addition, the impact of the power uprate is determined in terms of both core damage probability and safety margins.
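
    At a cartoon level, the timing-and-sequencing analysis can be illustrated by Monte Carlo sampling of a few uncertain event times and counting the fraction of samples in which power recovery arrives after battery depletion. All distributions and numbers in the sketch below are made-up placeholders, not RELAP-7/RAVEN or NEUTRINO models.

```python
# Cartoon of a RISMC-style timing analysis: sample uncertain event times, decide
# for each sample whether power is restored before batteries deplete, and report
# the fraction of "core damage" outcomes. All numbers are illustrative placeholders.
import random

def sample_sequence(rng):
    battery_life_h = rng.triangular(4.0, 8.0, 6.0)             # DC battery depletion time
    flood_delay_h = rng.uniform(0.5, 2.0)                      # time until flooding disables AC power
    recovery_h = flood_delay_h + rng.lognormvariate(1.5, 0.5)  # offsite power recovery time
    return recovery_h > battery_life_h                         # True means core damage in this sample

def core_damage_probability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = sum(sample_sequence(rng) for _ in range(n_samples))
    return failures / n_samples

print(f"Estimated core damage probability: {core_damage_probability():.3f}")
```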

  9. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Merzari, E. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Yu, Y. Q. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Thomas, J. W. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Obabko, A. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Jain, Rajeev [Argonne National Lab. (ANL), Argonne, IL (United States)]; Mahadevan, Vijay [Argonne National Lab. (ANL), Argonne, IL (United States)]; Tautges, Timothy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Solberg, Jerome [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Ferencz, Robert Mark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Whitesides, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-12-21

    This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, demonstrating the feasibility of the fully-integrated simulation.

  10. Integrating the protein and metabolic engineering toolkits for next-generation chemical biosynthesis.

    Science.gov (United States)

    Pirie, Christopher M; De Mey, Marjan; Jones Prather, Kristala L; Ajikumar, Parayil Kumaran

    2013-04-19

    Through microbial engineering, biosynthesis has the potential to produce thousands of chemicals used in everyday life. Metabolic engineering and synthetic biology are fields driven by the manipulation of genes, genetic regulatory systems, and enzymatic pathways for developing highly productive microbial strains. Fundamentally, it is the biochemical characteristics of the enzymes themselves that dictate flux through a biosynthetic pathway toward the product of interest. As metabolic engineers target sophisticated secondary metabolites, there has been little recognition of the reduced catalytic activity and increased substrate/product promiscuity of the corresponding enzymes compared to those of central metabolism. Thus, fine-tuning these enzymatic characteristics through protein engineering is paramount for developing high-productivity microbial strains for secondary metabolites. Here, we describe the importance of protein engineering for advancing metabolic engineering of secondary metabolism pathways. This pathway integrated enzyme optimization can enhance the collective toolkit of microbial engineering to shape the future of chemical manufacturing.

  11. Advancements in Wind Integration Study Data Modeling: The Wind Integration National Dataset (WIND) Toolkit; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C.; Hodge, B. M.; Orwig, K.; Jones, W.; Searight, K.; Getman, D.; Harrold, S.; McCaa, J.; Cline, J.; Clark, C.

    2013-10-01

    Regional wind integration studies in the United States require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind data sets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as be time synchronized with available load profiles. The Wind Integration National Dataset (WIND) Toolkit described in this paper fulfills these requirements. A wind resource dataset, wind power production time series, and simulated forecasts from a numerical weather prediction model run on a nationwide 2-km grid at 5-min resolution will be made publicly available for more than 110,000 onshore and offshore wind power production sites.

  12. CHASM and SNVBox: toolkit for detecting biologically important single nucleotide mutations in cancer.

    Science.gov (United States)

    Wong, Wing Chung; Kim, Dewey; Carter, Hannah; Diekhans, Mark; Ryan, Michael C; Karchin, Rachel

    2011-08-01

    Thousands of cancer exomes are currently being sequenced, yielding millions of non-synonymous single nucleotide variants (SNVs) of possible relevance to disease etiology. Here, we provide a software toolkit to prioritize SNVs based on their predicted contribution to tumorigenesis. It includes a database of precomputed, predictive features covering all positions in the annotated human exome and can be used either stand-alone or as part of a larger variant discovery pipeline. MySQL database, source code and binaries are freely available for academic/government use at http://wiki.chasmsoftware.org; source is in Python and C++. Requires a 32- or 64-bit Linux system (tested on Fedora Core 8, 10, 11 and Ubuntu 10), Python ≥ 2.5, MySQL server > 5.0, 60 GB of available hard disk space (50 MB for software and data files, 40 GB for the MySQL database dump when uncompressed), and 2 GB of RAM.

  13. Mocapy++ - A toolkit for inference and learning in dynamic Bayesian networks

    Directory of Open Access Journals (Sweden)

    Hamelryck Thomas

    2010-03-01

    Full Text Available Abstract Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations). Results The program package is freely available under the GNU General Public Licence (GPL) from SourceForge http://sourceforge.net/projects/mocapy. The package contains the source for building the Mocapy++ library, several usage examples and the user manual. Conclusions Mocapy++ is especially suitable for constructing probabilistic models of biomolecular structure, due to its support for directional statistics. In particular, it supports the Kent distribution on the sphere and the bivariate von Mises distribution on the torus. These distributions have proven useful to formulate probabilistic models of protein and RNA structure in atomic detail.

  14. Application of GEANT4 radiation transport toolkit to dose calculations in anthropomorphic phantoms

    CERN Document Server

    Rodrigues, P; Peralta, L; Alves, C; Chaves, A; Lopes, M C

    2003-01-01

    In this paper we present the implementation of a dose calculation application based on the GEANT4 Monte Carlo toolkit. Validation studies were performed with a homogeneous water phantom and an Alderson-Rando anthropomorphic phantom, both irradiated with high-energy photon beams produced by a clinical linear accelerator. As input, this tool requires computed tomography images for automatic codification of voxel-based geometries and phase space distributions to characterize the incident radiation field. Simulation results were compared with ionization chamber and thermoluminescent dosimetry data and with commercial treatment planning system calculations. In the homogeneous water phantom, overall agreement with measurements was within 1-2%. For the anthropomorphic simulated setups (thorax and head irradiation), mean differences between GEANT4 and TLD measurements were less than 2%. Significant differences between GEANT4 and a semi-analytical algorithm implemented in the treatment planning system were found in low density ...

  15. Modular toolkit for Data Processing (MDP): a Python data processing framework

    Directory of Open Access Journals (Sweden)

    Tiziano Zito

    2009-01-01

    Full Text Available Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The newly implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.
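
    A typical use of the framework chains nodes into a flow, trains it, and executes it on data. The sketch below follows the pattern shown in MDP's documentation (Flow, PCANode, SFANode); the exact calls should be checked against the installed MDP version.

```python
# Sketch of the node-and-flow pattern described above, based on MDP's documented
# interface (Flow, PCANode, SFANode); verify against the installed MDP version.
import mdp
import numpy as np

# Toy data: 1000 observations of 20 variables.
x = np.random.random((1000, 20))

# Chain two processing units: PCA down to 5 components, then slow feature analysis.
flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5), mdp.nodes.SFANode()])
flow.train(x)               # trainable nodes are fitted in sequence
features = flow.execute(x)  # executing the flow applies every node in order
print(features.shape)       # (1000, 5)
```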

  16. BamTools: a C++ API and toolkit for analyzing and managing BAM files.

    Science.gov (United States)

    Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T

    2011-06-15

    Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.com/pezmaster31/bamtools.

  17. ChemDoodle Web Components: HTML5 toolkit for chemical graphics, interfaces, and informatics.

    Science.gov (United States)

    Burger, Melanie C

    2015-01-01

    ChemDoodle Web Components (abbreviated CWC, iChemLabs, LLC) is a light-weight (~340 KB) JavaScript/HTML5 toolkit for chemical graphics, structure editing, interfaces, and informatics based on the proprietary ChemDoodle desktop software. The library uses canvas and WebGL technologies and other HTML5 features to provide solutions for creating chemistry-related applications for the web on desktop and mobile platforms. CWC can serve a broad range of scientific disciplines including crystallography, materials science, organic and inorganic chemistry, biochemistry and chemical biology. CWC is freely available for in-house use and is open source (GPL v3) for all other uses. Graphical abstract: Add interactive 2D and 3D chemical sketchers, graphics, and spectra to websites and apps with ChemDoodle Web Components.

  18. Quick Way to Port Existing C/C++ Chemoinformatics Toolkits to the Web Using Emscripten.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi

    2017-10-23

    Emscripten is a special open source compiler that compiles C and C++ code into JavaScript. By utilizing this compiler, some typical C/C++ chemoinformatics toolkits and libraries are quickly ported to the web. The compiled JavaScript files have sizes similar to those of the native programs, and from a series of constructed benchmarks, the performance of the compiled JavaScript code is also close to that of the native code and is better than that of handwritten JavaScript code. Therefore, we believe that Emscripten is a feasible and practical tool for reusing existing C/C++ code on the web, and many other chemoinformatics or molecular calculation software tools can also be easily ported with Emscripten.

  19. VGSC: A Web-Based Vector Graph Toolkit of Genome Synteny and Collinearity

    Directory of Open Access Journals (Sweden)

    Yiqing Xu

    2016-01-01

    Full Text Available Background. In order to understand the colocalization of genetic loci amongst species, synteny and collinearity analysis is a frequent task in comparative genomics research. However, many analysis software packages are not effective in visualizing results. Problems include lack of graphic visualization, simple representation, or inextensible output formats. Moreover, higher-throughput sequencing technology requires higher-resolution image output. Implementation. To fill this gap, this paper publishes VGSC, the Vector Graph toolkit of genome Synteny and Collinearity, and its online service, to visualize synteny and collinearity in common graphical formats, including both raster (JPEG, Bitmap, and PNG) and vector (SVG, EPS, and PDF) graphics. Result. Users can upload sequence alignments from BLAST and collinearity relationships from synteny analysis tools. The website can generate the vector or raster graphical results automatically. We also provide a Java-based bytecode binary to enable command-line execution.

  20. The Design and Implementation of an Automated Security Compliance Toolkit: A Pedagogical Exercise

    Directory of Open Access Journals (Sweden)

    Guillermo Francia III

    2007-12-01

    Full Text Available The demand, through government regulations, for the preservation of the security, integrity, and privacy of corporate and customer information is increasing at an unprecedented pace. Government and private entities struggle to comply with these regulations through various means, using both automated and manual controls. This paper presents an automated security compliance toolkit that is designed and developed using mostly open source tools to demonstrate that (1) meeting regulatory compliance does not need to be a very expensive proposition and (2) an undertaking of this magnitude can serve as a pedagogical exercise for students in the areas of collaboration, project management, software engineering, information assurance, and regulatory compliance.

  1. The Climate Resilience Toolkit: Central gateway for risk assessment and resilience planning at all governance scales

    Science.gov (United States)

    Herring, D.; Lipschultz, F.

    2016-12-01

    As people and organizations grapple with a changing climate amid a range of other factors simultaneously shifting, there is a need for credible, legitimate & salient scientific information in useful formats. In addition, an assessment framework is needed to guide the process of planning and implementing projects that allow communities and businesses to adapt to specific changing conditions, while also building overall resilience to future change. We will discuss how the U.S. Climate Resilience Toolkit (CRT) can improve people's ability to understand and manage their climate-related risks and opportunities, and help them make their communities and businesses more resilient. In close coordination with the U.S. Climate Data Initiative, the CRT is continually evolving to offer actionable authoritative information, relevant tools, and subject matter expertise from across the U.S. federal government in one easy-to-use location. The Toolkit's "Climate Explorer" is designed to help people understand potential climate conditions over the course of this century. It offers easy access to downloadable maps, graphs, and data tables of observed and projected temperature, precipitation and other decision-relevant climate variables dating back to 1950 and out to 2100. Since climate is only one of many changing factors affecting decisions about the future, it also ties climate information to a wide range of relevant variables to help users explore vulnerabilities and impacts. New topic areas have been added, such as "Fisheries," "Regions," and "Built Environment" sections that feature case studies and personal experiences in making adaptation decisions. A curated "Reports" section is integrated with semantic web capabilities to help users locate the most relevant information sources. As part of the USGCRP's sustained assessment process, the CRT is aligning with other federal activities, such as the upcoming 4th National Climate Assessment.

  2. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    Science.gov (United States)

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

    The Tobii EyeX Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with an SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e., accuracy < 0.6°, precision < 0.25°, latency < 50 ms, and sampling frequency ≈55 Hz) is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, as well as saccadic, smooth pursuit, and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as in a subset of basic and clinical research settings.
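
    The accuracy and precision figures quoted above are straightforward to compute from raw gaze samples expressed in degrees of visual angle: accuracy as the mean angular offset from a known fixation target, precision as the RMS of sample-to-sample displacements. The sketch below is a generic illustration, not the authors' Matlab Toolkit.

```python
# Gaze-quality metrics of the kind reported above, computed from gaze samples in
# degrees of visual angle: accuracy = mean offset from the fixation target,
# precision = RMS of successive-sample differences. Generic sketch only.
import numpy as np

def accuracy_deg(gaze_xy, target_xy):
    """Mean Euclidean angular offset of gaze samples from the target (degrees)."""
    return float(np.mean(np.linalg.norm(gaze_xy - target_xy, axis=1)))

def precision_rms_deg(gaze_xy):
    """RMS of sample-to-sample angular displacement (degrees)."""
    steps = np.diff(gaze_xy, axis=0)
    return float(np.sqrt(np.mean(np.sum(steps ** 2, axis=1))))

rng = np.random.default_rng(7)
target = np.array([0.0, 0.0])
gaze = rng.normal(loc=[0.3, -0.2], scale=0.15, size=(55, 2))  # ~1 s of samples at 55 Hz
print(f"accuracy {accuracy_deg(gaze, target):.2f} deg, "
      f"precision {precision_rms_deg(gaze):.2f} deg RMS")
```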

  3. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)

    Science.gov (United States)

    Dowd, Scot E; Zaragoza, Joaquin; Rodriguez, Javier R; Oliver, Melvin J; Payton, Paxton R

    2005-01-01

    Background BLAST is one of the most common and useful tools for genetic research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy-to-use, fault-tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LANs). W.ND-BLAST provides intuitive Graphic User Interfaces (GUI) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high-throughput BLAST job that took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is freely downloadable from
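
    The central idea of farming BLAST queries out to multiple workers carries over to any environment. The sketch below splits a FASTA file into per-record jobs and runs NCBI BLAST+ blastn concurrently in a local process pool; W.ND-BLAST itself distributes work across Windows machines on a LAN, and the paths and database name here are placeholders.

```python
# Conceptual sketch of distributing BLAST work: split a FASTA file into per-record
# queries and run NCBI BLAST+ (blastn) concurrently. A local process pool stands in
# for W.ND-BLAST's LAN of Windows nodes. Paths and the database name are placeholders.
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def split_fasta(path):
    """Yield (header, sequence) records from a FASTA file."""
    header, seq = None, []
    for line in Path(path).read_text().splitlines():
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(seq)
            header, seq = line[1:].split()[0], []
        else:
            seq.append(line.strip())
    if header is not None:
        yield header, "".join(seq)

def run_blast(job):
    """Write one query to its own FASTA file and run blastn against a local database."""
    name, sequence = job
    query = Path(f"{name}.fasta")
    query.write_text(f">{name}\n{sequence}\n")
    out = Path(f"{name}.blast.txt")
    subprocess.run(["blastn", "-query", str(query), "-db", "local_nt",
                    "-out", str(out), "-outfmt", "6"], check=True)
    return out

if __name__ == "__main__":
    jobs = list(split_fasta("queries.fasta"))
    with ProcessPoolExecutor(max_workers=4) as pool:
        for result in pool.map(run_blast, jobs):
            print("finished", result)
```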

  4. ChemDataExtractor: A Toolkit for Automated Extraction of Chemical Information from the Scientific Literature.

    Science.gov (United States)

    Swain, Matthew C; Cole, Jacqueline M

    2016-10-24

    The emergence of "big data" initiatives has led to the need for tools that can automatically extract valuable chemical information from large volumes of unstructured data, such as the scientific literature. Since chemical information can be present in figures, tables, and textual paragraphs, successful information extraction often depends on the ability to interpret all of these domains simultaneously. We present a complete toolkit for the automated extraction of chemical entities and their associated properties, measurements, and relationships from scientific documents that can be used to populate structured chemical databases. Our system provides an extensible, chemistry-aware, natural language processing pipeline for tokenization, part-of-speech tagging, named entity recognition, and phrase parsing. Within this scope, we report improved performance for chemical named entity recognition through the use of unsupervised word clustering based on a massive corpus of chemistry articles. For phrase parsing and information extraction, we present the novel use of multiple rule-based grammars that are tailored for interpreting specific document domains such as textual paragraphs, captions, and tables. We also describe document-level processing to resolve data interdependencies and show that this is particularly necessary for the autogeneration of chemical databases since captions and tables commonly contain chemical identifiers and references that are defined elsewhere in the text. The performance of the toolkit to correctly extract various types of data was evaluated, affording an F-score of 93.4%, 86.8%, and 91.5% for extracting chemical identifiers, spectroscopic attributes, and chemical property attributes, respectively; set against the CHEMDNER chemical name extraction challenge, ChemDataExtractor yields a competitive F-score of 87.8%. All tools have been released under the MIT license and are available to download from http://www.chemdataextractor.org .

  5. The Idea and Concept of Metos3D: A Marine Ecosystem Toolkit for Optimization and Simulation in 3D

    CERN Document Server

    Piwonski, Jaroslaw

    2014-01-01

    The simulation and parameter optimization of coupled ocean circulation and ecosystem models in three space dimensions is one of the most challenging tasks in numerical climate research. Here we present a scientific toolkit that aims at supporting researchers by defining clear coupling interfaces, providing state-of-the-art numerical methods for simulation, parallelization and optimization, while using only freely available and (to a great extent) platform-independent software. Besides defining a user-friendly coupling interface (API) for marine ecosystem or biogeochemical models, we heavily rely on the Portable, Extensible Toolkit for Scientific Computation (PETSc) developed at Argonne Nat. Lab. for a wide variety of parallel linear and non-linear solvers and optimizers. We specifically focus on the usage of matrix-free Newton-Krylov methods for the fast computation of steady periodic solutions, and make use of the Transport Matrix Method (TMM) introduced by Khatiwala et al.

  6. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    Science.gov (United States)

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
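
    The kind of phasic analysis such a toolkit automates can be sketched briefly: convert R-peak times to inter-beat intervals, express them as beat-by-beat heart rate, and subtract a pre-stimulus baseline. Real toolkits, including the one described above, use more careful second-by-second weighted averaging; the event times below are toy data.

```python
# Minimal phasic heart rate sketch: R-peak times -> inter-beat intervals -> beat-level
# heart rate -> change from a pre-stimulus baseline. Real toolkits use more careful
# second-by-second weighted averaging; these event times are toy data.
import numpy as np

def phasic_heart_rate(r_peak_times_s, stimulus_onset_s, baseline_s=5.0):
    ibi = np.diff(r_peak_times_s)                 # inter-beat intervals (s)
    beat_times = r_peak_times_s[1:]               # time at which each interval ends
    hr = 60.0 / ibi                               # beats per minute
    baseline_mask = (beat_times >= stimulus_onset_s - baseline_s) & (beat_times < stimulus_onset_s)
    baseline_hr = hr[baseline_mask].mean()
    post_mask = beat_times >= stimulus_onset_s
    return beat_times[post_mask], hr[post_mask] - baseline_hr  # phasic change (bpm)

# Toy data: ~72 bpm at rest, briefly decelerating after a stimulus at t = 10 s.
peaks = np.cumsum(np.r_[0.0, np.full(12, 0.83), np.full(6, 0.95), np.full(12, 0.83)])
times, delta_hr = phasic_heart_rate(peaks, stimulus_onset_s=10.0)
print(np.round(delta_hr[:5], 1))  # the initial deceleration shows up as negative values
```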

  7. Peer support for families of children with complex needs: Development and dissemination of a best practice toolkit.

    Science.gov (United States)

    Schippke, J; Provvidenza, C; Kingsnorth, S

    2017-11-01

    Benefits of peer support interventions for families of children with disabilities and complex medical needs have been described in the literature. An opportunity to create an evidence-informed resource to synthesize best practices in peer support for program providers was identified. The objective of this paper is to describe the key activities used to develop and disseminate the Peer Support Best Practice Toolkit. This project was led by a team of knowledge translation experts at a large pediatric rehabilitation hospital using a knowledge exchange framework. An integrated knowledge translation approach was used to engage stakeholders in the development process through focus groups and a working group. To capture best practices in peer support, a rapid evidence review and review of related resources were completed. Case studies were also included to showcase practice-based evidence. The toolkit is freely available online for download and is structured into four sections: (a) background and models of peer support, (b) case studies of programs, (c) resources, and (d) rapid evidence review. A communications plan was developed to disseminate the resource and generate awareness through presentations, social media, and champion engagement. Eight months postlaunch, the peer support website received more than 2,400 webpage hits. Early indicators suggest high relevance of this resource among stakeholders. The toolkit format was valuable to synthesize and share best practices in peer support. Strengths of the work include the integrated approach used to develop the toolkit and the inclusion of both the published research literature and experiential evidence. © 2017 John Wiley & Sons Ltd.

  8. A Toolkit to Study Sensitivity of the Geant4 Predictions to the Variations of the Physics Model Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fields, Laura [Fermilab]; Genser, Krzysztof [Fermilab]; Hatcher, Robert [Fermilab]; Kelsey, Michael [SLAC]; Perdue, Gabriel [Fermilab]; Wenzel, Hans [Fermilab]; Wright, Dennis H. [SLAC]; Yarba, Julia [Fermilab]

    2017-08-21

    Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to the variations of the model parameters, or what uncertainties are associated with a particular tune of a Geant4 physics model, or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.

  9. Increasing minerals industry by-product re-use through the application of a regional synergy toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Corder, G.D. [Queensland Univ., St. Lucia, (Australia); Bossilkov, A. [Curtin Univ. of Technology, Perth (Australia). Centre of Excellence in Cleaner Production

    2008-07-01

    Better use of resources and less waste in a given geographic region can be attributed to regional synergies or the re-use of industry by-products. In Australia, research at the Co-operative Research Centre for Sustainable Resource Planning (CSRP) has demonstrated that although the number of regional synergies around the world is increasing, many have developed opportunistically, and not through a more rigorous and systematic design process. This paper outlined the main features of a toolkit, designed by CSRP to identify and prioritize synergy opportunities for new and existing industries within an industrial region. The paper also presented the results of the toolkit's application in several intensive minerals processing regions. Specifically, the paper described the CSRP regional synergies research activities; the regional synergy opportunity tool; the preliminary assessment tool; inventory tools including the water tool, energy tool, and material by-product tool; and the screening tool. The benefits of the various tools were also discussed. It was concluded that the toolkit described in this paper provides a systematic identification and preliminary evaluation of synergy opportunities that can help industry develop more sustainable initiatives, as demonstrated through its testing and application in various minerals processing industrial regions. 12 refs., 3 tabs., 2 figs.

  10. Modeling Multiphase Coastal and Hydraulic Processes in an Interactive Python Environment with the Open Source Proteus Toolkit

    Science.gov (United States)

    Kees, C. E.; Farthing, M. W.; Ahmadia, A. J.; Bakhtyar, R.; Miller, C. T.

    2014-12-01

    Hydrology is dominated by multiphase flow processes, due to the importance of capturing water's interaction with soil and air phases. Unfortunately, many different mathematical model formulations are required to model particular processes and scales of interest, and each formulation often requires specialized numerical methods. The Proteus toolkit is a software package for research on models for coastal and hydraulic processes and improvements in numerics, particularly 3D multiphase processes and parallel numerics. The models considered include multiphase flow, shallow water flow, turbulent free surface flow, and various flow-driven processes. We will discuss the objectives of Proteus and recent evolution of the toolkit's design as well as present examples of how it has been used to construct computational models of multiphase flows for the US Army Corps of Engineers. Proteus is also an open source toolkit authored primarily within the US Army Corps of Engineers, and used, developed, and maintained by a small community of researchers in both theoretical modeling and computational methods research. We will discuss how open source and community development practices have played a role in the creation of Proteus.

  11. Development of the ITHACA Toolkit for monitoring human rights and general health care in psychiatric and social care institutions.

    Science.gov (United States)

    Randall, J; Thornicroft, G; Burti, L; Katschnig, H; Lewis, O; Russo, J; Shaw, T; Wahlbeck, K; Rose, D

    2013-09-01

    Background. Human rights violations are commonly experienced by people in psychiatric and social care institutions. States and private organizations providing such health and social services must comply with international human rights law. Monitoring of such compliance is increasingly recognized as a vital component in ensuring that rights are respected and violations are brought out in the open, remedied and prevented. Aims. The Institutional Treatment, Human Rights and Care Assessment (ITHACA) project produced a method to document violations and good practice with the aim of preventing human rights violations and improving general health care practice in psychiatric and social care institutions (www.ithacastudy.eu). Methods. A methodological and implementation study conducted across 15 European countries developed and assessed the ITHACA Toolkit in monitoring visits to 87 mental health organizations. Results. The toolkit is available in 13 European languages and has demonstrated applicability in a range of contexts and conditions. The information gathered through monitoring visits can document both good practice and areas for improvement. Conclusions. The ITHACA Toolkit is an acceptable and feasible method for the systematic monitoring of human rights and general health care in psychiatric and social care institutions that explicitly calls for the participation of service users in the monitoring of human rights violations and general health care practice.

  12. Mobile-Assisted Vocabulary Learning: A Review Study

    Directory of Open Access Journals (Sweden)

    Parichehr Afzali

    2017-04-01

    Full Text Available Mobile phones are becoming increasingly accepted tools for language learning. One aspect of the English language that has been investigated in mobile-assisted language learning (MALL) is vocabulary. This study reviewed studies conducted in various contexts on the effect of MALL on vocabulary learning. We searched some of the most prominent databases, such as Science Direct, Wiley, Scopus and Oxford, to find these studies, believing that this review can have pedagogical implications for future researchers and language teachers. We selected studies conducted in different countries, including Malaysia, Taiwan, Korea, China, Japan, Iran, Saudi Arabia and Turkey. Thirty studies were selected purposively in this way. Some of the main features of these studies are elaborated on in the discussion section. Pedagogical implications are discussed.

  13. The AAG's ALIGNED Toolkit: A Place-based Approach to Fostering Diversity in the Geosciences

    Science.gov (United States)

    Rodrigue, C. M.

    2012-12-01

    Where do we look to attract a more diverse group of students to academic programs in geography and the geosciences? What do we do once we find them? This presentation introduces the ALIGNED Toolkit developed by the Association of American Geographers, with funding from the NSF's Opportunities to Enhance Diversity in the Geosciences (OEDG) Program. ALIGNED (Addressing Locally-tailored Information Infrastructure and Geoscience Needs for Enhancing Diversity) seeks to align the needs of university departments and underrepresented students by drawing upon the intellectual wealth of geography and spatial science to provide better informed, knowledge-based action to enhance diversity in higher education and the geoscience workforce. The project seeks to inform and transform the ways in which departments and programs envision and realize their own goals to enhance diversity, promote inclusion, and broaden participation. We also seek to provide the data, information, knowledge, and best practices needed in order to enhance the recruitment and retention of underrepresented students. The ALIGNED Toolkit is currently in a beta release, available to 13 pilot departments and 50 testing departments of geography/geosciences. It consolidates a variety of data from departments, the U.S. Census Bureau, and the U.S. Department of Education's National Center for Education Statistics to provide interactive, GIS-based visualizations across multiple scales. It also incorporates a place-based, geographic perspective to support departments in their efforts to enhance diversity. A member of ALIGNED's senior personnel, who is also a representative of one of the pilot departments, will provide an overview and preview of the tool while sharing her department's experiences in progressing toward its diversity goals. A brief discussion on how geoscience departments might benefit from the ALIGNED approach and resources will follow. Undergraduate advisors, graduate program directors, department

  14. Prevention literacy: community-based advocacy for access and ownership of the HIV prevention toolkit.

    Science.gov (United States)

    Parker, Richard G; Perez-Brumer, Amaya; Garcia, Jonathan; Gavigan, Kelly; Ramirez, Ana; Milnor, Jack; Terto, Veriano

    2016-01-01

    Critical technological advances have yielded a toolkit of HIV prevention strategies. This literature review sought to provide contextual and historical reflection needed to bridge the conceptual gap between clinical efficacy and community effectiveness (i.e. knowledge and usage) of existing HIV prevention options, especially in resource-poor settings. Between January 2015 and October 2015, we reviewed scholarly and grey literatures to define treatment literacy and health literacy and assess the current need for literacy related to HIV prevention. The review included searches in electronic databases including MEDLINE, PsycINFO, PubMed, and Google Scholar. Permutations of the following search terms were used: "treatment literacy," "treatment education," "health literacy," and "prevention literacy." Through an iterative process of analyses and searches, titles and/or abstracts and reference lists of retrieved articles were reviewed for additional articles, and historical content analyses of grey literature and websites were additionally conducted. Treatment literacy was a well-established concept developed in the global South, which was later partially adopted by international agencies such as the World Health Organization. Treatment literacy emerged as more effective antiretroviral therapies became available. Developed from popular pedagogy and grassroots efforts during an intense struggle for treatment access, treatment literacy addressed the need to extend access to underserved communities and low-income settings that might otherwise be excluded from access. In contrast, prevention literacy is absent in the recent surge of new biomedical prevention strategies; prevention literacy was scarcely referenced and undertheorized in the available literature. Prevention efforts today include multimodal techniques, which jointly comprise a toolkit of biomedical, behavioural, and structural/environmental approaches. However, linkages to community advocacy and mobilization

  15. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST

    Directory of Open Access Journals (Sweden)

    Oliver Melvin J

    2005-04-01

    Full Text Available Abstract Background BLAST is one of the most common and useful tools for Genetic Research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy to use, fault tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LANs). W.ND-BLAST provides intuitive Graphic User Interfaces (GUIs) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is
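
    W.ND-BLAST itself is a Windows .NET application that farms queries out to networked machines. As a minimal, purely local analogue of the same idea, the Python sketch below splits a query FASTA file into chunks and runs NCBI BLAST+ blastn on the chunks in parallel worker processes; the input file and database names are placeholders.

      # Split a query FASTA into chunks and BLAST them in parallel on one machine.
      # This only illustrates the divide-and-distribute idea behind W.ND-BLAST.
      import subprocess
      from multiprocessing import Pool

      def split_fasta(path, n_chunks):
          with open(path) as handle:
              records = handle.read().split(">")[1:]
          paths = []
          for i in range(n_chunks):
              chunk_path = f"chunk_{i}.fasta"
              with open(chunk_path, "w") as out:
                  out.write("".join(">" + rec for rec in records[i::n_chunks]))
              paths.append(chunk_path)
          return paths

      def run_blast(chunk_path):
          out_path = chunk_path.replace(".fasta", ".tsv")
          subprocess.run(["blastn", "-query", chunk_path, "-db", "my_database",
                          "-outfmt", "6", "-out", out_path], check=True)
          return out_path

      if __name__ == "__main__":
          with Pool(processes=4) as pool:
              print(pool.map(run_blast, split_fasta("queries.fasta", 4)))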

  16. Dissemination of Earth Remote Sensing Data for Use in the NOAA/NWS Damage Assessment Toolkit

    Science.gov (United States)

    Molthan, Andrew; Burks, Jason; Camp, Parks; McGrath, Kevin; Bell, Jordan

    2015-01-01

    The National Weather Service has developed the Damage Assessment Toolkit (DAT), an application for smartphones and tablets that allows for the collection, geolocation, and aggregation of various damage indicators that are collected during storm surveys. The DAT supports the often labor-intensive process where meteorologists venture into the storm-affected area, allowing them to acquire geotagged photos of the observed damage while also assigning estimated EF-scale categories based upon their observations. Once the data are collected, the DAT infrastructure aggregates the observations into a server that allows other meteorologists to perform quality control and other analysis steps before completing their survey and making the resulting data available to the public. In addition to in-person observations, Earth remote sensing from operational, polar-orbiting satellites can support the damage assessment process by identifying portions of damage tracks that may be missed due to road limitations, access to private property, or time constraints. Products resulting from change detection techniques can identify damage to vegetation and the land surface, aiding in the survey process. In addition, higher-resolution commercial imagery can corroborate ground-based surveys. As part of an ongoing collaboration, NASA and NOAA are working to integrate near real-time Earth remote sensing observations into the NOAA/NWS Damage Assessment Toolkit. This presentation will highlight recent developments in a streamlined approach for disseminating Earth remote sensing data via web mapping services and a new menu interface that has been integrated within the DAT. A review of current and future products will be provided, including products derived from MODIS and VIIRS for preliminary track identification, along with conduits for higher-resolution Landsat, ASTER, and commercial imagery as they become available. In addition to tornado damage
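
    The dissemination path described above relies on standard web mapping services. The sketch below shows what a basic OGC WMS GetMap request looks like from Python with the requests library; the endpoint URL and layer name are placeholders, not the actual NOAA/NWS or NASA service addresses.

      # Fetch a hypothetical damage-track overlay through a standard WMS GetMap call.
      import requests

      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "viirs_change_detection",   # placeholder layer name
          "styles": "",
          "crs": "EPSG:4326",
          "bbox": "33.0,-88.0,35.0,-86.0",      # lat/lon box over the survey area
          "width": "1024",
          "height": "1024",
          "format": "image/png",
      }
      response = requests.get("https://example.org/wms", params=params, timeout=60)
      response.raise_for_status()
      with open("damage_overlay.png", "wb") as out:
          out.write(response.content)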

  17. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    Science.gov (United States)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked-in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint / restore, data-recording, interactive variable manipulation (variable server), and an input-processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. The new approach was implemented while rewriting the Trick memory management component, to eliminate a

  19. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  20. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Full Text Available Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  1. A powerful toolkit for synthetic biology: Over 3.8 billion years of evolution.

    Science.gov (United States)

    Rothschild, Lynn J

    2010-04-01

    The combination of evolutionary with engineering principles will enhance synthetic biology. Conversely, synthetic biology has the potential to enrich evolutionary biology by explaining why some adaptive space is empty, on Earth or elsewhere. Synthetic biology, the design and construction of artificial biological systems, substitutes bio-engineering for evolution, which is seen as an obstacle. But because evolution has produced the complexity and diversity of life, it provides a proven toolkit of genetic materials and principles available to synthetic biology. Evolution operates on the population level, with the populations composed of unique individuals that are historical entities. The source of genetic novelty includes mutation, gene regulation, sex, symbiosis, and interspecies gene transfer. At a phenotypic level, variation derives from regulatory control, replication and diversification of components, compartmentalization, sexual selection and speciation, among others. Variation is limited by physical constraints such as diffusion, and chemical constraints such as reaction rates and membrane fluidity. While some of these tools of evolution are currently in use in synthetic biology, all ought to be examined for utility. A hybrid approach of synthetic biology coupled with fine-tuning through evolution is suggested.

  2. BioPig: a Hadoop-based analytic toolkit for large-scale sequence data.

    Science.gov (United States)

    Nordberg, Henrik; Bhatia, Karan; Wang, Kai; Wang, Zhong

    2013-12-01

    The recent revolution in sequencing technologies has led to an exponential growth of sequence data. As a result, most of the current bioinformatics tools become obsolete as they fail to scale with data. To tackle this 'data deluge', here we introduce the BioPig sequence analysis toolkit as one of the solutions that scale to data and computation. We built BioPig on the Apache's Hadoop MapReduce system and the Pig data flow language. Compared with traditional serial and MPI-based algorithms, BioPig has three major advantages: first, BioPig's programmability greatly reduces development time for parallel bioinformatics applications; second, testing BioPig with up to 500 Gb sequences demonstrates that it scales automatically with size of data; and finally, BioPig can be ported without modification on many Hadoop infrastructures, as tested with Magellan system at National Energy Research Scientific Computing Center and the Amazon Elastic Compute Cloud. In summary, BioPig represents a novel program framework with the potential to greatly accelerate data-intensive bioinformatics analysis.
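
    BioPig itself is driven through Hadoop and the Pig data flow language; as a single-machine illustration of the underlying map/reduce pattern, the short Python sketch below counts k-mers across a handful of toy reads.

      # "Map" each read to its k-mers, then "reduce" by aggregating counts.
      # Pure Python on toy data; BioPig scales the same pattern out on Hadoop.
      from collections import Counter

      def kmers(sequence, k):
          for i in range(len(sequence) - k + 1):
              yield sequence[i:i + k]

      reads = ["ACGTACGTGGA", "TTACGTACGTA", "GGACGTACGTT"]   # toy reads
      counts = Counter()
      for read in reads:
          counts.update(kmers(read, k=4))

      print(counts.most_common(3))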

  3. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
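
    To make the two-level design concrete, the sketch below derives a small panel of intensity and texture-like features from synthetic image arrays and feeds it to a scikit-learn classifier. It illustrates the workflow only; it is not CaPTk code and the data are random.

      # Level 1: extract a feature panel per image; level 2: fit a multivariate model.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      images = rng.random((40, 32, 32))          # 40 synthetic "scans"
      labels = rng.integers(0, 2, size=40)       # hypothetical outcome labels

      def feature_panel(img):
          return [img.mean(), img.std(), np.percentile(img, 90),
                  np.abs(np.diff(img, axis=0)).mean()]

      features = np.array([feature_panel(img) for img in images])
      model = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
      print("training accuracy:", model.score(features, labels))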

  4. Making the most of cloud storage - a toolkit for exploitation by WLCG experiments

    Science.gov (United States)

    Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea

    2017-10-01

    Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.
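
    As a minimal illustration of talking to an S3-compatible object store from Python (a stand-in for the FTS3/Davix/gfal2 tooling discussed above, not a replacement for it), the sketch below lists and downloads objects with boto3; the endpoint, bucket, and key names are placeholders.

      # List objects under a prefix and download one of them from an S3-style store.
      import boto3

      s3 = boto3.client("s3", endpoint_url="https://s3.example.org")  # placeholder endpoint
      listing = s3.list_objects_v2(Bucket="experiment-data", Prefix="run2024/")
      for entry in listing.get("Contents", []):
          print(entry["Key"], entry["Size"])

      s3.download_file("experiment-data", "run2024/events_0001.root", "events_0001.root")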

  5. A toolkit for GFP-mediated tissue-specific protein degradation in C. elegans.

    Science.gov (United States)

    Wang, Shaohe; Tang, Ngang Heok; Lara-Gonzalez, Pablo; Zhao, Zhiling; Cheerambathur, Dhanya K; Prevo, Bram; Chisholm, Andrew D; Desai, Arshad; Oegema, Karen

    2017-07-15

    Proteins that are essential for embryo production, cell division and early embryonic events are frequently reused later in embryogenesis, during organismal development or in the adult. Examining protein function across these different biological contexts requires tissue-specific perturbation. Here, we describe a method that uses expression of a fusion between a GFP-targeting nanobody and a SOCS-box containing ubiquitin ligase adaptor to target GFP-tagged proteins for degradation. When combined with endogenous locus GFP tagging by CRISPR-Cas9 or with rescue of a null mutant with a GFP fusion, this approach enables routine and efficient tissue-specific protein ablation. We show that this approach works in multiple tissues - the epidermis, intestine, body wall muscle, ciliated sensory neurons and touch receptor neurons - where it recapitulates expected loss-of-function mutant phenotypes. The transgene toolkit and the strain set described here will complement existing approaches to enable routine analysis of the tissue-specific roles of C. elegans proteins. © 2017. Published by The Company of Biologists Ltd.

  6. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
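
    A few of the traditional accuracy measures that such a verification system reports can be written in a handful of lines; the NumPy sketch below computes bias, RMSE, and correlation for toy model and observation series. It illustrates the metrics only and is not LVT code.

      # Compare a toy simulated series against toy observations.
      import numpy as np

      model = np.array([0.21, 0.25, 0.30, 0.28, 0.24])   # e.g. simulated soil moisture
      obs   = np.array([0.20, 0.27, 0.33, 0.26, 0.22])   # e.g. in-situ observations

      bias = np.mean(model - obs)
      rmse = np.sqrt(np.mean((model - obs) ** 2))
      corr = np.corrcoef(model, obs)[0, 1]
      print(f"bias={bias:.3f}  rmse={rmse:.3f}  r={corr:.3f}")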

  7. DeepInfer: Open-Source Deep Learning Deployment Toolkit for Image-Guided Therapy.

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A; Kapur, Tina; Wells, William M; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-02-11

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  8. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Andre; Bruno, Kenneth S.; Collett, James R.; Baker, Scott E.; Seiboth, Bernhard; Kubicek, Christian P.; Schmoll, Monika

    2012-01-02

    The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  9. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation.

    Directory of Open Access Journals (Sweden)

    Wei Shen

    Full Text Available FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q file include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools only implement some of these manipulations, and not particularly efficiently, and some are only available for certain operating systems. Furthermore, the complicated installation process of required packages and running environments can render these programs less user friendly. This paper describes a cross-platform ultrafast comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OSX, and can be directly used without any dependencies or pre-configurations. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on Github at https://github.com/shenwei356/seqkit.
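
    To show what two of the listed manipulations involve, the plain-Python sketch below deduplicates a FASTA file by sequence and draws a random sample of records; SeqKit performs such operations as compiled subcommands and is far faster on real data. The input file name is a placeholder.

      # Deduplicate FASTA records by sequence and sample up to 100 of them.
      import random

      def read_fasta(path):
          name, seq = None, []
          with open(path) as handle:
              for line in handle:
                  line = line.rstrip()
                  if line.startswith(">"):
                      if name is not None:
                          yield name, "".join(seq)
                      name, seq = line[1:], []
                  else:
                      seq.append(line)
              if name is not None:
                  yield name, "".join(seq)

      records = list(read_fasta("input.fasta"))                             # placeholder path
      unique = list({seq: (name, seq) for name, seq in records}.values())   # dedup by sequence
      sample = random.sample(unique, k=min(100, len(unique)))

      with open("sampled.fasta", "w") as out:
          for name, seq in sample:
              out.write(f">{name}\n{seq}\n")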

  10. The CRISPR-Cas9 technology: Closer to the ultimate toolkit for targeted genome editing.

    Science.gov (United States)

    Quétier, Francis

    2016-01-01

    The first period of plant genome editing was based on Agrobacterium, chemical mutagenesis by EMS (ethyl methanesulfonate) and ionizing radiation; each of these technologies led to randomly distributed genome modifications. The second period is associated with the discoveries of homing and meganuclease enzymes during the 80s and 90s, which were then engineered to provide efficient tools for targeted editing. From 2006 to 2012, a few crop plants were successfully and precisely modified using zinc-finger nucleases. A third wave of improvement in genome editing, which led to a dramatic decrease in off-target events, was achieved in 2009-2011 with the TALEN technology. The latest revolution surfaced in 2013 with the CRISPR-Cas9 system, whose high efficiency and technical ease of use are really impressive; scientists can use in-house kits or commercially available kits; the only two requirements are to carefully choose the location of the DNA double strand breaks to be induced and then to order an oligonucleotide. While this close-to-ultimate toolkit for targeted editing of genomes represents dramatic scientific progress which allows the development of more complex useful agronomic traits through synthetic biology, the social acceptance of genome editing remains regularly questioned by anti-GMO citizens and organizations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
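
    As a small illustration of "carefully choosing the location of the DNA double-strand break", the Python sketch below scans a toy sequence for SpCas9 NGG PAM sites and reports the 20-nt protospacer upstream of each; real guide design additionally scores off-targets, GC content, and other criteria.

      # Find candidate SpCas9 sites: a 20-nt protospacer immediately followed by an NGG PAM.
      import re

      sequence = "ATGCTGACCGTTGGAGGCCTACGGATCCGGTACCTTAGGCATGCAAGGG"  # toy sequence

      for match in re.finditer(r"(?=([ACGT]{21}GG))", sequence):
          site = match.group(1)
          protospacer, pam = site[:20], site[20:]
          cut_index = match.start() + 17        # Cas9 cuts roughly 3 bp upstream of the PAM
          print(f"protospacer={protospacer}  PAM={pam}  approx. cut index={cut_index}")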

  11. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  12. PODIO: An Event-Data-Model Toolkit for High Energy Physics Experiments

    Science.gov (United States)

    Gaede, F.; Hegner, B.; Mato, P.

    2017-10-01

    PODIO is a C++ library that supports the automatic creation of event data models (EDMs) and efficient I/O code for HEP experiments. It is developed as a new EDM Toolkit for future particle physics experiments in the context of the AIDA2020 EU programme. Experience from LHC and the linear collider community shows that existing solutions partly suffer from overly complex data models with deep object-hierarchies or unfavorable I/O performance. The PODIO project was created in order to address these problems. PODIO is based on the idea of employing plain-old-data (POD) data structures wherever possible, while avoiding deep object-hierarchies and virtual inheritance. At the same time it provides the necessary high-level interface towards the developer physicist, such as the support for inter-object relations and automatic memory-management, as well as a Python interface. To simplify the creation of efficient data models PODIO employs code generation from a simple yaml-based markup language. In addition, it was developed with concurrency in mind in order to support the use of modern CPU features, for example giving basic support for vectorization techniques.
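
    PODIO's generator emits C++ from a YAML description. Purely as a toy illustration of the general idea of generating plain data classes from a markup-style description (the schema below is invented and is not PODIO's datamodel syntax), the same thing can be sketched in a few lines of Python.

      # Generate simple plain-old-data classes from a dictionary "schema".
      from dataclasses import make_dataclass

      datamodel = {
          "Hit":   [("cellID", int), ("energy", float), ("time", float)],
          "Track": [("charge", int), ("momentum", float)],
      }

      generated = {name: make_dataclass(name, fields) for name, fields in datamodel.items()}
      Hit = generated["Hit"]
      print(Hit(cellID=42, energy=0.125, time=7.3))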

  13. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    Science.gov (United States)

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc.

  14. ImTK: an open source multi-center information management toolkit

    Science.gov (United States)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  15. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation.

    Science.gov (United States)

    Shen, Wei; Le, Shuai; Li, Yan; Hu, Fuquan

    2016-01-01

    FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q file include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools only implement some of these manipulations, and not particularly efficiently, and some are only available for certain operating systems. Furthermore, the complicated installation process of required packages and running environments can render these programs less user friendly. This paper describes a cross-platform ultrafast comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OSX, and can be directly used without any dependencies or pre-configurations. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on Github at https://github.com/shenwei356/seqkit.

  16. XPAT: a toolkit to conduct cross-platform association studies with heterogeneous sequencing datasets.

    Science.gov (United States)

    Yu, Yao; Hu, Hao; Bohlender, Ryan J; Hu, Fulan; Chen, Jiun-Sheng; Holt, Carson; Fowler, Jerry; Guthery, Stephen L; Scheet, Paul; Hildebrandt, Michelle At; Yandell, Mark; Huff, Chad D

    2017-12-23

    High-throughput sequencing data are increasingly being made available to the research community for secondary analyses, providing new opportunities for large-scale association studies. However, heterogeneity in target capture and sequencing technologies often introduce strong technological stratification biases that overwhelm subtle signals of association in studies of complex traits. Here, we introduce the Cross-Platform Association Toolkit, XPAT, which provides a suite of tools designed to support and conduct large-scale association studies with heterogeneous sequencing datasets. XPAT includes tools to support cross-platform aware variant calling, quality control filtering, gene-based association testing and rare variant effect size estimation. To evaluate the performance of XPAT, we conducted case-control association studies for three diseases, including 783 breast cancer cases, 272 ovarian cancer cases, 205 Crohn disease cases and 3507 shared controls (including 1722 females) using sequencing data from multiple sources. XPAT greatly reduced Type I error inflation in the case-control analyses, while replicating many previously identified disease-gene associations. We also show that association tests conducted with XPAT using cross-platform data have comparable performance to tests using matched platform data. XPAT enables new association studies that combine existing sequencing datasets to identify genetic loci associated with common diseases and other complex traits. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
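
    A gene-based association test can be as simple as a burden comparison of rare-variant carriers between cases and controls; the sketch below runs such a test with Fisher's exact test on invented carrier counts (only the case and control totals are taken from the abstract above). It is a generic illustration, not XPAT's own, more elaborate statistics.

      # Minimal gene-level burden test with a 2x2 table and Fisher's exact test.
      from scipy.stats import fisher_exact

      carriers_cases, cases = 18, 783            # carrier count is hypothetical
      carriers_controls, controls = 25, 3507     # carrier count is hypothetical

      table = [[carriers_cases, cases - carriers_cases],
               [carriers_controls, controls - carriers_controls]]
      odds_ratio, p_value = fisher_exact(table, alternative="greater")
      print(f"OR={odds_ratio:.2f}  p={p_value:.2e}")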

  17. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture systems data. However, few software packages can visualize and modify the entirety of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. A Powerful Toolkit for Synthetic Biology: Over 3.8 Billion Years of Evolution

    Science.gov (United States)

    Rothschild, Lynn J.

    2010-01-01

    The combination of evolutionary with engineering principles will enhance synthetic biology. Conversely, synthetic biology has the potential to enrich evolutionary biology by explaining why some adaptive space is empty, on Earth or elsewhere. Synthetic biology, the design and construction of artificial biological systems, substitutes bio-engineering for evolution, which is seen as an obstacle. But because evolution has produced the complexity and diversity of life, it provides a proven toolkit of genetic materials and principles available to synthetic biology. Evolution operates on the population level, with the populations composed of unique individuals that are historical entities. The source of genetic novelty includes mutation, gene regulation, sex, symbiosis, and interspecies gene transfer. At a phenotypic level, variation derives from regulatory control, replication and diversification of components, compartmentalization, sexual selection and speciation, among others. Variation is limited by physical constraints such as diffusion, and chemical constraints such as reaction rates and membrane fluidity. While some of these tools of evolution are currently in use in synthetic biology, all ought to be examined for utility. A hybrid approach of synthetic biology coupled with fine-tuning through evolution is suggested

  19. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mahdipour, Seyed Ali [Physics Department, Hakim Sabzevari University, Sabzevar (Iran, Islamic Republic of); Mowlavi, Ali Asghar, E-mail: amowlavi@hsu.ac.ir [Physics Department, Hakim Sabzevari University, Sabzevar (Iran, Islamic Republic of); ICTP, Associate Federation Scheme, Medical Physics Field, Trieste (Italy)

    2016-07-01

    Radiotherapy with ion beams like proton and carbon has been used for treatment of eye uveal melanoma for many years. In this research, we have developed a new phantom of human eye for Monte Carlo simulation of tumors treatment to use in GEANT4 toolkit. Total depth−dose profiles for the proton, alpha, and carbon incident beams with the same ranges have been calculated in the phantom. Moreover, the deposited energy of the secondary particles for each of the primary beams is calculated. The dose curves are compared for 47.8 MeV proton, 190.1 MeV alpha, and 1060 MeV carbon ions that have the same range in the target region reaching to the center of tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam as well as the flux curves of the secondary particles including neutron, gamma, and positron has been calculated and compared for the primary beams. The high sharpness of carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadrontherapy but it has disadvantages of dose leakage in the tail after its Bragg peak and high intensity of neutron production. However, proton beam, which has a good conformation with tumor shape owing to the beam broadening caused by scattering, can be a good choice for the large-size tumors.
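
    For orientation, the quoted 47.8 MeV proton energy can be cross-checked with the Bragg-Kleeman range rule R = a*E^p, using commonly quoted water parameters (a of about 0.0022 cm per MeV^p and p of about 1.77). The result, roughly 2 cm, is of the order of the eye's axial length, consistent with a beam that stops near the centre of an intraocular tumour; this is an analytic approximation, not a Geant4 result.

      # Approximate proton range in water from the Bragg-Kleeman rule (not a Geant4 result).
      a, p = 0.0022, 1.77        # commonly quoted fit parameters for water
      energy_mev = 47.8
      print(f"approximate range: {a * energy_mev ** p:.2f} cm")   # about 2.1 cm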

  20. YT: A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Turk, Matthew J.; /San Diego, CASS; Smith, Britton D.; /Michigan State U.; Oishi, Jeffrey S.; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Skory, Stephen; Skillman, Samuel W.; /Colorado U., CASA; Abel, Tom; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Norman, Michael L.; /San Diego, CASS

    2011-06-23

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structured adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.
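
    A typical interactive session with yt follows the pattern sketched below: load a dataset and project a physically meaningful field along an axis. The dataset path is a placeholder, and the call names follow yt's documented public interface as best recalled here, so they should be checked against the documentation for the yt version in use.

      # Load a simulation output and save a projection of gas density along the z axis.
      import yt

      ds = yt.load("DD0043/data0043")                        # placeholder dataset path
      plot = yt.ProjectionPlot(ds, "z", ("gas", "density"))
      plot.save("density_projection.png")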

  1. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Directory of Open Access Journals (Sweden)

    Schuster André

    2012-01-01

    Full Text Available Abstract Background The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. Results Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. Conclusions Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  2. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy eduardo.eyras@upf.edu Supplementary data are available at Bioinformatics online.

  3. A configuration space toolkit for automated spatial reasoning: Technical results and LDRD project final report

    Energy Technology Data Exchange (ETDEWEB)

    Xavier, P.G.; LaFarge, R.A.

    1997-02-01

    A robot's configuration space (c-space) is the space of its kinematic degrees of freedom, e.g., the joint-space of an arm. Sets in c-space can be defined that characterize a variety of spatial relationships, such as contact between the robot and its environment. C-space techniques have been fundamental to research progress in areas such as motion planning and physically-based reasoning. However, practical progress has been slowed by the difficulty of implementing the c-space abstraction inside each application. For this reason, we proposed a Configuration Space Toolkit of high-performance algorithms and data structures meeting these needs. Our intent was to develop this robotics software to provide enabling technology to emerging applications that apply the c-space abstraction, such as advanced motion planning, teleoperation supervision, mechanism functional analysis, and design tools. This final report presents the research results and technical achievements of this LDRD project. Key results and achievements included (1) a hybrid Common LISP/C prototype that implements the basic C-Space abstraction, (2) a new, generic, algorithm for constructing hierarchical geometric representations, and (3) a C++ implementation of an algorithm for fast distance computation, interference detection, and c-space point-classification. Since the project conclusion, motion planning researchers in Sandia's Intelligent Systems and Robotics Center have been using the CSTk libcstk.so C++ library. The code continues to be used, supported, and improved by projects in the ISRC.

  4. Book Review: Mac OS X, iPod, and iPhone Forensic Analysis DVD Toolkit

    Directory of Open Access Journals (Sweden)

    Gary Kessler

    2008-12-01

    Full Text Available Varsalone, J. (Tech. Ed.), Kubasiak, R.R., Morrissey, S., et al. (2009). Mac OS X, iPod, and iPhone Forensic Analysis DVD Toolkit. Burlington, MA: Syngress. 551 + xix pages, ISBN: 978-1-59749-297-3, US$59.95. Reviewed by Gary C. Kessler (gary.kessler@champlain.edu). At last! A quality book about computer forensics for Apple products! Alas, I get ahead of myself. Apple's hold on the personal computer marketplace started dwindling on August 12, 1981, the day that the IBM PC was introduced. As an Apple ][+ bigot myself, I refused to touch a PC for some years. But I was also a command line bigot, so when the first Macintosh was introduced in 1983 and hermetically sealed the operating system from users, I did not go out and buy one. In fact, like many of my era, I did eventually end up on the PC side which, ironically, let me do many of the things that my trusty Apple ][+ had in earlier times -- write code, play with the hardware, and, indeed, get to a command line. And, of course, tons of application developers flocked to the PC because of its open architecture. (see PDF for full review)

  5. Open Plot Project: an open-source toolkit for 3-D structural data analysis

    Directory of Open Access Journals (Sweden)

    S. Tavani

    2011-05-01

    Full Text Available In this work we present the Open Plot Project, an open-source software package for structural data analysis, including a 3-D environment. The software includes many classical functionalities of structural data analysis tools, like stereoplot, contouring, tensorial regression, scatterplots, histograms and transect analysis. In addition, efficient filtering tools are present, allowing the selection of data according to their attributes, including spatial distribution and orientation. This first alpha release represents a stand-alone toolkit for structural data analysis.

    The presence of a 3-D environment with digitalising tools allows the integration of structural data with information extracted from georeferenced images to produce structurally validated dip domains. This, coupled with many import/export facilities, allows easy incorporation of structural analyses in workflows for 3-D geological modelling. Accordingly, the Open Plot Project is also a candidate structural add-on for 3-D geological modelling software.

    The software (for both Windows and Linux O.S.), the User Manual, a set of example movies (complementary to the User Manual), and the source code are provided as a Supplement. We intend the publication of the source code to set the foundation for free, public software that, hopefully, the structural geologists' community will use, modify, and implement. The creation of additional public controls/tools is strongly encouraged.

  6. WEB APPLICATION TO MANAGE DOCUMENTS USING THE GOOGLE WEB TOOLKIT AND APP ENGINE TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Velázquez Santana Eugenio César

    2017-12-01

    Full Text Available The application of new information technologies such as the Google Web Toolkit and App Engine is making a difference in the academic management of Higher Education Institutions (HEIs), which seek to streamline their processes as well as reduce infrastructure costs. However, HEIs encounter problems with regard to acquisition costs, the infrastructure necessary for their use, and the maintenance of the software; it is for this reason that the present research aims to describe the application of these new technologies in HEIs, as well as to identify their advantages and disadvantages and the key success factors in their implementation. SCRUM was used as the software development methodology, together with PMBOK as a project management tool. The main results were related to the application of these technologies in the development of customized software for teachers, students and administrators, as well as the weaknesses and strengths of using them in the cloud. On the other hand, it was also possible to describe the paradigm shift that data warehouses are generating with respect to today's relational databases.

  7. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  8. SLX4 Assembles a Telomere Maintenance Toolkit by Bridging Multiple Endonucleases with Telomeres

    Directory of Open Access Journals (Sweden)

    Bingbing Wan

    2013-09-01

    Full Text Available SLX4 interacts with several endonucleases to resolve structural barriers in DNA metabolism. SLX4 also interacts with telomeric protein TRF2 in human cells. The molecular mechanism of these interactions at telomeres remains unknown. Here, we report the crystal structure of the TRF2-binding motif of SLX4 (SLX4TBM) in complex with the TRFH domain of TRF2 (TRF2TRFH) and map the interactions of SLX4 with endonucleases SLX1, XPF, and MUS81. TRF2 recognizes a unique HxLxP motif on SLX4 via the peptide-binding site in its TRFH domain. Telomeric localization of SLX4 and associated nucleases depend on the SLX4-endonuclease and SLX4-TRF2 interactions and the protein levels of SLX4 and TRF2. SLX4 assembles an endonuclease toolkit that negatively regulates telomere length via SLX1-catalyzed nucleolytic resolution of telomere DNA structures. We propose that the SLX4-TRF2 complex serves as a double-layer scaffold bridging multiple endonucleases with telomeres for recombination-based telomere maintenance.

  9. Roofline model toolkit: A practical tool for architectural and program analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Yu Jung [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ligocki, Terry J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cordery, Matthew J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wright, Nicholas J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hall, Mary W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-18

    We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented microbenchmarks implemented with the Message Passing Interface (MPI) and OpenMP to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
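
    For readers unfamiliar with the model the toolkit characterizes, the Roofline bound is simply the minimum of peak compute throughput and the product of arithmetic intensity and peak memory bandwidth; the sketch below uses made-up machine numbers, not values measured by the toolkit.

    ```python
    # Sketch of the Roofline performance bound; peak values are illustrative, not measured.
    def roofline_gflops(arithmetic_intensity, peak_gflops=200.0, peak_gbps=50.0):
        """Attainable GFLOP/s for a kernel with the given FLOP/byte ratio."""
        return min(peak_gflops, arithmetic_intensity * peak_gbps)

    for ai in (0.25, 1.0, 4.0, 16.0):
        print(f"AI = {ai:5.2f} FLOP/byte -> bound = {roofline_gflops(ai):7.1f} GFLOP/s")
    ```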

  10. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    Directory of Open Access Journals (Sweden)

    Tim Robertson

    Full Text Available The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community, how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT, and future plans for more enhancements.

  11. In silico toxicology models and databases as FDA Critical Path Initiative toolkits

    Directory of Open Access Journals (Sweden)

    Valerio Luis G

    2011-03-01

    Full Text Available Abstract In silico toxicology methods are practical, evidence-based and high throughput, with varying accuracy. In silico approaches are of keen interest, not only to scientists in the private sector and to academic researchers worldwide, but also to the public. They are being increasingly evaluated and applied by regulators. Although there are foreseeable beneficial aspects -- including maximising use of prior test data and the potential for minimising animal use for future toxicity testing -- the primary use of in silico toxicology methods in the pharmaceutical sciences is as decision support information. It is possible for in silico toxicology methods to complement and strengthen the evidence for certain regulatory review processes, and to enhance risk management by supporting a more informed decision regarding priority setting for additional toxicological testing in research and product development. There are also several challenges with these continually evolving methods which clearly must be considered. This mini-review describes in silico methods that have been researched as Critical Path Initiative toolkits for predicting toxicities early in drug development based on prior knowledge derived from preclinical and clinical data at the US Food and Drug Administration, Center for Drug Evaluation and Research.

  12. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    Science.gov (United States)

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community, how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT, and future plans for more enhancements.
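
    A Darwin Core Archive of the kind the IPT publishes is a zip file containing a meta.xml descriptor and delimited data files; the sketch below, which is not part of the IPT and assumes a hypothetical archive with a tab-delimited, header-bearing core file, shows the sort of downstream consumption such archives enable.

    ```python
    # Sketch: reading the core data file of a Darwin Core Archive (hypothetical file names).
    import csv
    import io
    import zipfile
    import xml.etree.ElementTree as ET

    with zipfile.ZipFile("dwca-occurrences.zip") as dwca:   # illustrative archive name
        meta = ET.fromstring(dwca.read("meta.xml"))
        ns = {"dwc": "http://rs.tdwg.org/dwc/text/"}
        core = meta.find("dwc:core", ns)
        core_file = core.find("dwc:files/dwc:location", ns).text

        # Assumes a tab-delimited core file whose first row is a header.
        with dwca.open(core_file) as fh:
            reader = csv.DictReader(io.TextIOWrapper(fh, "utf-8"), delimiter="\t")
            for i, record in enumerate(reader):
                print(record.get("scientificName"), record.get("decimalLatitude"))
                if i == 4:   # show only the first few records
                    break
    ```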

  13. Development and application of a phylogenomic toolkit: Resolving the evolutionary history of Madagascar’s lemurs

    Science.gov (United States)

    Horvath, Julie E.; Weisrock, David W.; Embry, Stephanie L.; Fiorentino, Isabella; Balhoff, James P.; Kappeler, Peter; Wray, Gregory A.; Willard, Huntington F.; Yoder, Anne D.

    2008-01-01

    Lemurs and the other strepsirrhine primates are of great interest to the primate genomics community due to their phylogenetic placement as the sister lineage to all other primates. Previous attempts to resolve the phylogeny of lemurs employed limited mitochondrial or small nuclear data sets, with many relationships poorly supported or entirely unresolved. We used genomic resources to develop 11 novel markers from nine chromosomes, representing ∼9 kb of nuclear sequence data. In combination with previously published nuclear and mitochondrial loci, this yields a data set of more than 16 kb and adds ∼275 kb of DNA sequence to current databases. Our phylogenetic analyses confirm hypotheses of lemuriform monophyly and provide robust resolution of the phylogenetic relationships among the five lemuriform families. We verify that the genus Daubentonia is the sister lineage to all other lemurs. The Cheirogaleidae and Lepilemuridae are sister taxa and together form the sister lineage to the Indriidae; this clade is the sister lineage to the Lemuridae. Divergence time estimates indicate that lemurs are an ancient group, with their initial diversification occurring around the Cretaceous-Tertiary boundary. Given the power of this data set to resolve branches in a notoriously problematic area of primate phylogeny, we anticipate that our phylogenomic toolkit will be of value to other studies of primate phylogeny and diversification. Moreover, the methods applied will be broadly applicable to other taxonomic groups where phylogenetic relationships have been notoriously difficult to resolve. PMID:18245770

  14. A Multipurpose Toolkit to Enable Advanced Genome Engineering in Plants

    Science.gov (United States)

    Gil-Humanes, Javier; Čegan, Radim; Kono, Thomas J.Y.; Konečná, Eva; Belanto, Joseph J.; Starker, Colby G.

    2017-01-01

    We report a comprehensive toolkit that enables targeted, specific modification of monocot and dicot genomes using a variety of genome engineering approaches. Our reagents, based on transcription activator-like effector nucleases (TALENs) and the clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 system, are systematized for fast, modular cloning and accommodate diverse regulatory sequences to drive reagent expression. Vectors are optimized to create either single or multiple gene knockouts and large chromosomal deletions. Moreover, integration of geminivirus-based vectors enables precise gene editing through homologous recombination. Regulation of transcription is also possible. A Web-based tool streamlines vector selection and construction. One advantage of our platform is the use of the Csy-type (CRISPR system yersinia) ribonuclease 4 (Csy4) and tRNA processing enzymes to simultaneously express multiple guide RNAs (gRNAs). For example, we demonstrate targeted deletions in up to six genes by expressing 12 gRNAs from a single transcript. Csy4 and tRNA expression systems are almost twice as effective in inducing mutations as gRNAs expressed from individual RNA polymerase III promoters. Mutagenesis can be further enhanced 2.5-fold by incorporating the Trex2 exonuclease. Finally, we demonstrate that Cas9 nickases induce gene targeting at frequencies comparable to native Cas9 when they are delivered on geminivirus replicons. The reagents have been successfully validated in tomato (Solanum lycopersicum), tobacco (Nicotiana tabacum), Medicago truncatula, wheat (Triticum aestivum), and barley (Hordeum vulgare). PMID:28522548

  15. Promoting palliative care in the community: production of the primary palliative care toolkit by the European Association of Palliative Care Taskforce in primary palliative care.

    Science.gov (United States)

    Murray, Scott A; Firth, Adam; Schneider, Nils; Van den Eynden, Bart; Gomez-Batiste, Xavier; Brogaard, Trine; Villanueva, Tiago; Abela, Jurgen; Eychmuller, Steffen; Mitchell, Geoffrey; Downing, Julia; Sallnow, Libby; van Rijswijk, Erik; Barnard, Alan; Lynch, Marie; Fogen, Frederic; Moine, Sébastien

    2015-02-01

    A multidisciplinary European Association of Palliative Care Taskforce was established to scope the extent of and learn what facilitates and hinders the development of palliative care in the community across Europe. To document the barriers and facilitators for palliative care in the community and to produce a resource toolkit that palliative care specialists, primary care health professionals or policymakers, service developers, educationalists and national groups more generally could use to facilitate the development of palliative care in their own country. (1) A survey instrument was sent to general practitioners with knowledge of palliative care services in the community in a diverse sample of European countries. We also conducted an international systematic review of tools used to identify people for palliative care in the community. (2) A draft toolkit was then constructed suggesting how individual countries might best address these issues, and an online survey was then set up for general practitioners and specialists to make comments. Iterations of the toolkit were then presented at international palliative care and primary care conferences. Being unable to identify appropriate patients for palliative care in the community was a major barrier internationally. The systematic review identified tools that might be used to help address this. Various facilitators such as national strategies were identified. A primary palliative care toolkit has been produced and refined, together with associated guidance. Many barriers and facilitators were identified. The primary palliative care toolkit can help community-based palliative care services to be established nationally. © The Author(s) 2014.

  16. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    Science.gov (United States)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience so far gained, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another feature of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems actually used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a Gafchromic film as dosimeters to reconstruct the depth (Bragg peak and Spread Out Bragg Peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.
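
    As a back-of-the-envelope check on the 62 MeV beam energy quoted above, the empirical Bragg-Kleeman rule R ≈ αE^p (with α ≈ 0.0022 cm MeV^-p and p ≈ 1.77 for protons in water, approximate values) places the Bragg peak at roughly 3 cm depth, consistent with ocular treatment depths; the snippet below encodes only this rule of thumb and is not a GEANT4 calculation.

    ```python
    # Bragg-Kleeman rule of thumb for proton range in water (approximate constants).
    ALPHA = 0.0022   # cm / MeV**p, approximate value for water
    P = 1.77         # dimensionless exponent, approximate value for water

    def csda_range_cm(energy_mev: float) -> float:
        """Approximate continuous-slowing-down range of a proton in water."""
        return ALPHA * energy_mev ** P

    print(f"~{csda_range_cm(62.0):.1f} cm for a 62 MeV proton")  # roughly 3 cm
    ```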

  17. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    Energy Technology Data Exchange (ETDEWEB)

    Cirrone, G.A.P. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Cuttone, G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Di Rosa, F. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Raffaele, L. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Russo, G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Via S. Sofia 62, Catania (Italy); Guatelli, S. [Istituto Nazionale di Fisica Nucleare, Sezione di Genova, Via Dodecaneso 33, Genova (Italy); Pia, M.G. [Istituto Nazionale di Fisica Nucleare, Sezione di Genova, Via Dodecaneso 33, Genova (Italy)

    2006-01-15

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience so far gained, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another feature of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems actually used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a Gafchromic film as dosimeters to reconstruct the depth (Bragg peak and Spread Out Bragg Peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.

  18. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    Science.gov (United States)

    2011-01-01

    Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation.

  19. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    Directory of Open Access Journals (Sweden)

    Cieślik Marcin

    2011-02-01

    Full Text Available Abstract Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation.

  20. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines.

    Science.gov (United States)

    Cieślik, Marcin; Mura, Cameron

    2011-02-25

    Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples.
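
    The dataflow idea described above, re-usable components joined by data-pipes and evaluated over a pool of workers in adjustable batches, can be illustrated without reproducing PaPy's own API (which is not shown here); the sketch below uses only the standard library's multiprocessing pool as a stand-in.

    ```python
    # Generic dataflow-style pipeline sketch; this is NOT the PaPy API, just the idea.
    from multiprocessing import Pool

    def parse(record: str) -> str:
        """First stage: normalise a raw input record."""
        return record.strip().upper()

    def reverse_complement(seq: str) -> str:
        """Second stage: a toy sequence transformation."""
        table = str.maketrans("ACGT", "TGCA")
        return seq.translate(table)[::-1]

    def pipeline(record: str) -> str:
        # Stages composed in data-flow order; in PaPy these would be nodes in a DAG.
        return reverse_complement(parse(record))

    if __name__ == "__main__":
        raw = ["acgtacgt\n", "ttgcaa\n", "gattaca\n"]
        with Pool(processes=2) as pool:
            # Items are dispatched to the worker pool in adjustable-size batches (chunksize).
            for result in pool.imap(pipeline, raw, chunksize=2):
                print(result)
    ```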

  1. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    Science.gov (United States)

    Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon

    2017-02-01

    A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
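
    ASPASIA itself is a Java toolkit; as a hedged illustration of the kind of parameter perturbation it automates across generated SBML variants, the fragment below uses the python-libsbml bindings (assumed to be installed), with hypothetical model and parameter identifiers.

    ```python
    # Sketch of perturbing one parameter of an SBML model with python-libsbml.
    # This illustrates the kind of edit ASPASIA automates; it is not ASPASIA's API.
    import libsbml

    doc = libsbml.readSBMLFromFile("th17_model.xml")   # hypothetical model file
    model = doc.getModel()

    param = model.getParameter("k_tbet")               # hypothetical parameter id
    if param is not None:
        param.setValue(param.getValue() * 1.5)         # +50% perturbation

    libsbml.writeSBMLToFile(doc, "th17_model_perturbed.xml")
    ```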

  2. The e-Health Implementation Toolkit: Qualitative evaluation across four European countries

    LENUS (Irish Health Repository)

    MacFarlane, Anne

    2011-11-19

    Abstract Background Implementation researchers have attempted to overcome the research-practice gap in e-health by developing tools that summarize and synthesize research evidence of factors that impede or facilitate implementation of innovation in healthcare settings. The e-Health Implementation Toolkit (e-HIT) is an example of such a tool that was designed within the context of the United Kingdom National Health Service to promote implementation of e-health services. Its utility in international settings is unknown. Methods We conducted a qualitative evaluation of the e-HIT in use across four countries--Finland, Norway, Scotland, and Sweden. Data were generated using a combination of interview approaches (n = 22) to document e-HIT users' experiences of the tool to guide decision making about the selection of e-health pilot services and to monitor their progress over time. Results e-HIT users evaluated the tool positively in terms of its scope to organize and enhance their critical thinking about their implementation work and, importantly, to facilitate discussion between those involved in that work. It was easy to use in either its paper- or web-based format, and its visual elements were positively received. There were some minor criticisms of the e-HIT with some suggestions for content changes and comments about its design as a generic tool (rather than specific to sites and e-health services). However, overall, e-HIT users considered it to be a highly workable tool that they found useful, which they would use again, and which they would recommend to other e-health implementers. Conclusion The use of the e-HIT is feasible and acceptable in a range of international contexts by a range of professionals for a range of different e-health systems.

  3. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    Science.gov (United States)

    Jedlovec, G.; Molthan, A.; White, K.; Burks, J.; Stellman, K.; Smith, M. R.

    2012-12-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports, nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters.
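
    To make concrete how a GPS-tagged survey point might travel into a GIS, the sketch below writes a single damage report as a GeoJSON feature; it is not NWS or SDAT code, and the attribute names are invented for illustration.

    ```python
    # Sketch: one GPS-tagged damage report as a GeoJSON Feature (illustrative fields only).
    import json

    damage_report = {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [-86.5861, 34.7304]},  # lon, lat
        "properties": {
            "ef_rating": "EF2",              # hypothetical survey attributes
            "damage_indicator": "FR12",
            "photo": "survey_0031.jpg",
            "surveyed_at": "2012-03-02T18:45:00Z",
        },
    }

    with open("damage_point.geojson", "w") as fh:
        json.dump({"type": "FeatureCollection", "features": [damage_report]}, fh, indent=2)
    ```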

  4. The e-health implementation toolkit: qualitative evaluation across four European countries

    Directory of Open Access Journals (Sweden)

    MacFarlane Anne

    2011-11-01

    Full Text Available Abstract Background Implementation researchers have attempted to overcome the research-practice gap in e-health by developing tools that summarize and synthesize research evidence of factors that impede or facilitate implementation of innovation in healthcare settings. The e-Health Implementation Toolkit (e-HIT) is an example of such a tool that was designed within the context of the United Kingdom National Health Service to promote implementation of e-health services. Its utility in international settings is unknown. Methods We conducted a qualitative evaluation of the e-HIT in use across four countries--Finland, Norway, Scotland, and Sweden. Data were generated using a combination of interview approaches (n = 22) to document e-HIT users' experiences of the tool to guide decision making about the selection of e-health pilot services and to monitor their progress over time. Results e-HIT users evaluated the tool positively in terms of its scope to organize and enhance their critical thinking about their implementation work and, importantly, to facilitate discussion between those involved in that work. It was easy to use in either its paper- or web-based format, and its visual elements were positively received. There were some minor criticisms of the e-HIT with some suggestions for content changes and comments about its design as a generic tool (rather than specific to sites and e-health services). However, overall, e-HIT users considered it to be a highly workable tool that they found useful, which they would use again, and which they would recommend to other e-health implementers. Conclusion The use of the e-HIT is feasible and acceptable in a range of international contexts by a range of professionals for a range of different e-health systems.

  5. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    Directory of Open Access Journals (Sweden)

    Stephanie Evans

    2017-02-01

    Full Text Available A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.

  6. The GEANT4 toolkit for microdosimetry calculations: application to microbeam radiation therapy (MRT).

    Science.gov (United States)

    Spiga, J; Siegbahn, E A; Bräuer-Krisch, E; Randaccio, P; Bravin, A

    2007-11-01

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damages to the vasculature, and on the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the "valley" dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost
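
    The peak-to-valley dose ratio discussed above is just the ratio of the dose at a microbeam centre to the dose midway between adjacent microbeams; the numpy sketch below uses a synthetic lateral dose profile standing in for Monte Carlo output.

    ```python
    # Sketch: peak-to-valley dose ratio (PVDR) from a lateral dose profile.
    # The profile here is synthetic; in practice it would come from the MC simulation.
    import numpy as np

    x = np.linspace(-400.0, 400.0, 1601)            # lateral position in micrometres
    pitch, width = 200.0, 25.0                      # illustrative microbeam geometry

    # Synthetic profile: Gaussian-like peaks on a flat scattered-dose background.
    dose = 0.05 + sum(np.exp(-((x - c) / (width / 2)) ** 2) for c in (-200.0, 0.0, 200.0))

    peak_dose = dose[np.argmin(np.abs(x))]                  # dose at a beam centre
    valley_dose = dose[np.argmin(np.abs(x - pitch / 2))]    # dose midway between beams
    print(f"PVDR ~ {peak_dose / valley_dose:.1f}")
    ```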

  7. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The RISMC approach is developing an advanced set of methodologies and algorithms in order to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes coupled with stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) the timing and sequencing of events and the internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. Applied to complex systems such as nuclear power plants, this approach requires a series of computationally expensive simulation runs over a large set of uncertain parameters. These types of analysis are affected by two issues. Firstly, the space of the possible solutions (a.k.a., the issue space or the response surface) can be sampled only very sparsely, and this precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Secondly, large amounts of data are generated and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we will present how it is possible to perform adaptive sampling using the RISMC toolkit and highlight the advantages compared to more classical sampling approaches such as Monte Carlo. We will employ RAVEN to perform such statistical analyses using both analytical cases and another RISMC code, RELAP-7.
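
    The adaptive-sampling idea, fit a surrogate to the runs already completed and place the next run where the surrogate is least certain, can be sketched generically with scikit-learn's Gaussian process regressor; this illustrates the concept only and is not the RAVEN or RISMC implementation.

    ```python
    # Generic adaptive-sampling sketch (not RAVEN): pick the next sample where the
    # surrogate's predictive uncertainty is largest.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def expensive_code(x):
        """Stand-in for a costly simulator run (e.g., a thermal-hydraulics code)."""
        return np.sin(3.0 * x) + 0.3 * x

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 3.0, size=(5, 1))          # a few initial runs
    y = expensive_code(X).ravel()

    candidates = np.linspace(0.0, 3.0, 301).reshape(-1, 1)
    for _ in range(10):
        gp = GaussianProcessRegressor().fit(X, y)
        _, std = gp.predict(candidates, return_std=True)
        x_next = candidates[np.argmax(std)]          # most uncertain location
        X = np.vstack([X, x_next])
        y = np.append(y, expensive_code(x_next)[0])

    print(f"{len(X)} runs placed adaptively")
    ```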

  8. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Science.gov (United States)

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
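
    Because the ion-binding state graph is written in plain GML, it can also be inspected outside Cytoscape; the snippet below assumes networkx is installed, uses a hypothetical output file name, and assumes edges carry a 'weight' attribute, none of which is prescribed by IBiSA_tools.

    ```python
    # Sketch: inspecting a GML ion-binding state graph with networkx (hypothetical file name).
    import networkx as nx

    g = nx.read_gml("ion_binding_states.gml")        # produced by the toolkit

    print(f"{g.number_of_nodes()} binding states, {g.number_of_edges()} transitions")

    # List the most frequent transitions, assuming edges carry a 'weight' attribute.
    edges = sorted(g.edges(data=True),
                   key=lambda e: e[2].get("weight", 0), reverse=True)
    for u, v, attrs in edges[:5]:
        print(u, "->", v, attrs.get("weight", 0))
    ```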

  9. Improving the alcohol retail environment to reduce youth access: a randomized community trial of a best practices toolkit intervention.

    Science.gov (United States)

    Wolff, Lisa S; El Ayadi, Alison M; Lyons, Nancy J; Herr-Zaya, Kathleen; Noll, Debra; Perfas, Fernando; Rots, Gisela

    2011-06-01

    Underage alcohol use remains a significant public health problem throughout the United States and has important consequences for the health of individuals and communities. The objective of this study was to assess the impact of distributing an alcohol retailer toolkit via direct mail on increasing positive alcohol retailer attitudes towards checking IDs, encouraging retail managers to formalize ID checking procedures with their employees, and promoting consumers to be prepared to show ID when purchasing alcohol. This community randomized study included five matched Massachusetts community pairs. Our analysis sample consisted of 209 retailers (77 intervention; 132 control). In models adjusted for baseline response and matching community and establishment characteristics, intervention communities reported posting, on average, one additional sign or wall decal in their establishments (β = 0.937, P = 0.0069), and a twofold higher odds of handing out written materials on ID checking to staff (OR: 2.074, 95%CI: 1.003-4.288) compared to control establishments. However, the intervention was not found to have an effect on changing establishment policies, retailer attitudes, or other establishment practices. Intervention retailers perceived all components of the toolkit to be very useful for their establishments, and nearly all reported having shared materials with their employees and customers. These results suggest that some significant changes in alcohol retailer establishment practices can be achieved among motivated owners or managers through the distribution of a toolkit targeting best retailer practices. We do, however, recommend that future program planners consider alternative dissemination and marketing strategies beyond direct mail to encourage greater utilization.

  10. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Directory of Open Access Journals (Sweden)

    Kota Kasahara

    Full Text Available Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.

  11. Acceptability and utility of an innovative feeding toolkit to improve maternal and child dietary practices in Bihar, India.

    Science.gov (United States)

    Collison, Deborah Kortso; Kekre, Priya; Verma, Pankaj; Melgen, Sarah; Kram, Nidal; Colton, Jonathan; Blount, Wendy; Girard, Amy Webb

    2015-03-01

    Dietary practices in India often fail to provide adequate nutrition during the first 1,000 days of life. To explore the acceptability and utility of a low-cost and simple-to-use feeding toolkit consisting of a bowl with marks to indicate meal volume and frequency, a slotted spoon, and an illustrated counseling card to cue optimal dietary practices during the first 1,000 days. In Samastipur District, Bihar, India, we conducted 16 focus group discussions and 8 key informant interviews to determine community acceptability and obtain feedback on design and delivery of the feeding toolkit. We conducted 14 days of user testing with 20 pregnant women, 20 breastfeeding women 0 to 6 months postpartum, and 20 mothers with infants 6 to 18 months of age. The toolkit, which is made of plastic, was well accepted by the community, although the communities recommended manufacturing the bowl and spoon in steel. The proportion of pregnant and breast-feeding women taking an extra portion of food per day increased from 0% to 100%, and the number of meals taken per day increased from two or three to three or four. For children 6 to 18 months of age, meal frequency, quantity of food consumed during meals, and thickness of the foods increased for all age groups. Children 6 to 8 months of age who had not yet initiated complementary feeding all initiated complementary feeding during the testing period. Simple feeding tools are culturally acceptable and can be appropriately used by families in Bihar, India, to improve dietary practices during the first 1,000 days of life. Research is needed to assess whether the tools promote dietary and nutritional improvements over and above counseling alone.

  12. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Incerti, S., E-mail: sebastien.incerti@tdt.edu.vn [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Suerfu, B.; Xu, J. [Department of Physics, Princeton University, Princeton, NJ (United States); Ivantchenko, V. [Ecoanalytica, Moscow (Russian Federation); Geant4 Associates International Ltd, Hebden Bridge (United Kingdom); Mantero, A. [SWHARD srl, via Greto di Cornigliano 6r, 16152 Genova (Italy); Brown, J.M.C. [School of Mathematics and Physics, Queen’s University Belfast, Belfast, Northern Ireland (United Kingdom); Bernal, M.A. [Instituto de Física Gleb Wataghin, Universidade Estadual de Campinas, SP (Brazil); Francis, Z. [Université Saint Joseph, Faculty of Sciences, Department of Physics, Beirut (Lebanon); Karamitros, M. [Notre Dame Radiation Laboratory, University of Notre Dame, Notre Dame, IN (United States); Tran, H.N. [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam)

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general-purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NPs) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping from, the NPs are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research, and other low-energy physics fields.
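
    For readers who want to exercise this deexcitation framework themselves, the sketch below writes a minimal Geant4 macro that switches on fluorescence and full Auger cascades through the standard electromagnetic UI commands. The macro contents, file name, and beam settings are illustrative assumptions and do not reproduce the physics-list configuration used in the paper.

    ```python
    # Minimal sketch: generate a Geant4 macro that enables atomic deexcitation
    # with full Auger cascades (assumed standard EM UI commands; the output
    # path and beam settings are illustrative, not taken from the paper).
    from pathlib import Path

    MACRO = """\
    # Atomic deexcitation: fluorescence, Auger electrons and full cascades
    /process/em/fluo true
    /process/em/auger true
    /process/em/augerCascade true
    /process/em/deexcitationIgnoreCut true

    /run/initialize

    # Illustrative primary beam (the paper used keV photons and MeV protons)
    /gun/particle gamma
    /gun/energy 80 keV
    /run/beamOn 100000
    """

    def write_macro(path: str = "auger_cascade.mac") -> Path:
        """Write the macro to disk and return its path."""
        p = Path(path)
        p.write_text(MACRO)
        return p

    if __name__ == "__main__":
        print(f"Wrote {write_macro()}")
    ```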

  13. CONSTRUCTION AND ANALYSIS OF GPS ACCURACY FIELDS ON THE BASIS OF HARDWARE-SOFTWARE MEANS NI GPS SIMULATION TOOLKIT

    Directory of Open Access Journals (Sweden)

    O. N. Skrypnik

    2014-01-01

    Full Text Available Based on the NI GPS Simulation Toolkit hardware-software complex and the CH-4312 aviation receiver, a method is proposed for constructing GPS accuracy fields from the receiver's RAIM data (horizontal and vertical geometric factors). The reliability of the obtained results is assessed by comparison with experimental data, and limitations on the conditions under which the complex can be used to investigate GPS characteristics are identified. Accuracy fields were constructed for two areas of airspace located at mid and high latitudes.
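
    As a rough illustration of the "geometric factor" (dilution of precision) values from which such accuracy fields are built, the hedged sketch below computes HDOP and VDOP from an assumed set of satellite line-of-sight unit vectors in a local east-north-up frame. It is generic textbook geometry, not the NI GPS Simulation Toolkit or CH-4312 processing chain, and the example constellation is invented.

    ```python
    import numpy as np

    def dops(unit_vectors_enu: np.ndarray) -> dict:
        """Compute DOP values from satellite line-of-sight unit vectors.

        unit_vectors_enu: (n, 3) array of unit vectors from receiver to each
        satellite in a local east-north-up frame (n >= 4).
        """
        n = unit_vectors_enu.shape[0]
        # Geometry matrix: direction cosines plus a clock-bias column.
        g = np.hstack([unit_vectors_enu, np.ones((n, 1))])
        q = np.linalg.inv(g.T @ g)          # covariance shape matrix
        return {
            "HDOP": float(np.sqrt(q[0, 0] + q[1, 1])),
            "VDOP": float(np.sqrt(q[2, 2])),
            "PDOP": float(np.sqrt(q[0, 0] + q[1, 1] + q[2, 2])),
            "GDOP": float(np.sqrt(np.trace(q))),
        }

    if __name__ == "__main__":
        # Illustrative constellation: four satellites at assumed azimuth/elevation.
        az_el_deg = np.array([[30, 60], [140, 30], [230, 45], [320, 20]], float)
        az, el = np.radians(az_el_deg[:, 0]), np.radians(az_el_deg[:, 1])
        los = np.column_stack([np.cos(el) * np.sin(az),   # east
                               np.cos(el) * np.cos(az),   # north
                               np.sin(el)])               # up
        print(dops(los))
    ```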

  14. WING/WORLD: An Open Experimental Toolkit for the Design and Deployment of IEEE 802.11-Based Wireless Mesh Networks Testbeds

    Directory of Open Access Journals (Sweden)

    Daniele Miorandi

    2010-01-01

    Full Text Available Wireless Mesh Networks represent an interesting instance of light-infrastructure wireless networks. Due to their flexibility and resiliency to network failures, wireless mesh networks are particularly suitable for incremental and rapid deployments of wireless access networks in both metropolitan and rural areas. This paper illustrates the design and development of an open toolkit aimed at supporting the design of different solutions for wireless mesh networking by enabling real evaluation, validation, and demonstration. The resulting testbed is based on off-the-shelf hardware components and open-source software and is focused on IEEE 802.11 commodity devices. The software toolkit is based on an “open” philosophy and aims at providing the scientific community with a tool for effective and reproducible performance analysis of WMNs. The paper describes the architecture of the toolkit, and its core functionalities, as well as its potential evolutions.

  15. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    Directory of Open Access Journals (Sweden)

    Jonathan J Helmus

    2016-07-01

    Full Text Available The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. The source code for the toolkit is available on GitHub and is distributed under a BSD license.
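
    A minimal usage sketch is given below; the input file name is a placeholder, and it assumes a radar volume in a format Py-ART's generic reader understands (e.g., Cf/Radial NetCDF) that contains a field named "reflectivity".

    ```python
    # Minimal Py-ART sketch: read a radar volume and plot the lowest sweep of
    # reflectivity. The file name is a placeholder; pyart.io.read dispatches
    # on the file format.
    import matplotlib.pyplot as plt
    import pyart

    radar = pyart.io.read("example_radar_volume.nc")     # placeholder path
    print(radar.fields.keys())                           # available moments

    display = pyart.graph.RadarDisplay(radar)
    fig, ax = plt.subplots(figsize=(6, 5))
    display.plot("reflectivity", sweep=0, ax=ax, vmin=-8, vmax=64)
    display.set_limits(xlim=(-150, 150), ylim=(-150, 150), ax=ax)
    plt.savefig("reflectivity_sweep0.png", dpi=150)
    ```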

  16. Evaluating the Usability and Perceived Impact of an Electronic Medical Record Toolkit for Atrial Fibrillation Management in Primary Care: A Mixed-Methods Study Incorporating Human Factors Design.

    Science.gov (United States)

    Tran, Kim; Leblanc, Kori; Valentinis, Alissia; Kavanagh, Doug; Zahr, Nina; Ivers, Noah M

    2016-02-17

    Atrial fibrillation (AF) is a common and preventable cause of stroke. Barriers to reducing stroke risk through appropriate prescribing have been identified at the system, provider, and patient levels. To ensure a multifaceted initiative to address these barriers is effective, it is essential to incorporate user-centered design to ensure all intervention components are optimized for users. To test the usability of an electronic medical record (EMR) toolkit for AF in primary care with the goal of further refining the intervention to meet the needs of primary care clinicians. An EMR-based toolkit for AF was created and optimized through usability testing and iterative redesign incorporating a human factors approach. A mixed-methods pilot study consisting of observations, semi-structured interviews, and surveys was conducted to examine usability and perceived impact on patient care and workflow. A total of 14 clinicians (13 family physicians and 1 nurse practitioner) participated in the study. Nine iterations of the toolkit were created in response to feedback from clinicians and the research team; interface-related changes were made, additional AF-related resources were added, and functionality issues were fixed to make the toolkit more effective. After improvements were made, clinicians expressed that the toolkit improved accessibility to AF-related information and resources, served as a reminder for guideline-concordant AF management, and was easy to use. Most clinicians intended to continue using the toolkit for patient care. With respect to impact on care, clinicians believed the toolkit increased the thoroughness of their assessments for patients with AF and improved the quality of AF-related care received by their patients. The positive feedback surrounding the EMR toolkit for AF and its perceived impact on patient care can be attributed to the adoption of a user-centered design that merged clinically important information about AF management with user needs

  17. La incineració funerària a Sardenya durant l'Edat de Bronze (Funerary cremation in Sardinia during the Bronze Age)

    OpenAIRE

    Malgosa Morera, Assumpció

    2008-01-01

    The latest archaeological studies indicated that, until the arrival of Phoenician culture in the Western Mediterranean (9th century BC), inhumation appeared to be the only funerary practice in prehistoric Sardinia. That history may now have to be rewritten as a result of a recent excavation carried out at Tomb IX of Sa Figu, near the Italian city of Sassari. A physico-chemical analysis of the combustion of the bones found at the necropolis has shown that fire was indeed used during the...

  18. Adaptacions neuromusculars durant l'entrenament de força en homes de diferents edats (Neuromuscular adaptations during strength training in men of different ages)

    OpenAIRE

    Izquierdo, Mikel; Aguado, Xavier

    1999-01-01

    Progressive maximal strength training combined with explosive-type exercises induces an increase in maximal (isometric/dynamic) strength and is also accompanied by considerable gains in the explosive force of the trained muscles, not only in 40-year-old individuals but also in those aged 70. The increase in maximal strength is only partly explained by the increase in muscle cross-sectional area (CSA), since the voluntary activation of the agonist muscles increases to a greater degree in...

  19. ORBiT: Oak Ridge Bio-surveillance Toolkit for Public Health Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Arvind [ORNL]; Pullum, Laura L [ORNL]; Hobson, Tanner C [ORNL]; Steed, Chad A [ORNL]; Chennubhotla, Chakra [University of Pittsburgh School of Medicine]; Quinn, Shannon [University of Pittsburgh School of Medicine]

    2015-01-01

    With novel emerging infectious diseases being reported across different parts of the world, there is a need to build effective bio-surveillance systems that can track, monitor and report such events in a timely manner. Apart from monitoring for emerging disease outbreaks, it is also important to identify susceptible geographic regions and populations where these diseases may have a significant impact. The digitization of health related information through electronic health records (EHR) and electronic healthcare claim reimbursements (eHCR) and the continued growth of self-reported health information through social media provides both tremendous opportunities and challenges in developing novel public health surveillance tools. In this paper, we present an overview of Oak Ridge Bio-surveillance Toolkit (ORBiT), which we have developed specifically to address data analytic challenges in the realm of public health surveillance. In particular, ORBiT provides an extensible environment to pull together diverse, large-scale datasets and analyze them to identify spatial and temporal patterns for various bio-surveillance related tasks. We demonstrate the utility of ORBiT in automatically extracting a small number of spatial and temporal patterns during the 2009-2010 pandemic H1N1 flu season using eHCR data. These patterns provide quantitative insights into the dynamics of how the pandemic flu spread across different parts of the country. We discovered that the eHCR data exhibits multi-scale patterns from which we could identify a small number of states in the United States (US) that act as bridge regions contributing to one or more specific influenza spread patterns. Similar to previous studies, the patterns show that the south-eastern regions of the US were widely affected by the H1N1 flu pandemic. Several of these south-eastern states act as bridge regions, which connect the north-east and central US in terms of flu occurrences. These quantitative insights show how the e
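
    To make the idea of "a small number of spatial and temporal patterns" concrete, the sketch below factorizes a hypothetical state-by-week case-count matrix with non-negative matrix factorization. The synthetic data and the choice of NMF are assumptions for exposition only; they do not reproduce the ORBiT pipeline or the eHCR dataset.

    ```python
    # Generic illustration: extract a few spatio-temporal patterns from a
    # state-by-week count matrix using non-negative matrix factorization.
    # The synthetic data and the use of NMF are assumptions for exposition only.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    n_states, n_weeks, n_patterns = 50, 30, 3

    counts = rng.poisson(lam=20, size=(n_states, n_weeks)).astype(float)

    model = NMF(n_components=n_patterns, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(counts)   # (states x patterns): spatial loadings
    H = model.components_             # (patterns x weeks): temporal profiles

    for k in range(n_patterns):
        top_states = np.argsort(W[:, k])[::-1][:5]
        peak_week = int(np.argmax(H[k]))
        print(f"pattern {k}: top state indices {top_states.tolist()}, peak week {peak_week}")
    ```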

  20. ORBiT: Oak Ridge biosurveillance toolkit for public health dynamics.

    Science.gov (United States)

    Ramanathan, Arvind; Pullum, Laura L; Hobson, Tanner C; Steed, Chad A; Quinn, Shannon P; Chennubhotla, Chakra S; Valkova, Silvia

    2015-01-01

    The digitization of health-related information through electronic health records (EHR) and electronic healthcare reimbursement claims and the continued growth of self-reported health information through social media provides both tremendous opportunities and challenges in developing effective biosurveillance tools. With novel emerging infectious diseases being reported across different parts of the world, there is a need to build systems that can track, monitor and report such events in a timely manner. Further, it is also important to identify susceptible geographic regions and populations where emerging diseases may have a significant impact. In this paper, we present an overview of Oak Ridge Biosurveillance Toolkit (ORBiT), which we have developed specifically to address data analytic challenges in the realm of public health surveillance. In particular, ORBiT provides an extensible environment to pull together diverse, large-scale datasets and analyze them to identify spatial and temporal patterns for various biosurveillance-related tasks. We demonstrate the utility of ORBiT in automatically extracting a small number of spatial and temporal patterns during the 2009-2010 pandemic H1N1 flu season using claims data. These patterns provide quantitative insights into the dynamics of how the pandemic flu spread across different parts of the country. We discovered that the claims data exhibits multi-scale patterns from which we could identify a small number of states in the United States (US) that act as "bridge regions" contributing to one or more specific influenza spread patterns. Similar to previous studies, the patterns show that the south-eastern regions of the US were widely affected by the H1N1 flu pandemic. Several of these south-eastern states act as bridge regions, which connect the north-east and central US in terms of flu occurrences. These quantitative insights show how the claims data combined with novel analytical techniques can provide important

  1. A toolkit for incorporating genetics into mainstream medical services: Learning from service development pilots in England

    Directory of Open Access Journals (Sweden)

    Burton Hilary

    2010-05-01

    Full Text Available Background: As advances in genetics are becoming increasingly relevant to mainstream healthcare, a major challenge is to ensure that these are integrated appropriately into mainstream medical services. In 2003, the Department of Health for England announced the availability of start-up funding for ten 'Mainstreaming Genetics' pilot services to develop models to achieve this. Methods: Multiple methods were used to explore the pilots' experiences of incorporating genetics which might inform the development of new services in the future. A workshop with project staff, an email questionnaire, interviews and a thematic analysis of pilot final reports were carried out. Results: Seven themes relating to the integration of genetics into mainstream medical services were identified: planning services to incorporate genetics; the involvement of genetics departments; the establishment of roles incorporating genetic activities; identifying and involving stakeholders; the challenges of working across specialty boundaries; working with multiple healthcare organisations; and the importance of cultural awareness of genetic conditions. Pilots found that the planning phase often included the need to raise awareness of genetic conditions and services and that early consideration of organisational issues such as clinic location was essential. The formal involvement of genetics departments was crucial to success; benefits included provision of clinical and educational support for staff in new roles. Recruitment and retention for new roles outside usual career pathways sometimes proved difficult. Differences in specialties' working practices and working with multiple healthcare organisations also brought challenges such as the 'genetic approach' of working with families, incompatible record systems and different approaches to health professionals' autonomous practice. 'Practice points' have been collated into a Toolkit which includes resources from the pilots

  2. A toolkit for incorporating genetics into mainstream medical services: Learning from service development pilots in England

    Science.gov (United States)

    2010-01-01

    Background: As advances in genetics are becoming increasingly relevant to mainstream healthcare, a major challenge is to ensure that these are integrated appropriately into mainstream medical services. In 2003, the Department of Health for England announced the availability of start-up funding for ten 'Mainstreaming Genetics' pilot services to develop models to achieve this. Methods: Multiple methods were used to explore the pilots' experiences of incorporating genetics which might inform the development of new services in the future. A workshop with project staff, an email questionnaire, interviews and a thematic analysis of pilot final reports were carried out. Results: Seven themes relating to the integration of genetics into mainstream medical services were identified: planning services to incorporate genetics; the involvement of genetics departments; the establishment of roles incorporating genetic activities; identifying and involving stakeholders; the challenges of working across specialty boundaries; working with multiple healthcare organisations; and the importance of cultural awareness of genetic conditions. Pilots found that the planning phase often included the need to raise awareness of genetic conditions and services and that early consideration of organisational issues such as clinic location was essential. The formal involvement of genetics departments was crucial to success; benefits included provision of clinical and educational support for staff in new roles. Recruitment and retention for new roles outside usual career pathways sometimes proved difficult. Differences in specialties' working practices and working with multiple healthcare organisations also brought challenges such as the 'genetic approach' of working with families, incompatible record systems and different approaches to health professionals' autonomous practice. 'Practice points' have been collated into a Toolkit which includes resources from the pilots, including job descriptions and

  3. Raman spectroscopy based toolkit for mapping bacterial social interactions relevant to human and plant health

    Science.gov (United States)

    Couvillion, Sheha Polisetti

    Bacteria interact and co-exist with other microbes and with higher organisms like plants and humans, playing a major role in their health and well-being. These ubiquitous single-celled organisms are so successful because they can form organized communities, called biofilms, that protect them from environmental stressors and enable communication and cooperation among members of the community. The work described in this thesis develops a toolkit of analytical techniques centered on Raman microspectroscopy and imaging, representing a powerful approach to non-invasively investigate bacterial communities and yielding molecular information at the sub-micrometer length scale. Bacterial cellular components of non-pigmented and pigmented rhizosphere strains are characterized, and regiospecific SERS is used for cases where resonantly enhanced background signals obscure the spectra. Silver nanoparticle colloids were synthesized in situ, in the presence of the cells, to form a proximal coating, and principal component analysis (PCA) revealed features attributed to flavins. SERS enabled in situ acquisition of Raman spectra and chemical images in highly autofluorescent P. aeruginosa biofilms. In combination with PCA, this allowed for non-invasive spatial mapping of bacterial communities and revealed differences between strains and nutrients in the secretion of the virulence factor pyocyanin. The rich potential of using Raman microspectroscopy to study plant-microbe interactions is demonstrated. The effect of oxidative stress on both the wild-type Pantoea sp. YR343 and the carotenoid mutant Delta crtB was assessed by following the intensity of the 1520 cm-1 and 1126 cm-1 Raman bands, respectively, after treatment with various concentrations of H2O2. Significant changes were observed in these marker bands even at concentrations (1 mM) below the point at which the traditional plate-based viability assay shows an effect (5-10 mM), thus establishing the value of Raman
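
    As a schematic of the PCA step described above, the snippet below projects a matrix of spectra onto its first principal components. The synthetic spectra, band positions, and preprocessing are placeholders, not the actual SERS measurements from this work.

    ```python
    # Schematic PCA of Raman spectra: rows are spectra, columns are Raman shifts.
    # The synthetic spectra stand in for baseline-corrected (S)ERS measurements.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    shifts = np.arange(600, 1801, 2.0)              # Raman shift axis, cm-1
    n_spectra = 40

    def band(center, width):
        """Gaussian band profile on the Raman shift axis."""
        return np.exp(-0.5 * ((shifts - center) / width) ** 2)

    # Two hypothetical components (e.g. a band near 1520 cm-1 and one near
    # 1000 cm-1) mixed in random proportions, plus noise.
    conc = rng.uniform(0, 1, size=(n_spectra, 2))
    spectra = conc @ np.vstack([band(1520, 15), band(1000, 8)])
    spectra += 0.02 * rng.standard_normal(spectra.shape)

    pca = PCA(n_components=3)
    scores = pca.fit_transform(spectra)             # per-spectrum scores
    loadings = pca.components_                      # spectral loading vectors

    print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
    print("first-spectrum scores:", np.round(scores[0], 3))
    ```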

  4. A toolkit for incorporating genetics into mainstream medical services: Learning from service development pilots in England.

    Science.gov (United States)

    Bennett, Catherine L; Burke, Sarah E; Burton, Hilary; Farndon, Peter A

    2010-05-14

    As advances in genetics are becoming increasingly relevant to mainstream healthcare, a major challenge is to ensure that these are integrated appropriately into mainstream medical services. In 2003, the Department of Health for England announced the availability of start-up funding for ten 'Mainstreaming Genetics' pilot services to develop models to achieve this. Multiple methods were used to explore the pilots' experiences of incorporating genetics which might inform the development of new services in the future. A workshop with project staff, an email questionnaire, interviews and a thematic analysis of pilot final reports were carried out. Seven themes relating to the integration of genetics into mainstream medical services were identified: planning services to incorporate genetics; the involvement of genetics departments; the establishment of roles incorporating genetic activities; identifying and involving stakeholders; the challenges of working across specialty boundaries; working with multiple healthcare organisations; and the importance of cultural awareness of genetic conditions. Pilots found that the planning phase often included the need to raise awareness of genetic conditions and services and that early consideration of organisational issues such as clinic location was essential. The formal involvement of genetics departments was crucial to success; benefits included provision of clinical and educational support for staff in new roles. Recruitment and retention for new roles outside usual career pathways sometimes proved difficult. Differences in specialties' working practices and working with multiple healthcare organisations also brought challenges such as the 'genetic approach' of working with families, incompatible record systems and different approaches to health professionals' autonomous practice. 'Practice points' have been collated into a Toolkit which includes resources from the pilots, including job descriptions and clinical tools. These can

  5. The Distributed Seismological Observatory: A Web Based Seismological Data Analysis and Distribution Toolkit

    Science.gov (United States)

    Govoni, A.; Lomax, A.; Michelini, A.

    The Web now provides a single, universal infrastructure for developing client/server data access applications, and the seismology community can greatly benefit from this situation both for routine observatory data analysis and for research purposes. The Web has reduced the myriad of platforms and technologies used to handle and exchange data to a single user interface (HTML), a single client platform (the Web browser), a single network protocol (HTTP), and a single server platform (the Web server). In this context we have designed a system that, by taking advantage of the latest developments in client/server data access technologies based on Java, Java RMI and XML, may solve the problems most commonly experienced in data access and manipulation in the seismological community. Key concepts in this design are a thin-client approach, minimum standards for data exchange, and distributed computing. Thin client means that any PC with a Java-enabled Web browser can interact with a set of remote data servers distributed across the network, querying for data and for services. Minimum standards refers to the language needed for client/server interaction, which must be abstract enough that not every party needs to know all the details of the transaction; this is solved by XML. Distribution means that a set of servers is able to provide to the client not only a data object (the actual data and the methods to work on it) but also the computing power to perform a particular task (a remote method in the Java RMI context), limiting the exchange of data to the results. This also allows for client interaction in situations with very limited communication bandwidth. We also describe in detail the implementation of the main modules of the toolkit: a data eater module that gathers and archives seismological data from a variety of sources, ranging from portable digitizer data to real-time network data, and a picking/location server that allows for multi-user Web-based analysis of

  6. Toolkit for the Construction of Reproducing Kernel-Based Representations of Data: Application to Multidimensional Potential Energy Surfaces.

    Science.gov (United States)

    Unke, Oliver T; Meuwly, Markus

    2017-08-28

    In the early days of computation, slow processor speeds limited the amount of data that could be generated and used for scientific purposes. In the age of big data, the limiting factor usually is the method with which large amounts of data are analyzed and useful information is extracted. A typical example from chemistry are high-level ab initio calculations for small systems, which have nowadays become feasible even if energies at many different geometries are required. Molecular dynamics simulations often require several thousand distinct trajectories to be run. Under such circumstances suitable analytical representations of potential energy surfaces (PESs) based on ab initio calculations are required to propagate the dynamics at an acceptable cost. In this work we introduce a toolkit which allows the automatic construction of multidimensional PESs from gridded ab initio data based on reproducing kernel Hilbert space (RKHS) theory. The resulting representations require no tuning of parameters and allow energy and force evaluations at ab initio quality at the same cost as empirical force fields. Although the toolkit is primarily intended for constructing multidimensional potential energy surfaces for molecular systems, it can also be used for general machine learning purposes. The software is published under the MIT license and can be downloaded, modified, and used in other projects for free.
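
    A bare-bones illustration of the kernel-based representation that such a toolkit automates is given below: given energies on a grid, the expansion coefficients are obtained by solving a linear system in the kernel matrix, after which the surface can be evaluated anywhere. A Gaussian kernel on a 1-D toy potential is used here purely for illustration; the actual toolkit employs physically motivated reproducing kernels and multidimensional grids.

    ```python
    # Minimal RKHS-style interpolation sketch: represent f(x) = sum_i a_i k(x, x_i)
    # and fit the coefficients to gridded data by solving K a = y.
    # The Gaussian kernel and 1-D toy potential are illustrative assumptions.
    import numpy as np

    def kernel(x, xp, sigma=0.4):
        return np.exp(-0.5 * ((x[:, None] - xp[None, :]) / sigma) ** 2)

    # "Ab initio" training data: a toy Morse-like curve on a coarse grid.
    x_train = np.linspace(0.8, 4.0, 17)
    y_train = (1.0 - np.exp(-1.5 * (x_train - 1.6))) ** 2

    K = kernel(x_train, x_train) + 1e-10 * np.eye(x_train.size)  # regularized
    alpha = np.linalg.solve(K, y_train)                          # coefficients

    def potential(x):
        """Evaluate the kernel representation at arbitrary points x."""
        return kernel(np.atleast_1d(x), x_train) @ alpha

    x_test = np.linspace(0.9, 3.9, 5)
    exact = (1.0 - np.exp(-1.5 * (x_test - 1.6))) ** 2
    print(np.max(np.abs(potential(x_test) - exact)))   # interpolation error
    ```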

  7. Monte Carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    Directory of Open Access Journals (Sweden)

    Lerendegui-Marco J.

    2017-01-01

    Full Text Available Monte Carlo (MC simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes can not be measured or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1, especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2 of the facility.

  8. A Teacher Tablet Toolkit to meet the challenges posed by 21st century rural teaching and learning environments

    Directory of Open Access Journals (Sweden)

    Adèle Botha

    2015-11-01

    Full Text Available This article draws upon the experiences gained in participating in an Information and Communication Technology for Rural Education (ICT4RED) initiative, as part of a larger Technology for Rural Education project (TECH4RED) in Cofimvaba in the Eastern Cape Province of South Africa. The aim of this paper is to describe the conceptualisation, design and application of an innovative teacher professional development course for rural teachers, enabling them to use tablets to support teaching and learning in their classrooms. The course, as an outcome, is presented as a Teacher Tablet Toolkit, designed to meet the challenges inherent in the 21st-century rural technology-enhanced teaching and learning environment. The paper documents and motivates design decisions, derived from the literature and adapted through three iterations of a Design Science Research Process, to be incorporated in the ICT4RED Teacher Professional Development Course. The resulting course aims to equip participating teachers with a toolkit consisting of technology hardware, pragmatic pedagogical and technology knowledge and skills, and practice-based experience. The significance of game design elements such as simulation and fun, technology 'in need' rather than 'in case', adequate scaffolding, and a clear learning path with interim learning goals is noted.

  9. ForeignAssistance.gov

    Data.gov (United States)

    US Agency for International Development — ForeignAssistance.gov provides a view of U.S. Government foreign assistance funds across agencies and enables users to explore, analyze, and review aid investments...

  10. Assisted Reproductive Technology (ART)

    Science.gov (United States)

    Assisted Reproductive Technology (ART): ART refers to treatments and procedures that ... American Society for Reproductive Medicine. (2015). Assisted reproductive technologies: A guide for patients. Retrieved May 31, 2016, ...

  11. Estuche de evaluacion del continuo del desarrollo de El Curriculo Creativo (The Creative Curriculum Developmental Continuum Assessment Toolkit for Ages 3-5).

    Science.gov (United States)

    Dodge, Diane Trister; Colker, Laura J.; Heroman, Cate

    Intended for use with the Creative Curriculum for Early Childhood, this integrated ongoing student assessment toolkit, in Spanish, is designed for preschool teachers to help them focus on all aspects of a child's development, thereby giving them a way to ensure that all children in their classes are making progress. The assessment kit uses a…

  12. Professional Development: Learning from the Best. A Toolkit for Schools and Districts Based on the National Awards Program for Model Professional Development.

    Science.gov (United States)

    Hassel, Emily

    This publication provides a step-by-step guide to help schools and districts implement strong, sustainable professional development that drives achievement of student learning goals. The toolkit is based on the experiences of national professional development award winning schools and districts. The most common thread among the winners is that…

  13. The Python ARM Radar Toolkit (Py-ART), a Library for Working with Weather Radar Data in the Python Programming Language

    National Research Council Canada - National Science Library

    Helmus, Jonathan J; Collis, Scott M

    2016-01-01

    ... radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C...

  14. Plug-and-play paper-based toolkit for rapid prototyping of microfluidics and electronics towards point-of-care diagnostic solutions

    CSIR Research Space (South Africa)

    Smith, S

    2015-11-01

    Full Text Available RAPDASA 2015 conference, Roodevallei, Pretoria, 4-6 November 2015. Plug-and-play paper-based toolkit for rapid prototyping of microfluidics and electronics towards point-of-care diagnostic solutions. S. Smith, K. Moodley & K. Land.

  15. The "Pesticides and Farmworker Health Toolkit" : An Innovative Model for Developing an Evidence-Informed Program for a Low-Literacy, Latino Immigrant Audience

    Science.gov (United States)

    LePrevost, Catherine E.; Storm, Julia F.; Asuaje, Cesar R.; Cope, W. Gregory

    2014-01-01

    Migrant and seasonal farmworkers are typically Spanish-speaking, Latino immigrants with limited formal education and low literacy skills and, as such, are a vulnerable population. We describe the development of the "Pesticides and Farmworker Health Toolkit", a pesticide safety and health curriculum designed to communicate to farmworkers…

  16. The Seismic Tool-Kit (STK): An Open Source Software For Learning the Basis of Signal Processing and Seismology.

    Science.gov (United States)

    Reymond, D.

    2016-12-01

    We present an open-source software project (GNU public license), named STK: Seismic Tool-Kit, dedicated mainly to learning signal processing and seismology. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The project is composed of two main branches. First, a graphical interface dedicated to signal processing of records in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage into the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the power spectral density (PSD) on a linear or log scale, as well as the evolutive time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and principal component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/nonlinear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and has also been partially ported to MS-Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and as a practical support for teaching the basics of signal processing. Useful links: http://sourceforge.net
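
    To give a flavour of the spectral-density estimation that the GUI exposes, the short sketch below computes a Welch power spectral density for a synthetic seismic-like trace with SciPy. It is an independent illustration in Python, not part of the C/C++ STK code base, and the trace parameters are invented.

    ```python
    # Independent Python illustration of PSD estimation (Welch method) for a
    # synthetic trace; STK itself is written in C/C++ and reads SAC files.
    import numpy as np
    from scipy import signal

    fs = 100.0                                   # sampling rate, Hz
    t = np.arange(0, 600, 1.0 / fs)              # 10 minutes of data
    trace = (np.sin(2 * np.pi * 1.5 * t)         # a 1.5 Hz "signal"
             + 0.5 * np.random.default_rng(0).standard_normal(t.size))

    freqs, psd = signal.welch(trace, fs=fs, nperseg=4096, detrend="linear")

    peak = freqs[np.argmax(psd)]
    print(f"PSD peak at {peak:.2f} Hz")          # expect roughly 1.5 Hz
    ```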

  17. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).

    Science.gov (United States)

    Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann

    2015-06-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. This data analysis results in a tailored app of interventions
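
    As an illustration of the kind of on-device processing described above (heart rate data turned into heart rate variability measures), the sketch below computes standard time-domain HRV statistics from a series of inter-beat (RR) intervals. The abstract does not specify which measures PHIT derives, so these particular statistics, and the example intervals, are generic assumptions.

    ```python
    # Generic time-domain HRV measures from inter-beat (RR) intervals in ms.
    # Which measures PHIT actually derives on-device is an assumption here.
    import numpy as np

    def hrv_time_domain(rr_ms) -> dict:
        rr = np.asarray(rr_ms, dtype=float)
        diffs = np.diff(rr)
        return {
            "mean_hr_bpm": 60000.0 / rr.mean(),            # mean heart rate
            "SDNN_ms": rr.std(ddof=1),                     # overall variability
            "RMSSD_ms": np.sqrt(np.mean(diffs ** 2)),      # beat-to-beat variability
            "pNN50_pct": 100.0 * np.mean(np.abs(diffs) > 50.0),
        }

    rr_example = [812, 845, 790, 860, 835, 870, 820, 800, 855, 840]
    print(hrv_time_domain(rr_example))
    ```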

  18. Implementation of antimicrobial stewardship interventions recommended by national toolkits in primary and secondary healthcare sectors in England: TARGET and Start Smart Then Focus.

    Science.gov (United States)

    Ashiru-Oredope, D; Budd, E L; Bhattacharya, A; Din, N; McNulty, C A M; Micallef, C; Ladenheim, D; Beech, E; Murdan, S; Hopkins, S

    2016-05-01

    The aims of this study were to assess and compare the implementation of antimicrobial stewardship (AMS) interventions recommended within the national AMS toolkits, TARGET and Start Smart Then Focus, in English primary and secondary healthcare settings in 2014, to determine the prevalence of cross-sector engagement to drive AMS interventions, and to propose next steps to improve the implementation of AMS. Electronic surveys were circulated to all 211 clinical commissioning groups (CCGs; primary sector) and to 146 (out of the 159) acute trusts (secondary sector) in England. Response rates were 39% and 63% for the primary and secondary sectors, respectively. The majority of CCGs and acute trusts reported reviewing national AMS toolkits formally or informally (60% and 87%, respectively). However, only 13% of CCGs and 46% of acute trusts had developed an action plan for the implementation of these toolkits. Only 5% of CCGs had antimicrobial pharmacists in post; however, the role of specialist antimicrobial pharmacists continued to remain embedded within acute trusts, with 83% of responding trusts having an antimicrobial pharmacist at a senior grade. The majority of healthcare organizations review national AMS toolkits; however, implementation of the toolkits, through the development of action plans to deliver AMS interventions, requires improvement. For the first time, we report the extent of cross-sector and multidisciplinary collaboration to deliver AMS interventions in both primary and secondary care sectors in England. Results highlight that further qualitative and quantitative work is required to explore mutual benefits and promote best practice. Antimicrobial pharmacists remain leaders for implementing AMS interventions across both primary and secondary healthcare sectors.

  19. Development of a practical toolkit using participatory action research to address health inequalities through NGOs in the UK: Challenges and lessons learned.

    Science.gov (United States)

    Chirewa, Blessing

    2012-09-01

    This study aimed to develop a practical toolkit to support non-government organizations (NGOs) in tackling health inequalities in the UK and to highlight the challenges and lessons learned. A mixed qualitative methodology within an action research framework was employed. Semi-structured questionnaires, focus group interviews and discussions with an expert reference group aimed to identify the important themes and produce the toolkit content. A practical guide of information materials for NGOs working on addressing health inequalities was subsequently developed and successfully piloted. The experience of using participatory action research revealed a number of lessons and challenges. The key challenges were a lack of training and experience in conducting action research, costs and insufficient resources, a slow and time-consuming process, a lack of commitment from marginalized groups, and differences in the emphasis of goals and vision among participants. The main lessons learned were the importance of effective leadership and project management skills, the importance of integrating researchers and the researched as equal partners, the creation and nurturing of trust, the importance of evaluating and piloting processes, the importance of engaging with marginalized groups, and the use of an evidence base in decision making. The lessons and challenges enumerated here are of value to researchers aiming to implement participatory action research in developing checklists, tools, practical guidance and frameworks, and they offer important areas to consider before starting such projects. In addition, this offers an insight into how the dynamics of participatory action research methodology evolved in the development of the toolkit. Future research and initiatives in this area should focus on ways to improve the toolkit and make it more relevant to a wider community, and on methods for evaluating the impact of the toolkit on practice.

  20. MosSCI and gateway compatible plasmid toolkit for constitutive and inducible expression of transgenes in the C. elegans germline.

    Directory of Open Access Journals (Sweden)

    Eva Zeiser

    Full Text Available Here we describe a toolkit for the production of fluorescently tagged proteins in the C. elegans germline and early embryo using Mos1-mediated single copy insertion (MosSCI) transformation. We have generated promoter and 3'UTR fusions to sequences of different fluorescent proteins yielding constructs for germline expression that are compatible with MosSCI MultiSite Gateway vectors. These vectors allow tagged transgene constructs to be inserted as single copies into known sites in the C. elegans genome using MosSCI. We also show that two C. elegans heat shock promoters (Phsp-16.2 and Phsp-16.41) can be used to induce transgene expression in the germline when inserted via MosSCI transformation. This flexible set of new vectors, available to the research community in a plasmid repository, should facilitate research focused on the C. elegans germline and early embryo.

  1. Design and validation of a three-instrument toolkit for the assessment of competence in electrocardiogram rhythm recognition.

    Science.gov (United States)

    Hernández-Padilla, José M; Granero-Molina, José; Márquez-Hernández, Verónica V; Suthers, Fiona; López-Entrambasaguas, Olga M; Fernández-Sola, Cayetano

    2017-06-01

    Rapid and accurate interpretation of cardiac arrhythmias by nurses has been linked with safe practice and positive patient outcomes. Although training in electrocardiogram rhythm recognition is part of most undergraduate nursing programmes, research continues to suggest that nurses and nursing students lack competence in recognising cardiac rhythms. In order to promote patient safety, nursing educators must develop valid and reliable assessment tools that allow the rigorous assessment of this competence before nursing students are allowed to practise without supervision. The aim of this study was to develop and psychometrically evaluate a toolkit to holistically assess competence in electrocardiogram rhythm recognition. Following a convenience sampling technique, 293 nursing students from a nursing faculty in a Spanish university were recruited for the study. The following three instruments were developed and psychometrically tested: an electrocardiogram knowledge assessment tool (ECG-KAT), an electrocardiogram skills assessment tool (ECG-SAT) and an electrocardiogram self-efficacy assessment tool (ECG-SES). Reliability and validity (content, criterion and construct) of these tools were meticulously examined. A high Cronbach's alpha coefficient demonstrated the excellent reliability of the instruments (ECG-KAT = 0.89; ECG-SAT = 0.93; ECG-SES = 0.98). An excellent content validity index (scales' average content validity index > 0.94) and very good criterion validity were evidenced for all the tools. Regarding construct validity, principal component analysis revealed that all items comprising the instruments contributed to measuring knowledge, skills or self-efficacy in electrocardiogram rhythm recognition. Moreover, known-groups analysis showed the tools' ability to detect expected differences in competence between groups with different training experiences. The three-instrument toolkit developed showed excellent psychometric properties for measuring competence in
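
    For readers unfamiliar with the reliability statistic quoted above, the sketch below computes Cronbach's alpha for an item-response matrix; the simulated responses are placeholders and not the study data.

    ```python
    # Cronbach's alpha for a (respondents x items) score matrix:
    # alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    import numpy as np

    def cronbach_alpha(scores) -> float:
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                         # number of items
        item_vars = scores.var(axis=0, ddof=1)
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

    # Placeholder data: 293 respondents, 20 items scored 0/1 with a shared "ability".
    rng = np.random.default_rng(0)
    ability = rng.normal(size=(293, 1))
    responses = (rng.normal(size=(293, 20)) + ability > 0).astype(float)
    print(round(cronbach_alpha(responses), 3))
    ```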

  2. Molecular diagnostic toolkit for Rhizophagus irregularis isolate DAOM-197198 using quantitative PCR assay targeting the mitochondrial genome.

    Science.gov (United States)

    Badri, Amine; Stefani, Franck O P; Lachance, Geneviève; Roy-Arcand, Line; Beaudet, Denis; Vialle, Agathe; Hijri, Mohamed

    2016-10-01

    Rhizophagus irregularis (previously named Glomus irregulare) is one of the most widespread and common arbuscular mycorrhizal fungal (AMF) species. It has been recovered worldwide in agricultural and natural soils, and the isolate DAOM-197198 has been utilized as a commercial inoculant for two decades. Despite the ecological and economical importance of this taxon, specific markers for quantification of propagules by quantitative real-time PCR (qPCR) are extremely limited and none have been rigorously validated for quality control of manufactured products such as biofertilizers. From the sequencing of 14 complete AMF mitochondrial (mt) genomes, a qPCR assay using a hydrolysis probe designed in the single-copy cox3-rnl intergenic region was tested and validated to specifically and accurately quantify the spores of R. irregularis isolate DAOM-197198. Specificity tests were performed using standard PCR and qPCR, and results clearly showed that the primers specifically amplified the isolate DAOM-197198, yielding a PCR product of 106 bp. According to the qPCR analyses on spores produced in vitro, the average copy number of mt genomes per spore was 3172 ± 304 SE (n = 6). Quantification assays were successfully undertaken on known and unknown samples in liquid suspensions and commercial dry formulations to show the accuracy, precision, robustness, and reproducibility of the qPCR assay. This study provides a powerful molecular toolkit specifically designed to quantify spores of the model AMF isolate DAOM-197198. The molecular toolkit approach used in our study could be applied to other AMF taxa and will be useful to research institutions and governmental and industrial laboratories running routine quality control of AMF-based products.
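
    The copy-number figures above come from absolute quantification; a generic standard-curve calculation of that kind is sketched below (fit Cq against log10 copies of a standard dilution series, then convert sample Cq values to copies and to copies per spore). All numbers are invented for illustration, and the assay's actual standards and pipeline are not reproduced.

    ```python
    # Generic qPCR absolute quantification via a standard curve:
    # Cq is linear in log10(copies); invert the fit for unknown samples.
    # All numbers below are invented placeholders, not the DAOM-197198 assay data.
    import numpy as np

    std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])          # standard dilutions
    std_cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7])         # measured Cq values

    slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0                    # amplification efficiency

    def cq_to_copies(cq):
        return 10 ** ((np.asarray(cq, dtype=float) - intercept) / slope)

    sample_cq = np.array([24.9, 25.2, 24.7])                   # unknown extracts
    spores_per_extract = 100.0                                 # assumed spore input
    copies = cq_to_copies(sample_cq)
    print(f"efficiency ~ {efficiency:.2f}")
    print("mt-genome copies per spore:", np.round(copies / spores_per_extract, 0))
    ```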

  3. A toolkit and robust pipeline for the generation of fosmid-based reporter genes in C. elegans.

    Directory of Open Access Journals (Sweden)

    Baris Tursun

    Full Text Available Engineering fluorescent proteins into large genomic clones, contained within BACs or fosmid vectors, is a tool to visualize and study spatiotemporal gene expression patterns in transgenic animals. Because these reporters cover large genomic regions, they most likely capture all cis-regulatory information and can therefore be expected to recapitulate all aspects of endogenous gene expression. Inserting tags at the target gene locus contained within genomic clones by homologous recombination ("recombineering") represents the most straightforward method to generate these reporters. In this methodology paper, we describe a simple and robust pipeline for recombineering of fosmids, which we apply to generate reporter constructs in the nematode C. elegans, whose genome is almost entirely covered by an available fosmid library. We have generated a toolkit that allows for insertion of fluorescent proteins (GFP, YFP, CFP, VENUS, mCherry) and affinity tags at specific target sites within fosmid clones in a virtually seamless manner. Our new pipeline is less complex and, in our hands, works more robustly than previously described recombineering strategies to generate reporter fusions for C. elegans expression studies. Furthermore, our toolkit provides a novel recombineering cassette which inserts an SL2-spliced intercistronic region between the gene of interest and the fluorescent protein, thus creating a reporter controlled by all 5' and 3' cis-acting regulatory elements of the examined gene without the direct translational fusion between the two. With this configuration, the onset of expression and tissue specificity of secreted, sub-cellularly compartmentalized or short-lived gene products can be easily detected. We describe other applications of fosmid recombineering as well. The simplicity, speed and robustness of the recombineering pipeline described here should prompt the routine use of this strategy for expression studies in C. elegans.

  4. A plasmid toolkit for cloning chimeric cDNAs encoding customized fusion proteins into any Gateway destination expression vector.

    Science.gov (United States)

    Buj, Raquel; Iglesias, Noa; Planas, Anna M; Santalucía, Tomàs

    2013-08-20

    Valuable clone collections encoding the complete ORFeomes for some model organisms have been constructed following the completion of their genome sequencing projects. These libraries are based on Gateway cloning technology, which facilitates the study of protein function by simplifying the subcloning of open reading frames (ORF) into any suitable destination vector. The expression of proteins of interest as fusions with functional modules is a frequent approach in their initial functional characterization. A limited number of Gateway destination expression vectors allow the construction of fusion proteins from ORFeome-derived sequences, but they are restricted to the possibilities offered by their inbuilt functional modules and their pre-defined model organism-specificity. Thus, the availability of cloning systems that overcome these limitations would be highly advantageous. We present a versatile cloning toolkit for constructing fully-customizable three-part fusion proteins based on the MultiSite Gateway cloning system. The fusion protein components are encoded in the three plasmids integral to the kit. These can recombine with any purposely-engineered destination vector that uses a heterologous promoter external to the Gateway cassette, leading to the in-frame cloning of an ORF of interest flanked by two functional modules. In contrast to previous systems, a third part becomes available for peptide-encoding as it no longer needs to contain a promoter, resulting in an increased number of possible fusion combinations. We have constructed the kit's component plasmids and demonstrate its functionality by providing proof-of-principle data on the expression of prototype fluorescent fusions in transiently-transfected cells. We have developed a toolkit for creating fusion proteins with customized N- and C-term modules from Gateway entry clones encoding ORFs of interest. Importantly, our method allows entry clones obtained from ORFeome collections to be used without prior

  5. Assistance Focus: Africa (Brochure)

    Energy Technology Data Exchange (ETDEWEB)

    2014-12-01

    The Clean Energy Solutions Center Ask an Expert service connects governments seeking policy information and advice with one of more than 30 global policy experts who can provide reliable and unbiased quick-response advice and information. The service is available at no cost to government agency representatives from any country and the technical institutes assisting them. This publication presents summaries of assistance provided to African governments, including the benefits of that assistance.

  6. Turning assistive machines into assistive robots

    Science.gov (United States)

    Argall, Brenna D.

    2015-01-01

    For decades, the potential for automation, in particular in the form of smart wheelchairs, to aid those with motor or cognitive impairments has been recognized. It is a paradox that, often, the more severe a person's motor impairment, the more challenging it is for them to operate the very assistive machines which might enhance their quality of life. A primary aim of my lab is to address this confound by incorporating robotics autonomy and intelligence into assistive machines, turning the machine into a kind of robot and offloading some of the control burden from the user. Robots already synthetically sense, act in, and reason about the world, and these technologies can be leveraged to help bridge the gap left by sensory, motor or cognitive impairments in the users of assistive machines. This paper overviews some of the ongoing projects in my lab, which strives to advance human ability through robotics autonomy.

  7. Assisted suicide and euthanasia.

    Science.gov (United States)

    van der Heide, Agnes

    2013-01-01

    Several countries have adopted laws that regulate physician assistance in dying. Such assistance may consist of providing a patient with a prescription of lethal medication that is self-administered by the patient, which is usually referred to as (physician) assistance in suicide, or of administering lethal medication to a patient, which is referred to as euthanasia. The main aim of regulating physician assistance in dying is to bring these practices into the open and to provide physicians with legal certainty. A key condition in all jurisdictions that have regulated either assistance in suicide or euthanasia is that physicians are only allowed to engage in these acts upon the explicit and voluntary request of the patient. All systems that allow physician assistance in dying have also in some way included the notion that physician assistance in dying is only accepted when it is the only means to address severe suffering from an incurable medical condition. Arguments against the legal regulation of physician assistance in dying include principled arguments, such as the wrongness of hastening death, and arguments that emphasize the negative consequences of allowing physician assistance in dying, such as a devaluation of the lives of older people, or people with chronic disease or disabilities. Opinion polls show that some form of accepting and regulating euthanasia and physician assistance in suicide is increasingly supported by the general population in most western countries. Studies in countries where physician assistance in dying is regulated suggest that practices have remained rather stable in most jurisdictions and that physicians adhere to the legal criteria in the vast majority of cases. © 2013 Elsevier B.V. All rights reserved.

  8. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    Energy Technology Data Exchange (ETDEWEB)

    Wei, J [City College of New York, New York, NY (United States)]; Yuan, A; Li, G [Memorial Sloan Kettering Cancer Center, New York, NY (United States)]

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize the breathing pattern. Segmented lung volumes in 12 patients were compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demons algorithm was applied for deformable image registration, and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is 1–2 orders of magnitude. Conclusion: A 4D toolkit framework has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
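
    The voxel-counting step described above amounts to multiplying the number of segmented voxels by the voxel volume at each respiratory phase. A language-neutral sketch is given below in Python (the toolkit itself is implemented in MATLAB); the random masks and voxel spacing are invented placeholders.

    ```python
    # Voxel counting for 4D lung volume curves: volume = n_voxels * voxel volume.
    # The random masks and the 1.0 x 1.0 x 2.5 mm spacing are illustrative only;
    # the actual toolkit is MATLAB-based and works on segmented 4DCT phases.
    import numpy as np

    voxel_mm3 = 1.0 * 1.0 * 2.5                      # in-plane spacing x slice thickness
    n_phases = 10
    rng = np.random.default_rng(0)

    lung_volumes_ml = []
    for phase in range(n_phases):
        mask = rng.random((64, 64, 40)) > 0.7        # placeholder lung segmentation
        volume_ml = mask.sum() * voxel_mm3 / 1000.0  # mm^3 -> mL
        lung_volumes_ml.append(volume_ml)

    tidal_ml = max(lung_volumes_ml) - min(lung_volumes_ml)
    print([round(v, 1) for v in lung_volumes_ml])
    print(f"volume variation over the cycle: {tidal_ml:.1f} mL")
    ```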

  9. A population-based randomized controlled trial of the effect of combining a pedometer with an intervention toolkit on physical activity among individuals with low levels of physical activity or fitness

    DEFF Research Database (Denmark)

    Petersen, Christina Bjørk; Severin, Maria; Hansen, Andreas Wolff

    2012-01-01

    To examine if receiving a pedometer along with an intervention toolkit is associated with increased physical activity, aerobic fitness and better self-rated health among individuals with low levels of physical activity or fitness....

  10. Training Teaching Assistants.

    Science.gov (United States)

    Rava, Susan

    1987-01-01

    Washington University's (Missouri) Department of Romance Languages and Literature requires its graduate teaching assistants to take a one-semester pedagogy course to ensure their competence and effectiveness as teaching assistants. The course features seminars in which goals, expectations, and learning theories are discussed and practice teaching…

  11. Agricultural Disaster Assistance

    Science.gov (United States)

    2005-08-29

    ineligible for crop insurance, and include mushrooms, floriculture, ornamental nursery, Christmas tree crops, turfgrass sod, aquaculture, and ginseng. Trees...federal crop insurance, the noninsured assistance program and emergency disaster loans. Since 1988, Congress regularly has made supplemental financial...assistance available to farmers and ranchers on an ad-hoc basis, primarily in the form of direct crop disaster payments and emergency livestock

  12. Federal disaster assistance programs

    Science.gov (United States)

    William J. Patterson

    1995-01-01

    The Robert T. Stafford Disaster Relief and Emergency Assistance Act (Public Law 93-288, as amended) is designed to provide support and assistance to citizens and to state and local governments affected by catastrophic disasters and emergencies. The law provides support in three distinct phases, including preparedness aimed at avoiding or minimizing the effects of a disaster, response...

  13. Service water assistance program

    Energy Technology Data Exchange (ETDEWEB)

    Munchausen, J.H. [EPRI Plant Support Engineering, Charlotte, NC (United States)

    1995-09-01

    The Service Water Assistance Program was developed to give utility service water system engineers a mechanism to address service water issues quickly and efficiently. Since its inception, the program has helped reduce the operations and maintenance costs associated with service water systems and has provided a medium for keeping EPRI aware of industry service water issues.

  14. Assistive Technology (Moldinclud)

    OpenAIRE

    Luján Mora, Sergio

    2011-01-01

    Slides of the seminar "Assistive Technology" taught at the Institul de Stat de Instruire Continua (Chisinau, Moldova) in November 2011. Tempus project: MOLDINCLUD, Teaching Training Centre for Inclusive Education (TEMPUS 158980-2009).

  15. Social Assistance: Theoretical Underpinnings

    African Journals Online (AJOL)

    Minas Hiruy

    Key words: marginalized community, social assistance, social welfare, MDGs, development. The case of the marginalized, and how society has regarded or responded to them, has played a significant part in shaping human history ...

  16. iVAX: An integrated toolkit for the selection and optimization of antigens and the design of epitope-driven vaccines

    Science.gov (United States)

    Moise, Leonard; Gutierrez, Andres; Kibria, Farzana; Martin, Rebecca; Tassone, Ryan; Liu, Rui; Terry, Frances; Martin, Bill; De Groot, Anne S

    2015-01-01

    Computational vaccine design, also known as computational vaccinology, encompasses epitope mapping, antigen selection and immunogen design using computational tools. The iVAX toolkit is an integrated set of tools that has been under development by De Groot and Martin since 1998. It comprises a suite of immunoinformatics algorithms for triaging candidate antigens, selecting immunogenic and conserved T cell epitopes, eliminating regulatory T cell epitopes, and optimizing antigens for immunogenicity and protection against disease. iVAX has been applied to vaccine development programs for emerging infectious diseases, cancer antigens and biodefense targets. Several iVAX vaccine design projects have had success in pre-clinical studies in animal models and are progressing toward clinical studies. The toolkit now incorporates a range of immunoinformatics tools for infectious disease and cancer immunotherapy vaccine design. This article will provide a guide to the iVAX approach to computational vaccinology. PMID:26155959
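
    As a rough illustration of the conservation step in such a pipeline, the Python sketch below enumerates 9-mer peptides from a reference antigen and keeps those shared by most isolates. It is not the iVAX/EpiMatrix scoring; the sequences, threshold, and function names are invented for the example.

    # Toy sketch of one triage step: enumerate 9-mer windows and retain those
    # conserved across isolates.  Not the iVAX algorithms; inputs are placeholders.

    def ninemers(sequence):
        """Yield every 9-residue window of a protein sequence."""
        for i in range(len(sequence) - 8):
            yield sequence[i:i + 9]

    def conserved_ninemers(reference, isolates, min_fraction=0.8):
        """Return 9-mers from the reference found in at least min_fraction of isolates."""
        kept = []
        for peptide in ninemers(reference):
            hits = sum(peptide in isolate for isolate in isolates)
            if isolates and hits / len(isolates) >= min_fraction:
                kept.append(peptide)
        return kept

    reference = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"                   # dummy antigen
    isolates = [reference, reference.replace("KSHFSRQ", "KSHYSRQ")]  # dummy variants
    print(conserved_ninemers(reference, isolates))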

  17. Towards National eHealth Implementation--a comparative study on WHO/ITU National eHealth Strategy Toolkit in Iran.

    Science.gov (United States)

    Riazi, Hossein; Jafarpour, Maryam; Bitaraf, Ehsan

    2014-01-01

    Experience has shown that utilization of ICT in the health sector requires national commitment and planned efforts to make the best use of existing capacity. Establishing the main directions, as well as planning the detailed steps needed, is key to achieving longer-term goals such as health sector efficiency, reform or more fundamental transformation. Collaboration between the health and ICT sectors, both public and private, is central to this effort. As the major United Nations agencies for health and telecommunications respectively, the World Health Organization (WHO) and the International Telecommunication Union (ITU) have recognized the importance of collaboration for eHealth in their global resolutions, which encourage countries to develop national eHealth strategies; the National eHealth Strategy Toolkit is a direct outcome of these recommendations. In this study, a mapping between the eHealth components of the WHO/ITU National eHealth Strategy Toolkit and our national eHealth vision is presented.

  18. Renewable Energy Cost Modeling. A Toolkit for Establishing Cost-Based Incentives in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Gifford, Jason S. [Sustainable Energy Advantage, LLC, Framingham, MA (United States); Grace, Robert C. [Sustainable Energy Advantage, LLC, Framingham, MA (United States); Rickerson, Wilson H. [Meister Consultants Group, Inc., Boston, MA (United States)

    2011-05-01

    This report serves as a resource for policymakers who wish to learn more about levelized cost of energy (LCOE) calculations, including cost-based incentives. The report identifies key renewable energy cost modeling options, highlights the policy implications of choosing one approach over another, and presents recommendations on the optimal characteristics of a model to calculate rates for cost-based incentives, feed-in tariffs (FITs), or similar policies. These recommendations shaped the design of NREL's Cost of Renewable Energy Spreadsheet Tool (CREST), which is used by state policymakers, regulators, utilities, developers, and other stakeholders to assist with analyses of policy and renewable energy incentive payment structures. Authored by Jason S. Gifford and Robert C. Grace of Sustainable Energy Advantage LLC and Wilson H. Rickerson of Meister Consultants Group, Inc.
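
    The levelized-cost arithmetic underlying such tools is straightforward. The Python sketch below shows the textbook form (discounted lifetime cost divided by discounted lifetime energy); it is not the CREST spreadsheet logic, and all inputs are illustrative placeholders.

    # Textbook LCOE: discounted lifetime costs divided by discounted lifetime
    # energy.  Not the CREST model; the inputs below are invented examples.

    def lcoe(capex, annual_opex, annual_energy_kwh, discount_rate, lifetime_years):
        """Levelized cost of energy in $/kWh."""
        costs = float(capex)
        energy = 0.0
        for year in range(1, lifetime_years + 1):
            discount = (1.0 + discount_rate) ** year
            costs += annual_opex / discount
            energy += annual_energy_kwh / discount
        return costs / energy

    # Hypothetical 10 kW system: $30,000 installed, $300/yr O&M, 14,000 kWh/yr.
    print(round(lcoe(30_000, 300, 14_000, 0.06, 20), 3), "$/kWh")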

  19. FEMA Hazard Mitigation Assistance Flood Mitigation Assistance (FMA) Data

    Data.gov (United States)

    Department of Homeland Security — This dataset contains closed and obligated projects funded under the following Hazard Mitigation Assistance (HMA) grant programs: Flood Mitigation Assistance (FMA)....

  20. Applying a toolkit for dissemination and analysis of near real-time data through the World Wide Web: integration of the Antelope Real Time System, ROADNet, and PHP

    Science.gov (United States)

    Newman, R. L.; Lindquist, K. G.; Hansen, T. S.; Vernon, F. L.; Eakins, J.; Foley, S.; Orcutt, J.

    2005-12-01

    The ROADNet project has enabled the acquisition and storage of diverse data streams through seamless integration of the Antelope Real Time System (ARTS) with (for example) ecological, seismological and geodetic instrumentation. The robust system architecture allows researchers to simply network data loggers with relational databases; however, the ability to disseminate these data to policy makers, scientists and the general public has (until recently) been provided on an 'as needed' basis. The recent development of a Datascope interface to the popular open source scripting language PHP has provided an avenue for presenting near real time data (such as integers, images and movies) from within the ARTS framework easily on the World Wide Web. The interface also indirectly provided the means to transform data types into various formats using the extensive function libraries that accompany a PHP installation (such as image creation and manipulation, data encryption for sensitive information, and XML creation for structured document interchange through the World Wide Web). Using a combination of Datascope and PHP library functions, an extensible tool-kit is being developed to allow data managers to easily present their products on the World Wide Web. The tool-kit has been modeled after the pre-existing ARTS architecture to simplify the installation, development and ease-of-use for both the seasoned researcher and the casual user. The methodology and results of building the applications that comprise the tool-kit are the focus of this presentation, including procedural vs. object oriented design, incorporation of the tool-kit into the existing contributed software libraries, and case-studies of researchers who are employing the tools to present their data. http://anf.ucsd.edu
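
    The general pattern described here (query a relational store of incoming measurements, then emit a web-friendly document) can be sketched in a few lines. The Python example below only illustrates that pattern; it is not the Antelope/Datascope PHP interface, and the table schema, station names, and values are invented.

    # Illustration of the "relational data to web document" pattern: take the
    # latest reading per station/channel and emit JSON (the original work emits
    # HTML/XML via PHP).  Schema and values are placeholders.
    import json
    import sqlite3
    import time

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (station TEXT, channel TEXT, time REAL, value REAL)")
    conn.executemany(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        [("PFO", "temperature", time.time(), 21.4),
         ("PFO", "humidity", time.time(), 37.0)],
    )

    # Latest reading per station/channel (SQLite returns the row matching MAX(time)).
    rows = conn.execute(
        "SELECT station, channel, MAX(time), value FROM readings GROUP BY station, channel"
    ).fetchall()

    payload = [dict(zip(("station", "channel", "time", "value"), row)) for row in rows]
    print(json.dumps(payload, indent=2))  # body that a CGI/PHP page would serve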