WorldWideScience

Sample records for toolkit enabling wider

  1. Local Safety Toolkit: Enabling safe communities of opportunity

    CSIR Research Space (South Africa)

    Holtmann, B

    2010-08-31

    Full Text Available remain inadequate to achieve safety. The Local Safety Toolkit supports a strategy for a Safe South Africa through the implementation of a model for a Safe Community of Opportunity. The model is the outcome of work undertaken over the course of the past...

  2. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  3. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Full Text Available Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  4. A New GPU-Enabled MODTRAN Thermal Model for the PLUME TRACKER Volcanic Emission Analysis Toolkit

    Science.gov (United States)

    Acharya, P. K.; Berk, A.; Guiang, C.; Kennett, R.; Perkins, T.; Realmuto, V. J.

    2013-12-01

    Real-time quantification of volcanic gaseous and particulate releases is important for (1) recognizing rapid increases in SO2 gaseous emissions, which may signal an impending eruption; (2) characterizing ash clouds to enable safe and efficient commercial aviation; and (3) quantifying the impact of volcanic aerosols on climate forcing. The Jet Propulsion Laboratory (JPL) has developed state-of-the-art algorithms, embedded in their analyst-driven Plume Tracker toolkit, for performing SO2, NH3, and CH4 retrievals from remotely sensed multi-spectral Thermal InfraRed imagery. While Plume Tracker provides accurate results, it typically requires extensive analyst time. A major bottleneck in this processing is the relatively slow but accurate FORTRAN-based MODTRAN atmospheric and plume radiance model, developed by Spectral Sciences, Inc. (SSI). To overcome this bottleneck, SSI, in collaboration with JPL, is porting these slow thermal radiance algorithms onto massively parallel, relatively inexpensive and commercially available GPUs. This paper discusses SSI's efforts to accelerate the MODTRAN thermal emission algorithms used by Plume Tracker. Specifically, we are developing a GPU implementation of the Curtis-Godson averaging and the Voigt in-band transmittances from near-line-center molecular absorption, which comprise the major computational bottleneck. The transmittance calculations were decomposed into separate functions, individually implemented as GPU kernels, and tested for accuracy and performance relative to the original CPU code. Speedup factors of 14 to 30× were realized for individual processing components on an NVIDIA GeForce GTX 295 graphics card with no loss of accuracy. Due to the separate host (CPU) and device (GPU) memory spaces, a redesign of the MODTRAN architecture was required to ensure efficient data transfer between host and device, and to facilitate high parallel throughput. Currently, we are incorporating the separate GPU kernels into a
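
    The Voigt line shape named above is the computational core of such in-band transmittance calculations. As a hedged illustration only (not the MODTRAN/Plume Tracker code, and with arbitrary line parameters), the sketch below evaluates a Voigt profile via the Faddeeva function and a toy Beer-Lambert band-averaged transmittance in Python:

    ```python
    # Illustrative sketch only: evaluates the Voigt line-shape profile underlying the
    # kind of in-band transmittance calculation described above. NOT the MODTRAN or
    # Plume Tracker implementation; names and numerical values are hypothetical.
    import numpy as np
    from scipy.special import wofz  # Faddeeva function w(z)

    def voigt_profile(nu, nu0, sigma, gamma):
        """Voigt profile at wavenumbers nu, centred at nu0.

        sigma: Gaussian (Doppler) standard deviation
        gamma: Lorentzian (pressure-broadening) half-width at half-maximum
        """
        z = ((nu - nu0) + 1j * gamma) / (sigma * np.sqrt(2.0))
        return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

    def band_transmittance(nu, nu0, sigma, gamma, column_density, line_strength):
        """Beer-Lambert band-averaged transmittance for a single isolated line (toy model)."""
        absorption_coeff = line_strength * voigt_profile(nu, nu0, sigma, gamma)
        tau = column_density * absorption_coeff               # optical depth at each spectral point
        return np.trapz(np.exp(-tau), nu) / (nu[-1] - nu[0])  # average of exp(-tau) over the band

    if __name__ == "__main__":
        nu = np.linspace(2490.0, 2510.0, 2001)                # wavenumber grid (cm^-1), arbitrary
        t = band_transmittance(nu, nu0=2500.0, sigma=0.05, gamma=0.07,
                               column_density=1.0e19, line_strength=1.0e-20)
        print(f"Band-averaged transmittance: {t:.4f}")
    ```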

  5. Enabling eHealth as a Pathway for Patient Engagement: a Toolkit for Medical Practice.

    Science.gov (United States)

    Graffigna, Guendalina; Barello, Serena; Triberti, Stefano; Wiederhold, Brenda K; Bosio, A Claudio; Riva, Giuseppe

    2014-01-01

    Academic and managerial interest in patient engagement is rapidly earning attention and becoming a necessary tool for researchers, clinicians and policymakers worldwide to manage the increasing burden of chronic conditions. The concept of patient engagement calls for a reframing of healthcare organizations' models and approaches to care. It also requires innovations that facilitate exchanges between patients and healthcare providers. eHealth, namely the use of new communication technologies to provide healthcare, has proved to be a promising way to innovate healthcare organizations and to improve exchanges between patients and health providers. However, little attention has yet been devoted to how best to design eHealth tools to engage patients in their care. eHealth tools have to be designed according to the specific unmet needs and priorities that characterize the different phases of the patient engagement process. Building on the Patient Engagement model and the Positive Technology paradigm, we suggest a toolkit of phase-specific technological resources, highlighting their potential for fostering the patient engagement process.

  6. JAVA Stereo Display Toolkit

    Science.gov (United States)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that simply accomplishes the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursor, or overlays, all of which can be built using this toolkit.
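
    The anaglyph simulation described above (red band from the left image, green/blue bands from the right image) is simple to express. Below is a minimal sketch in Python/NumPy rather than the toolkit's Java/OpenGL code, assuming the stereo pair is available as two equally sized RGB arrays:

    ```python
    # Minimal sketch of the red/cyan anaglyph combination described above; not the
    # toolkit's implementation. Assumes left/right images as (H, W, 3) uint8 arrays.
    import numpy as np

    def make_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
        """Combine a stereo pair into a single red/cyan anaglyph image."""
        if left_rgb.shape != right_rgb.shape:
            raise ValueError("left and right images must have the same shape")
        anaglyph = np.empty_like(left_rgb)
        anaglyph[..., 0] = left_rgb[..., 0]       # red channel from the left eye
        anaglyph[..., 1:] = right_rgb[..., 1:]    # green/blue channels from the right eye
        return anaglyph

    if __name__ == "__main__":
        left = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
        right = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
        print(make_anaglyph(left, right).shape)   # (240, 320, 3)
    ```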

  7. An Extended Design of the "Grid-Enabled SEE++ System" Based on Globus Toolkit 4 and gLite Conference

    CERN Document Server

    Schreiner, W.; Buchberger, M.; Kaltofen, T.

    2006-01-01

    "Grid-Enabled SEE++" based on the SEE++ software system for the biomechanical 3D simulation of the human eye and its muscles. SEE++ simulates the common eye muscle surgery techniques in a graphic interactive way that is familiar to an experienced surgeon. The goal of "Grid-Enabled SEE++" is to adapt and to extend SEE++ in several steps and to develop an efficient grid-based tool for "Evidence Based Medicine", which supports the surgeons in choosing optimal surgery techniques for the treatments of different syndromes of strabismus. In our previous work, we combined the SEE++ software with the Globus (pre-Web Service) middleware and developed a parallel version of the simulation of the "Hess-Lancaster test" (typical medical examination). By this, we demonstrated how a noticeable speedup can be achieved in SEE++ by the exploitation of the computational power of the Grid. Furthermore, we reported the prototype implementation of a medical database component for "Grid-Enabled SEE++". Finally, we designed a so calle...

  8. Tribal Green Building Toolkit

    Science.gov (United States)

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  9. Antenna toolkit

    CERN Document Server

    Carr, Joseph

    2006-01-01

    Joe Carr has provided radio amateurs and short-wave listeners with the definitive design guide for sending and receiving radio signals with Antenna Toolkit, 2nd edition. Together with the powerful suite of CD software, the reader will have a complete solution for constructing or using an antenna - bar the actual hardware! The software provides a simple Windows-based aid to carrying out the design calculations at the heart of successful antenna design. All the user needs to do is select the antenna type and set the frequency - a much more fun and less error prone method than using a con

  10. Solar Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    NREL is working on a Solar Integration National Dataset (SIND) Toolkit to enable researchers to perform U.S. regional solar generation integration studies. It will provide modeled, coherent subhourly solar power data.

  11. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  12. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  13. GEANT4 A Simulation toolkit

    CERN Document Server

    Agostinelli, S; Amako, K; Apostolakis, John; Araújo, H M; Arce, P; Asai, M; Axen, D A; Banerjee, S; Barrand, G; Behner, F; Bellagamba, L; Boudreau, J; Broglia, L; Brunengo, A; Chauvie, S; Chuma, J; Chytracek, R; Cooperman, G; Cosmo, G; Degtyarenko, P V; Dell'Acqua, A; De Paola, G O; Dietrich, D D; Enami, R; Feliciello, A; Ferguson, C; Fesefeldt, H S; Folger, G; Foppiano, F; Forti, A C; Garelli, S; Giani, S; Giannitrapani, R; Gibin, D; Gómez-Cadenas, J J; González, I; Gracía-Abríl, G; Greeniaus, L G; Greiner, W; Grichine, V M; Grossheim, A; Gumplinger, P; Hamatsu, R; Hashimoto, K; Hasui, H; Heikkinen, A M; Howard, A; Hutton, A M; Ivanchenko, V N; Johnson, A; Jones, F W; Kallenbach, Jeff; Kanaya, N; Kawabata, M; Kawabata, Y; Kawaguti, M; Kelner, S; Kent, P; Kodama, T; Kokoulin, R P; Kossov, M; Kurashige, H; Lamanna, E; Lampen, T; Lara, V; Lefébure, V; Lei, F; Liendl, M; Lockman, W; Longo, F; Magni, S; Maire, M; Mecking, B A; Medernach, E; Minamimoto, K; Mora de Freitas, P; Morita, Y; Murakami, K; Nagamatu, M; Nartallo, R; Nieminen, P; Nishimura, T; Ohtsubo, K; Okamura, M; O'Neale, S W; O'Ohata, Y; Perl, J; Pfeiffer, A; Pia, M G; Ranjard, F; Rybin, A; Sadilov, S; Di Salvo, E; Santin, G; Sasaki, T; Savvas, N; Sawada, Y; Scherer, S; Sei, S; Sirotenko, V I; Smith, D; Starkov, N; Stöcker, H; Sulkimo, J; Takahata, M; Tanaka, S; Chernyaev, E; Safai-Tehrani, F; Tropeano, M; Truscott, P R; Uno, H; Urbàn, L; Urban, P; Verderi, M; Walkden, A; Wander, W; Weber, H; Wellisch, J P; Wenaus, T; Williams, D C; Wright, D; Yamada, T; Yoshida, H; Zschiesche, D

    2003-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  14. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  15. Geant4 - A Simulation Toolkit

    International Nuclear Information System (INIS)

    2002-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  16. Geant4 - A Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Dennis H

    2002-08-09

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  17. Perl Template Toolkit

    CERN Document Server

    Chamberlain, Darren; Cross, David; Torkington, Nathan; Diaz, tatiana Apandi

    2004-01-01

    Among the many different approaches to "templating" with Perl--such as Embperl, Mason, HTML::Template, and hundreds of other lesser known systems--the Template Toolkit is widely recognized as one of the most versatile. Like other templating systems, the Template Toolkit allows programmers to embed Perl code and custom macros into HTML documents in order to create customized documents on the fly. But unlike the others, the Template Toolkit is as facile at producing HTML as it is at producing XML, PDF, or any other output format. And because it has its own simple templating language, templates

  18. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  19. Transportation librarian's toolkit

    Science.gov (United States)

    2007-12-01

    The Transportation Librarians Toolkit is a product of the Transportation Library Connectivity pooled fund study, TPF- 5(105), a collaborative, grass-roots effort by transportation libraries to enhance information accessibility and professional expert...

  20. An Overview of the Geant4 Toolkit

    CERN Document Server

    Apostolakis, John

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. App...

  1. Context in a wider context

    Directory of Open Access Journals (Sweden)

    John Traxler

    2011-07-01

    Full Text Available This paper attempts to review and reconsider the role of context in mobile learning and starts by outlining definitions of context-aware mobile learning as the technologies have become more mature, more robust and more widely available and as the notion of context has become progressively richer. The future role of context-aware mobile learning is considered within the context of the future of mobile learning as it moves from the challenges and opportunities of pedagogy and technology to the challenges and opportunities of policy, scale, sustainability, equity and engagement with augmented reality, «blended learning», «learner devices», «user-generated contexts» and the «internet of things». This is essentially a perspective on mobile learning, and other forms of technology-enhanced learning (TEL, where educators and their institutions set the agenda and manage change. There are, however, other perspectives on context. The increasing availability and use of smart-phones and other personal mobile devices with similar powerful functionality means that the experience of context for many people, in the form of personalized or location-based services, is an increasingly social and informal experience, rather than a specialist or educational experience. This is part of the transformative impact of mobility and connectedness on our societies brought about by these universal, ubiquitous and pervasive technologies. This paper contributes a revised understanding of context in the wider context (sic of the transformations taking place in our societies. These are subtle but pervasive transformations of jobs, work and the economy, of our sense of time, space and place, of knowing and learning, and of community and identity. This leads to a radical reconsideration of context as the notions of ‹self› and ‹other› are transformed.

  2. Sealed radioactive sources toolkit

    International Nuclear Information System (INIS)

    Mac Kenzie, C.

    2005-09-01

    The IAEA has developed a Sealed Radioactive Sources Toolkit to provide information to key groups about the safety and security of sealed radioactive sources. The key groups addressed are officials in government agencies, medical users, industrial users and the scrap metal industry. The general public may also benefit from an understanding of the fundamentals of radiation safety

  3. An Industrial Physics Toolkit

    Science.gov (United States)

    Cummings, Bill

    2004-03-01

    Physicists possess many skills highly valued in industrial companies. However, with the exception of a decreasing number of positions in long range research at large companies, job openings in industry rarely say "Physicist Required." One key to a successful industrial career is to know what subset of your physics skills is most highly valued by a given industry and to continue to build these skills while working. This combination of skills from both academic and industrial experience becomes your "Industrial Physics Toolkit" and is a transferable resource when you change positions or companies. This presentation will describe how one builds and sells your own "Industrial Physics Toolkit" using concrete examples from the speaker's industrial experience.

  4. Fragment Impact Toolkit (FIT)

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel Wolf [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Key, Brian P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garcia, Daniel B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.

  5. Newnes electronics toolkit

    CERN Document Server

    Phillips, Geoff

    2013-01-01

    Newnes Electronics Toolkit brings together fundamental facts, concepts, and applications of electronic components and circuits, and presents them in a clear, concise, and unambiguous format, to provide a reference book for engineers. The book contains 10 chapters that discuss the following concepts: resistors, capacitors, inductors, semiconductors, circuit concepts, electromagnetic compatibility, sound, light, heat, and connections. The engineer's job does not end when the circuit diagram is completed; the design for the manufacturing process is just as important if volume production is to be

  6. The DLESE Evaluation Toolkit Project

    Science.gov (United States)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  7. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We compare hand-crafted custom code to polylithic and monolithic toolkit-based solutions. Polylithic toolkits follow a design philosophy similar to 3D scene graphs supported by toolkits including Java3D and OpenInventor...

  8. Viewpoint Reading Conference Recommendations in a Wider ...

    African Journals Online (AJOL)

    Viewpoint Reading Conference Recommendations in a Wider Context of Social Change. Southern African Journal of Environmental Education.

  9. Terrain-Toolkit

    DEFF Research Database (Denmark)

    Wang, Qi; Kaul, Manohar; Long, Cheng

    2014-01-01

    Terrain data is becoming increasingly popular both in industry and in academia. Many tools have been developed for visualizing terrain data. However, we find that (1) they usually accept very few data formats of terrain data only; (2) they do not support terrain simplification well which, as will be shown, is used heavily for query processing in spatial databases; and (3) they do not provide the surface distance operator which is fundamental for many applications based on terrain data. Motivated by this, we developed a tool called Terrain-Toolkit for terrain data which accepts a comprehensive set...
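
    As a rough illustration of the surface distance operator mentioned above (not the Terrain-Toolkit implementation, which works on triangulated terrain), the following sketch sums 3D segment lengths along a straight ground-plane path sampled on a raster height grid; the function name and grid are hypothetical:

    ```python
    # Hedged sketch of a simplified surface-distance operator over a raster terrain:
    # the 3D length of the straight ground-plane path between two cells, sampled on a
    # height grid. A real toolkit would compute geodesics on the TIN; this only
    # illustrates the concept.
    import numpy as np

    def surface_distance(heights, p0, p1, cell_size=1.0, samples=200):
        """Approximate surface distance between grid points p0=(r0,c0) and p1=(r1,c1)."""
        rows = np.linspace(p0[0], p1[0], samples)
        cols = np.linspace(p0[1], p1[1], samples)
        # nearest-neighbour sampling keeps the sketch short (bilinear would be smoother)
        z = heights[np.round(rows).astype(int), np.round(cols).astype(int)]
        dxy = np.hypot(np.diff(rows), np.diff(cols)) * cell_size   # planimetric step lengths
        dz = np.diff(z)                                            # elevation changes
        return float(np.sum(np.hypot(dxy, dz)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        dem = rng.normal(100.0, 2.0, size=(50, 50))   # synthetic 50x50 height grid
        print(f"Surface distance: {surface_distance(dem, (0, 0), (49, 49)):.1f} m")
    ```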

  10. The Populist Toolkit

    OpenAIRE

    Ylä-Anttila, Tuukka Salu Santeri

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  11. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.

  12. An Overview of the GEANT4 Toolkit

    International Nuclear Information System (INIS)

    Apostolakis, John; CERN; Wright, Dennis H.

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  13. Application experiences with the Globus toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids" [14]. The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  14. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed with a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
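
    To make the kind of gradient-based solution method such a package exposes concrete, here is a minimal, hedged sketch (not DOTk's API) of steepest descent on a quadratic objective, written in Python for brevity:

    ```python
    # Minimal sketch (not DOTk's interface): plain steepest descent, the simplest
    # member of the gradient-based family of methods mentioned above.
    import numpy as np

    def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
        """Minimise a smooth function given its gradient, starting from x0."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            x = x - step * g
        return x

    if __name__ == "__main__":
        # f(x) = 0.5 * x^T A x - b^T x with A symmetric positive definite
        A = np.array([[3.0, 0.5], [0.5, 2.0]])
        b = np.array([1.0, -1.0])
        x_opt = steepest_descent(lambda x: A @ x - b, x0=[0.0, 0.0])
        print("minimiser:", x_opt, "exact:", np.linalg.solve(A, b))
    ```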

  15. The Knowledge Translation Toolkit: Bridging the Know–Do Gap: A ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-06-06

    Jun 6, 2011 ... It presents the theories, tools, and strategies required to encourage and enable ... Toolkit: Bridging the Know–Do Gap: A Resource for Researchers ...

  16. Lean and Information Technology Toolkit

    Science.gov (United States)

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  17. The Lean and Environment Toolkit

    Science.gov (United States)

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  18. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    2011-09-06

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.  Created: 9/6/2011 by Office of Infectious Diseases, Office of the Director (OD).   Date Released: 9/7/2011.

  19. Baffles Promote Wider, Thinner Silicon Ribbons

    Science.gov (United States)

    Seidensticker, Raymond G.; Mchugh, James P.; Hundal, Rolv; Sprecace, Richard P.

    1989-01-01

    A set of baffles just below the exit duct of a silicon-ribbon-growing furnace reduces thermal stresses in ribbons so that wider ribbons can be grown, increasing the productivity of the furnace. The baffles divert the plume of hot gas from the ribbon and allow cooler gas from the top of the furnace to flow around it. They also shield the ribbon from thermal radiation from the hot growth assembly. The ribbon is cooled to a lower temperature before reaching the cooler exit duct, avoiding an abrupt drop in temperature as it enters the duct.

  20. BIT: Biosignal Igniter Toolkit.

    Science.gov (United States)

    da Silva, Hugo Plácido; Lourenço, André; Fred, Ana; Martins, Raúl

    2014-06-01

    The study of biosignals has had a transforming role in multiple aspects of our society, which go well beyond the health sciences domains with which they have traditionally been associated. While biomedical engineering is a classical discipline where the topic is amply covered, today biosignals are a matter of interest for students, researchers and hobbyists in areas including computer science, informatics, electrical engineering, among others. Regardless of the context, the use of biosignals in experimental activities and practical projects is heavily bounded by cost and by limited access to adequate support materials. In this paper we present an accessible yet versatile toolkit, composed of low-cost hardware and software, which was created to reinforce the engagement of different people in the field of biosignals. The hardware consists of a modular wireless biosignal acquisition system that can be used to support classroom activities, interface with other devices, or perform rapid prototyping of end-user applications. The software comprises a set of programming APIs, a biosignal processing toolbox, and a framework for real-time data acquisition and postprocessing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
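
    As an illustration of the kind of postprocessing step a biosignal toolbox typically provides (this is not the BIT API), the sketch below applies a zero-phase Butterworth band-pass filter to a synthetic signal; the cut-off frequencies are arbitrary placeholder values:

    ```python
    # Illustrative sketch only (not the BIT APIs): a zero-phase Butterworth band-pass
    # filter, a typical biosignal postprocessing step, applied to a synthetic 100 Hz signal.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass(signal, fs, low_hz=0.5, high_hz=40.0, order=4):
        """Zero-phase band-pass filter, e.g. to remove baseline wander and high-frequency noise."""
        nyquist = 0.5 * fs
        b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
        return filtfilt(b, a, signal)

    if __name__ == "__main__":
        fs = 100.0                                    # sampling rate in Hz
        t = np.arange(0, 10, 1.0 / fs)
        raw = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
        clean = bandpass(raw, fs)
        print(raw.shape, clean.shape)
    ```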

  1. Toolkit Design for Interactive Structured Graphics

    National Research Council Canada - National Science Library

    Bederson, Benjamin B; Grosjean, Jesse; Meyer, Jon

    2003-01-01

    .... We describe Jazz (a polylithic toolkit) and Piccolo (a monolithic toolkit), each of which we built to support interactive 2D structured graphics applications in general, and Zoomable User Interface applications in particular...

  2. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  3. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache-lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly include the L1, L2, and L3 hit rates as inputs to the tasklists. Explicit hit rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
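
    A hedged sketch of the time_compute() idea described above, using the explicit hit-rate variant mentioned at the end of the abstract (the AMM derives hit rates analytically instead); all parameter names and values here are hypothetical, not PPT's:

    ```python
    # Hedged sketch of a time_compute()-style interface; hypothetical parameters,
    # not PPT's actual hardware model.

    HARDWARE = {
        "clock_ghz": 2.6,            # core clock speed
        "cycles_per_alu_op": 1.0,    # average ALU cycle count
        "cycles_l1": 4,              # access latencies per cache level, in cycles
        "cycles_l2": 12,
        "cycles_l3": 36,
        "cycles_dram": 200,
    }

    def time_compute(tasklist, hw=HARDWARE):
        """Estimate the wall-clock time (seconds) to execute one tasklist.

        tasklist example: {"alu_ops": 1e9, "mem_accesses": 2e8,
                           "hit_l1": 0.90, "hit_l2": 0.06, "hit_l3": 0.03}
        Any access that misses all three levels goes to DRAM.
        """
        cycles = tasklist["alu_ops"] * hw["cycles_per_alu_op"]
        miss_all = 1.0 - tasklist["hit_l1"] - tasklist["hit_l2"] - tasklist["hit_l3"]
        cycles += tasklist["mem_accesses"] * (
            tasklist["hit_l1"] * hw["cycles_l1"]
            + tasklist["hit_l2"] * hw["cycles_l2"]
            + tasklist["hit_l3"] * hw["cycles_l3"]
            + miss_all * hw["cycles_dram"]
        )
        return cycles / (hw["clock_ghz"] * 1e9)

    if __name__ == "__main__":
        kernel = {"alu_ops": 1e9, "mem_accesses": 2e8,
                  "hit_l1": 0.90, "hit_l2": 0.06, "hit_l3": 0.03}
        print(f"Predicted kernel time: {time_compute(kernel):.3f} s")
    ```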

  4. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  5. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Directory of Open Access Journals (Sweden)

    Juan Mateu

    2015-08-01

    Full Text Available In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  6. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  7. MX: A beamline control system toolkit

    Science.gov (United States)

    Lavender, William M.

    2000-06-01

    The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.

  8. A Wider Look at Visual Discomfort

    Directory of Open Access Journals (Sweden)

    L O'Hare

    2012-07-01

    Full Text Available Visual discomfort refers to the adverse effects reported by some observers when viewing certain stimuli, such as stripes and certain filtered noise patterns. Stimuli that deviate from natural image statistics might be encoded inefficiently, which could cause discomfort (Juricevic, Land, Wilkins and Webster, 2010, Perception, 39(7), 884–899), possibly through excessive cortical responses (Wilkins, 1995, Visual Stress, Oxford, Oxford University Press). A less efficient visual system might exacerbate the effects of difficult stimuli. Extreme examples are seen in epilepsy and migraines (Wilkins, Bonnanni, Prociatti, Guerrini, 2004, Epilepsia, 45, 1–7; Aurora and Wilkinson, 2007, Cephalalgia, 27(12), 1422–1435). However, similar stimuli are also seen as uncomfortable by non-clinical populations, e.g., striped patterns (Wilkins et al, 1984, Brain, 107(4)). We propose that oversensitivity of clinical populations may represent extreme examples of visual discomfort in the general population. To study the prevalence and impact of visual discomfort in a wider context than typically studied, an Internet-based survey was conducted, including standardised questionnaires measuring visual discomfort susceptibility (Conlon, Lovegrove, Chekaluk and Pattison, 1999, Visual Cognition, 6(6), 637–663; Evans and Stevenson, 2008, Ophthal Physiol Opt, 28(4), 295–309) and judgments of visual stimuli, such as striped patterns (Wilkins et al, 1984) and filtered noise patterns (Fernandez and Wilkins, 2008, Perception, 37(7), 1098–1013). Results show few individuals reporting high visual discomfort, contrary to other researchers (e.g., Conlon et al, 1999).

  9. Srijan: a graphical toolkit for sensor network macroprogramming

    OpenAIRE

    Pathak , Animesh; Gowda , Mahanth K.

    2009-01-01

    Macroprogramming is an application development technique for wireless sensor networks (WSNs) where the developer specifies the behavior of the system, as opposed to that of the constituent nodes. In this proposed demonstration, we would like to present Srijan, a toolkit that enables application development for WSNs in a graphical manner using data-driven macroprogramming. It can be used in various stages of application development, viz. i) specification of application ...

  10. A Multipurpose Toolkit to Enable Advanced Genome Engineering in Plants

    Czech Academy of Sciences Publication Activity Database

    Čermák, Tomáš; Curtin, S.J.; Gil-Humanes, J.; Čegan, Radim; Kono, T.J.Y.; Konecna, E.; Belanto, J.J.; Starker, C.G.; Mathre, J.W.; Greenstein, R.L.; Voytas, D.F.

    2017-01-01

    Vol. 29, No. 6 (2017), pp. 1196-1217, ISSN 1040-4651 Institutional support: RVO:68081707 Keywords: CRISPR/Cas9-mediated targeted mutagenesis * zinc-finger nucleases * Panicum virgatum L. Subject RIV: CE - Biochemistry OBOR OECD: Biochemistry and molecular biology Impact factor: 8.688, year: 2016

  11. BAT - The Bayesian Analysis Toolkit

    CERN Document Server

    Caldwell, Allen C; Kröninger, Kevin

    2009-01-01

    We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner. A goodness-of-fit criterion is presented which is intuitive and of great practical use.
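
    As a minimal illustration of the approach described above (posterior sampling with Markov Chain Monte Carlo), here is a generic random-walk Metropolis sampler in Python; it is not the BAT implementation, just a sketch of the underlying technique:

    ```python
    # Minimal sketch of MCMC posterior sampling (random-walk Metropolis); a generic
    # illustration of the technique, not the BAT package itself.
    import numpy as np

    def log_posterior(mu, data, sigma=1.0):
        """Unnormalised log posterior for the mean of a Gaussian with a flat prior."""
        return -0.5 * np.sum((data - mu) ** 2) / sigma**2

    def metropolis(data, n_steps=20000, step=0.1, seed=0):
        rng = np.random.default_rng(seed)
        chain = np.empty(n_steps)
        mu = 0.0
        logp = log_posterior(mu, data)
        for i in range(n_steps):
            proposal = mu + rng.normal(0.0, step)
            logp_prop = log_posterior(proposal, data)
            if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept/reject
                mu, logp = proposal, logp_prop
            chain[i] = mu
        return chain

    if __name__ == "__main__":
        data = np.random.default_rng(1).normal(3.0, 1.0, size=200)
        samples = metropolis(data)[5000:]                 # drop burn-in
        print(f"Posterior mean ~ {samples.mean():.2f} +/- {samples.std():.2f}")
    ```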

  12. A software toolkit for implementing low-cost virtual reality training systems

    International Nuclear Information System (INIS)

    Louka, Michael N.

    1999-04-01

    VR is a powerful technology for implementing training systems, but better tools are needed to achieve wider usage and acceptance for desktop computer-based training applications. A need has been identified for a software toolkit to support the efficient implementation of well-structured desktop VR training systems. A powerful toolkit for implementing scalable low-cost VR training applications is described in this report. (author)

  13. Supporting LGBT Communities: Police ToolKit

    OpenAIRE

    Vasquez del Aguila, Ernesto; Franey, Paul

    2013-01-01

    This toolkit provides police forces with practical educational tools, which can be used as part of a comprehensive LGBT strategy centred on diversity, equality, and non-discrimination. These materials are based on lessons learned through real life policing experiences with LGBT persons. The Toolkit is divided into seven scenarios where police awareness of LGBT issues has been identified as important. The toolkit employs a practical, scenario-based, problem-solving approach to help police offi...

  14. Extending IPY Data to a Wider Audience

    Science.gov (United States)

    Turrin, M.; Bell, R. E.; Pfirman, S. L.

    2010-12-01

    Perhaps the most significant IPY contribution to science education was the vast amount of data collected in the polar regions on Earth systems and processes, which was made immediately available to teachers and curriculum developers. Supplementing textbooks with the Internet as an education partner allowed participating teachers to transform science education through: their use of current data as an integral component of their classroom teaching; their training of students to seek out data as evidence of Earth processes; and their instruction to students on how to validate sources and uses of data. Yet, for every teacher and student who has been part of this successful IPY outreach there are many more who have not been reached, don't know how to incorporate polar science into their coursework, or don't comfortably work with data. Our experience with data education projects suggests that to reach the next round of students, teachers, educators and the wider adult population we need to translate this data so it is accessible through carefully constructed activities, simulations, and games. In addition we need to actively seek new partnership and outlet opportunities. The collected measurements tell us that our poles are warming on a human timescale. Using data to tell the story, the unambiguous signal of warming makes it accessible to a much broader audience. Our experience has shown that, for a novice population working with data, the educational effectiveness is significantly enhanced when the signal in the data is strong and the Earth processes are clear. Building upon IPY data and resources, focusing on the Earth's changing climate, and working with partnerships developed over the last two years, Lamont has put together several new education and outreach collaborations. Our goal is to reach new audiences through: 1) Inventorying, Assessing and Planning - Through an NSF planning grant we are leveraging IPY connections and findings in a Polar Climate Education

  15. The Data Warehouse Lifecycle Toolkit

    CERN Document Server

    Kimball, Ralph; Thornthwaite, Warren; Mundy, Joy; Becker, Bob

    2011-01-01

    A thorough update to the industry standard for designing, developing, and deploying data warehouse and business intelligence systems. The world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. In that time, the data warehouse industry has reached full maturity and acceptance, hardware and software have made staggering advances, and the techniques promoted in the premiere edition of this book have been adopted by nearly all data warehouse vendors and practitioners. In addition, the term "business intelligence" emerge

  16. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accura

  17. Google Web Toolkit for Ajax

    CERN Document Server

    Perry, Bruce

    2007-01-01

    The Google Web Toolkit (GWT) is a nifty framework that Java programmers can use to create Ajax applications. The GWT allows you to create an Ajax application in your favorite IDE, such as IntelliJ IDEA or Eclipse, using paradigms and mechanisms similar to programming a Java Swing application. After you code the application in Java, the GWT's tools generate the JavaScript code the application needs. You can also use typical Java project tools such as JUnit and Ant when creating GWT applications. The GWT is a free download, and you can freely distribute the client- and server-side code you c

  18. Audit: Automated Disk Investigation Toolkit

    Directory of Open Access Journals (Sweden)

    Umit Karabiyik

    2014-09-01

    Full Text Available Software tools designed for disk analysis play a critical role today in forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source, requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this paper, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology) and expert investigators. Our proof of concept design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image.

  19. The Sense-It App: A Smartphone Sensor Toolkit for Citizen Inquiry Learning

    Science.gov (United States)

    Sharples, Mike; Aristeidou, Maria; Villasclaras-Fernández, Eloy; Herodotou, Christothea; Scanlon, Eileen

    2017-01-01

    The authors describe the design and formative evaluation of a sensor toolkit for Android smartphones and tablets that supports inquiry-based science learning. The Sense-it app enables a user to access all the motion, environmental and position sensors available on a device, linking these to a website for shared crowd-sourced investigations. The…

  20. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
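
    A minimal sketch of the kind of scripted PyRosetta session the GUI wraps is shown below; the calls follow the public PyRosetta tutorials (pyrosetta.init, pose_from_sequence, create_score_function), exact names may differ between releases, and this is illustrative rather than the Toolkit GUI's own API.

    ```python
    import pyrosetta

    pyrosetta.init("-mute all")                       # start Rosetta quietly

    # Build a small poly-alanine pose and score it with a standard energy function.
    pose = pyrosetta.pose_from_sequence("AAAAAAAA")
    scorefxn = pyrosetta.create_score_function("ref2015")
    print("total score (Rosetta energy units):", scorefxn(pose))

    # Protocols (minimization, design, docking, ...) are applied in the same
    # scripted style; the PyRosetta Toolkit GUI assembles such steps without code.
    ```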

  1. NOAA Weather and Climate Toolkit (WCT)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Weather and Climate Toolkit is an application that provides simple visualization and data export of weather and climatological data archived at NCDC. The...

  2. ARC Code TI: Crisis Mapping Toolkit

    Data.gov (United States)

    National Aeronautics and Space Administration — The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve...

  3. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit is a collection of research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims to communicate best practices in conserving biodiversity and sustaining ecosystem services to potential users and to promote the wise use of aquatic resources, improve livelihoods and enhance policy information.

  4. A Modular Toolkit for Distributed Interactions

    Directory of Open Access Journals (Sweden)

    Julien Lange

    2011-10-01

    Full Text Available We discuss the design, architecture, and implementation of a toolkit which supports some theories for distributed interactions. The main design principles of our architecture are flexibility and modularity. Our main goal is to provide an easily extensible workbench to encompass current algorithms and incorporate future developments of the theories. With the help of some examples, we illustrate the main features of our toolkit.

  5. SIGKit: Software for Introductory Geophysics Toolkit

    Science.gov (United States)

    Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.

    2017-12-01

    The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
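
    The two-layer refraction model that students fit to first-arrival picks can be written down in a few lines; the sketch below uses hypothetical layer velocities v1, v2 and thickness h, and is independent of SIGkit itself.

    ```python
    import numpy as np

    # Textbook two-layer seismic refraction travel-time curves (the model
    # fitted to first-arrival picks); v1, v2, h are hypothetical values.
    v1, v2, h = 800.0, 2400.0, 10.0      # upper velocity (m/s), lower velocity (m/s), thickness (m)
    x = np.linspace(1.0, 120.0, 60)      # geophone offsets in metres

    t_direct = x / v1                                                  # direct wave
    t_refract = x / v2 + 2.0 * h * np.sqrt(v2**2 - v1**2) / (v1 * v2)  # head wave
    t_first = np.minimum(t_direct, t_refract)                          # first arrivals

    # Crossover distance where the head wave overtakes the direct wave.
    x_cross = 2.0 * h * np.sqrt((v2 + v1) / (v2 - v1))
    print(f"crossover distance = {x_cross:.1f} m")
    print(f"first arrival at 60 m offset = {t_first[np.argmin(np.abs(x - 60.0))]:.4f} s")
    ```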

  6. Java advanced medical image toolkit

    International Nuclear Information System (INIS)

    Saunder, T.H.C.; O'Keefe, G.J.; Scott, A.M.

    2002-01-01

    Full text: The Java Advanced Medical Image Toolkit (jAMIT) has been developed at the Center for PET and Department of Nuclear Medicine in an effort to provide a suite of tools that can be utilised in applications required to perform analysis, processing and visualisation of medical images. jAMIT uses Java Advanced Imaging (JAI) to combine the platform independent nature of Java with the speed benefits associated with native code. The object-orientated nature of Java allows the production of an extensible and robust package which is easily maintained. In addition to jAMIT, a Medical Image I/O API called Sushi has been developed to provide access to many commonly used image formats. These include DICOM, Analyze, MINC/NetCDF, Trionix, Beat 6.4, Interfile 3.2/3.3 and Odyssey. This allows jAMIT to access data and study information contained in different medical image formats transparently. Additional formats can be added at any time without any modification to the jAMIT package. Tools available in jAMIT include 2D ROI Analysis, Palette Thresholding, Image Cropping, Image Transposition, Scaling, Maximum Intensity Projection, Image Fusion, Image Annotation and Format Conversion. Future tools may include 2D Linear and Non-linear Registration, PET SUV Calculation, 3D Rendering and 3D ROI Analysis. Applications currently using jAMIT include Antibody Dosimetry Analysis, Mean Hemispheric Blood Flow Analysis, QuickViewing of PET Studies for Clinical Training, Pharmacodynamic Modelling based on Planar Imaging, and Medical Image Format Conversion. The use of jAMIT and Sushi for scripting and analysis in Matlab v6.1 and Jython is currently being explored. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  7. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
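
    As an illustration of the kind of network analysis the plugin architecture delegates to scientific Python libraries, the sketch below builds a toy weighted "connectome" with NetworkX and computes two standard graph measures; it does not use the Connectome File Format or the toolkit's own API, and the region names are invented.

    ```python
    import networkx as nx

    # Toy weighted connectome graph; nodes stand in for cortical regions.
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("lh-precentral", "lh-postcentral", 0.9),
        ("lh-precentral", "lh-superiorfrontal", 0.4),
        ("lh-postcentral", "lh-superiorparietal", 0.7),
        ("lh-superiorfrontal", "lh-superiorparietal", 0.2),
    ])

    print("node degrees:", dict(G.degree()))
    # Weighted betweenness centrality (edge weights treated as distances).
    print("betweenness:", nx.betweenness_centrality(G, weight="weight"))
    ```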

  8. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Full Text Available Abstract Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  9. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  10. Integrated Systems Health Management (ISHM) Toolkit

    Science.gov (United States)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  11. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to lesser-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the use of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated into experimental software frameworks. This Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation.
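
    The same families of goodness-of-fit tests named above are available in SciPy, which makes for a convenient illustration on toy data; the sketch below is generic Python and is not the Toolkit's own interface.

    ```python
    import numpy as np
    from scipy import stats

    # Goodness-of-fit tests of the kinds listed in the record, run on toy data.
    rng = np.random.default_rng(1)
    sample = rng.normal(size=200)

    ks = stats.kstest(sample, "norm")            # Kolmogorov-Smirnov vs N(0, 1)
    ad = stats.anderson(sample, dist="norm")     # Anderson-Darling

    counts, edges = np.histogram(sample, bins=10)
    expected = len(sample) * np.diff(stats.norm.cdf(edges))
    expected *= counts.sum() / expected.sum()    # rescale so totals match exactly
    chi2 = stats.chisquare(counts, expected)     # Chi-squared on binned data

    print("KS:", ks.statistic, ks.pvalue)
    print("AD statistic:", ad.statistic)
    print("Chi2:", chi2.statistic, chi2.pvalue)
    ```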

  12. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    Science.gov (United States)

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. Access the

  13. TRSkit: A Simple Digital Library Toolkit

    Science.gov (United States)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  14. Measuring employee satisfaction in new offices - the WODI toolkit

    NARCIS (Netherlands)

    Maarleveld, M.; Volker, L.; van der Voordt, Theo

    2009-01-01

    Purpose: This paper presents a toolkit to measure employee satisfaction and perceived labour productivity as affected by different workplace strategies. The toolkit is being illustrated by a case study of the Dutch Revenue Service.
    Methodology: The toolkit has been developed by a review of

  15. Wind Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov (United States)

    The Wind Integration National Dataset (WIND) Toolkit is an update and expansion of the Eastern Wind Integration Data Set and Western Wind Integration Data Set. It supports the next generation of wind integration studies.

  16. Assessing the wider environmental value of remediating land contamination

    NARCIS (Netherlands)

    Bardos, R.P.; Kearney, T.E.; Nathanail, C.P.; Weenk, A.; Martin, I.D.

    2000-01-01

    The aim of this paper is to consider qualitative and quantitative approaches for assessing the wider environmental value of remediating land contamination. In terms of the environmental element of sustainable development, a remediation project's overall environmental performance is the sum of the

  17. Strengthening Coastal Pollution Management in the Wider Caribbean Region

    NARCIS (Netherlands)

    Lavieren, van H.; Metcalfe, C.D.; Drouillard, K.; Sale, P.; Gold-Bouchot, G.; Reid, R.; Vermeulen, L.C.

    2011-01-01

    Control of aquatic pollution is critical for improving coastal zone management and for the conservation of fisheries resources. Countries in the Wider Caribbean Region (WCR) generally lack monitoring capacity and do not have reliable information on the levels and distribution of pollutants,

  18. Wider Opportunities for Women Nontraditional Work Programs: A Guide.

    Science.gov (United States)

    Wider Opportunities for Women, Inc., Washington, DC.

    Since 1970, Wider Opportunities for Women (WOW), in Washington, D.C., has conducted programs to train and place disadvantaged women in nontraditional jobs. The results have been record-breaking: high placement rates, high job retention rates, good starting salaries, and upward mobility for women who seemed doomed to a life of poverty and…

  19. Marine Debris and Plastic Source Reduction Toolkit

    Science.gov (United States)

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  20. Integrated System Health Management Development Toolkit

    Science.gov (United States)

    Figueroa, Jorge; Smith, Harvey; Morris, Jon

    2009-01-01

    This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.

  1. A toolkit for promoting healthy ageing

    NARCIS (Netherlands)

    Jeroen Knevel; Aly Gruppen

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience,

  2. A Toolkit of Systems Gaming Techniques

    Science.gov (United States)

    Finnigan, David; McCaughey, Jamie W.

    2017-04-01

    Decision-makers facing natural hazard crises need a broad set of cognitive tools to help them grapple with complexity. Systems gaming can act as a kind of 'flight simulator for decision making', enabling us to step through real life complex scenarios of the kind that beset us in natural disaster situations. Australian science-theatre ensemble Boho Interactive is collaborating with the Earth Observatory Singapore to develop an in-person systems game modelling an unfolding natural hazard crisis (volcanic unrest or an approaching typhoon) impacting an Asian city. Through a combination of interactive mechanisms drawn from boardgaming and participatory theatre, players will make decisions and assign resources in response to the unfolding crisis. In this performance, David Finnigan from Boho will demonstrate some of the participatory techniques that Boho use to illustrate key concepts from complex systems science. These activities are part of a toolkit which can be adapted to fit a range of different contexts and scenarios. In this session, David will present short activities that demonstrate a range of systems principles including common-pool resource challenges (the Tragedy of the Commons), interconnectivity, unintended consequences, tipping points and phase transitions, and resilience. The interactive mechanisms for these games are all deliberately lo-fi rather than digital, for three reasons. First, the experience of a tactile, hands-on game is more immediate and engaging. It brings the focus of the participants into the room and facilitates engagement with the concepts and with each other, rather than with individual devices. Second, the mechanics of the game are laid bare. This is a valuable way to illustrate that complex systems are all around us, and are not merely the domain of hi-tech systems. Finally, these games can be used in a wide variety of contexts by removing computer hardware requirements and instead using materials and resources that are easily found in

  3. Setting live coding performance in wider historical contexts

    OpenAIRE

    Norman, Sally Jane

    2016-01-01

    This paper sets live coding in the wider context of performing arts, construed as the poetic modelling and projection of liveness. Concepts of liveness are multiple, evolving, and scale-dependent: entities considered live from different cultural perspectives range from individual organisms and social groupings to entire ecosystems, and consequently reflect diverse temporal and spatial orders. Concepts of liveness moreover evolve with our tools, which generate and reveal new senses and places ...

  4. Status of the petroleum pollution in the Wider Caribbean Sea

    Energy Technology Data Exchange (ETDEWEB)

    Botello, Alfonso V; Villanueva F, Susana [Universidad Nacional Autonoma de Mexico, Mexico City (Mexico). Inst. de Ciencias del Mar y Limnologia

    1996-07-01

    In 1976, the IOC-UNESCO and UNEP convened a meeting in Port of Spain to analyze the marine pollution problems in the region, noted that petroleum pollution was of region-wide concern, and recommended initiating a research and monitoring program to determine the severity of the problem and monitor its effects. Currently, the Wider Caribbean is potentially one of the largest oil producing areas in the world. Major production sites include Louisiana and Texas, USA; the Bay of Campeche, Mexico; Lake Maracaibo, Venezuela; and the Gulf of Paria, Trinidad, all of which are classified as production accident high-risk zones. The main sources of petroleum pollution in the Wider Caribbean are production, exploitation, transportation, urban and municipal discharges, refining and chemical wastes, normal loading operations and accidental spills. About 5 million barrels are transported daily in the Caribbean, generating intense tanker traffic. It has been estimated that oil discharges from tank washings within the Wider Caribbean could be as high as 7 million barrels/year. The results of the CARIPOL Regional Programme conducted between 1980-1987 showed that significant levels of petroleum pollution exist throughout the Wider Caribbean, including serious tar contamination of windward exposed beaches, high levels of floating tar within the major current systems and very high levels of dissolved/dispersed hydrocarbons in surface waters. Major effects of this petroleum pollution include: high tar levels on many beaches that either prevent recreational use or require very expensive clean-up operations, distress and death to marine life, and responses in the enzyme systems of marine organisms that have been correlated with declines in reproductive success. Finally, the presence of polycyclic aromatic hydrocarbons, with their potential carcinogenic effects, has been reported in the tissues of economically important species. (author)

  5. Status of the petroleum pollution in the Wider Caribbean Sea

    International Nuclear Information System (INIS)

    Botello, Alfonso V.; Villanueva F, Susana

    1996-01-01

    In 1976, the IOC-UNESCO and UNEP convened a meeting in Port of Spain to analyze the marine pollution problems in the region, noted that petroleum pollution was of region-wide concern, and recommended initiating a research and monitoring program to determine the severity of the problem and monitor its effects. Currently, the Wider Caribbean is potentially one of the largest oil producing areas in the world. Major production sites include Louisiana and Texas, USA; the Bay of Campeche, Mexico; Lake Maracaibo, Venezuela; and the Gulf of Paria, Trinidad, all of which are classified as production accident high-risk zones. The main sources of petroleum pollution in the Wider Caribbean are production, exploitation, transportation, urban and municipal discharges, refining and chemical wastes, normal loading operations and accidental spills. About 5 million barrels are transported daily in the Caribbean, generating intense tanker traffic. It has been estimated that oil discharges from tank washings within the Wider Caribbean could be as high as 7 million barrels/year. The results of the CARIPOL Regional Programme conducted between 1980-1987 showed that significant levels of petroleum pollution exist throughout the Wider Caribbean, including serious tar contamination of windward exposed beaches, high levels of floating tar within the major current systems and very high levels of dissolved/dispersed hydrocarbons in surface waters. Major effects of this petroleum pollution include: high tar levels on many beaches that either prevent recreational use or require very expensive clean-up operations, distress and death to marine life, and responses in the enzyme systems of marine organisms that have been correlated with declines in reproductive success. Finally, the presence of polycyclic aromatic hydrocarbons, with their potential carcinogenic effects, has been reported in the tissues of economically important species. (author)

  6. Communicating space weather to policymakers and the wider public

    Science.gov (United States)

    Ferreira, Bárbara

    2014-05-01

    As a natural hazard, space weather has the potential to affect space- and ground-based technological systems and cause harm to human health. As such, it is important to properly communicate this topic to policymakers and the general public alike, informing them (without being unnecessarily alarmist) about the potential impact of space-weather phenomena and how these can be monitored and mitigated. On the other hand, space weather is related to interesting phenomena on the Sun such as coronal-mass ejections, and incorporates one of the most beautiful displays in the Earth and its nearby space environment: aurora. These exciting and fascinating aspects of space weather should be cultivated when communicating this topic to the wider public, particularly to younger audiences. Researchers have a key role to play in communicating space weather to both policymakers and the wider public. Space scientists should have an active role in informing policy decisions on space-weather monitoring and forecasting, for example. And they can exercise their communication skills by talking about space weather to school children and the public in general. This presentation will focus on ways to communicate space weather to wider audiences, particularly policymakers. It will also address the role researchers can play in this activity to help bridge the gap between the space science community and the public.

  7. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    International Nuclear Information System (INIS)

    Rit, S; Vila Oliva, M; Sarrut, D; Brousmiche, S; Labarbe, R; Sharp, G C

    2014-01-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is checked daily with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  8. Sustainability rating tools for buildings and its wider application

    Directory of Open Access Journals (Sweden)

    Siew Renard

    2017-01-01

    Full Text Available This paper provides a commentary on the latest research in measuring the sustainability of buildings and its wider application. The emergence of sustainability rating tools (SRTs) has faced critique from scholars due to deficiencies such as the overemphasis on environmental criteria, the neglect of uncertainty in scoring, and the existence of non-scientific criteria benchmarks, among others. This may have contributed to the mixed evidence in the literature on the benefits of SRTs. Future research directions are proposed to advance the state of the art in this field.

  9. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT

    Directory of Open Access Journals (Sweden)

    Mair Frances

    2010-10-01

    Full Text Available Background The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well documented problems with implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT), which aims to summarise and synthesise new and existing research on implementation of e-Health initiatives, and present it to senior managers in a user-friendly format. Results The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to implementation of e-Health initiatives with qualitative data derived from interviews of "implementers", that is, people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit - a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls Conclusions The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled, and to determine its predictive potential for future implementations.

  10. Opportunities and challenges in the wider adoption of liver and interconnected microphysiological systems.

    Science.gov (United States)

    Hughes, David J; Kostrzewski, Tomasz; Sceats, Emma L

    2017-10-01

    Liver disease represents a growing global health burden. The development of in vitro liver models which allow the study of disease and the prediction of metabolism and drug-induced liver injury in humans remains a challenge. The maintenance of functional primary hepatocyte cultures, the parenchymal cells of the liver, has historically been difficult, with dedifferentiation and the consequent loss of hepatic function limiting utility. The desire for longer-term functional liver cultures sparked the development of numerous systems, including collagen sandwiches, spheroids, micropatterned co-cultures and liver microphysiological systems. This review will focus on liver microphysiological systems, often referred to as liver-on-a-chip, and broaden to include platforms with interconnected microphysiological systems or multi-organ-chips. The interconnection of microphysiological systems presents the opportunity to explore system level effects, investigate organ cross talk, and address questions which were previously the preserve of animal experimentation. As a field, microphysiological systems have reached a level of maturity suitable for commercialization and consequent evaluation by a wider community of users, in academia and the pharmaceutical industry. Here scientific, operational, and organizational considerations relevant to the wider adoption of microphysiological systems will be discussed. Applications in which microphysiological systems might offer unique scientific insights or enable studies currently feasible only with animal models are described, and challenges which might be addressed to enable wider adoption of the technologies are highlighted. A path forward which envisions the development of microphysiological systems in partnerships between academia, vendors and industry is proposed. Impact statement Microphysiological systems are in vitro models of human tissues and organs. These systems have advanced rapidly in recent years and are now being

  11. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Directory of Open Access Journals (Sweden)

    Andrea Bellucci

    2017-02-01

    Full Text Available Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  12. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    The Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. VO architecture greatly depends on Grid and Web services; consequently, the general VO integration route is "Java Ready - Grid Ready - VO Ready". In this paper, we discuss the importance of VO integration for existing toolkits and possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.

  13. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.
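
    For readers unfamiliar with the technique, a bare-bones Self-Organizing Map can be trained in a few dozen lines of NumPy; the sketch below is a generic illustration of the algorithm named in the record, not Sandia's ThreatView or toolkit code, and the random vectors stand in for document features.

    ```python
    import numpy as np

    # Minimal Self-Organizing Map (SOM) trained on random 3-D feature vectors.
    rng = np.random.default_rng(0)
    data = rng.random((500, 3))                    # stand-in feature vectors

    rows, cols, dim = 10, 10, 3
    weights = rng.random((rows, cols, dim))        # grid of prototype vectors
    grid_y, grid_x = np.mgrid[0:rows, 0:cols]

    n_iter, sigma0, lr0 = 2000, 3.0, 0.5
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the closest prototype on the map.
        dist = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(dist), dist.shape)
        # Linearly decaying neighbourhood radius and learning rate.
        frac = t / n_iter
        sigma = sigma0 * (1.0 - frac) + 0.5
        lr = lr0 * (1.0 - frac) + 0.01
        h = np.exp(-((grid_y - by) ** 2 + (grid_x - bx) ** 2) / (2 * sigma**2))
        weights += lr * h[..., None] * (x - weights)

    print("trained SOM prototype grid shape:", weights.shape)
    ```

    Similar feature vectors end up mapped to nearby grid cells, which is what makes the map useful for visually spotting unusual clusters in large text collections.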

  14. Texas Team: Academic Progression and IOM Toolkit.

    Science.gov (United States)

    Reid, Helen; Tart, Kathryn; Tietze, Mari; Joseph, Nitha Mathew; Easley, Carson

    The Institute of Medicine (IOM) Future of Nursing report identified eight recommendations for nursing to improve health care for all Americans. The Texas Team for Advancing Health Through Nursing embraced the challenge of implementing the recommendations through two diverse projects. One group conducted a broad, online survey of leadership, practice, and academia, focusing on the IOM recommendations. The other focused specifically on academic progression through the use of CABNET (Consortium for Advancing Baccalaureate Nursing Education in Texas) articulation agreements. The survey revealed a lack of knowledge and understanding of the IOM recommendations, prompting development of an online IOM toolkit. The articulation agreements provide a clear pathway for students to the RN-to-BSN degree. The toolkit and articulation agreements provide rich resources for implementation of the IOM recommendations.

  15. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  16. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  17. A toolkit for promoting healthy ageing

    OpenAIRE

    Knevel, Jeroen; Gruppen, Aly

    2016-01-01

    This toolkit therefore focusses on self-management abilities. That means finding and maintaining effective, positive coping methods in relation to our health. We included many common and frequently discussed topics such as drinking, eating, physical exercise, believing in the future, resilience, preventing loneliness and social participation. Besides some concise background information, we offer you a great diversity of exercises per theme which can help you discuss, assess, change or strengt...

  18. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  19. Computational Chemistry Toolkit for Energetic Materials Design

    Science.gov (United States)

    2006-11-01

    ...industry are aggressively engaged in efforts to develop multiscale modeling and simulation methodologies to model and analyze complex phenomena across...energetic materials design. It is hoped that this toolkit will evolve into a collection of well-integrated multiscale modeling methodologies... [Table fragment: experimental and theoretical values reported alongside this work for 1,5-diamino-4-methyl-tetrazolium nitrate and 1,5-diamino-4-methyl-tetrazolium azide.]

  20. chemf: A purely functional chemistry toolkit.

    Science.gov (United States)

    Höck, Stefan; Riedl, Rainer

    2012-12-20

    Although programming in a type-safe and referentially transparent style offers several advantages over working with mutable data structures and side effects, this style of programming has not seen much use in chemistry-related software. Since functional programming languages were designed with referential transparency in mind, these languages offer a lot of support when writing immutable data structures and side-effects free code. We therefore started implementing our own toolkit based on the above programming paradigms in a modern, versatile programming language. We present our initial results with functional programming in chemistry by first describing an immutable data structure for molecular graphs together with a couple of simple algorithms to calculate basic molecular properties before writing a complete SMILES parser in accordance with the OpenSMILES specification. Along the way we show how to deal with input validation, error handling, bulk operations, and parallelization in a purely functional way. At the end we also analyze and improve our algorithms and data structures in terms of performance and compare it to existing toolkits both object-oriented and purely functional. All code was written in Scala, a modern multi-paradigm programming language with a strong support for functional programming and a highly sophisticated type system. We have successfully made the first important steps towards a purely functional chemistry toolkit. The data structures and algorithms presented in this article perform well while at the same time they can be safely used in parallelized applications, such as computer aided drug design experiments, without further adjustments. This stands in contrast to existing object-oriented toolkits where thread safety of data structures and algorithms is a deliberate design decision that can be hard to implement. Finally, the level of type-safety achieved by Scala highly increased the reliability of our code as well as the productivity of
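
    The original toolkit is written in Scala; the sketch below conveys the same idea of an immutable molecular graph with a simple derived property, using Python frozen dataclasses purely for illustration.

    ```python
    from dataclasses import dataclass

    # Immutable molecular graph in the spirit of the record (the real toolkit
    # is Scala); hydrogens are implicit and omitted for brevity.
    @dataclass(frozen=True)
    class Atom:
        symbol: str

    @dataclass(frozen=True)
    class Bond:
        a: int        # index of first atom
        b: int        # index of second atom
        order: int

    @dataclass(frozen=True)
    class Molecule:
        atoms: tuple  # tuple of Atom
        bonds: tuple  # tuple of Bond

        def formula(self) -> dict:
            counts = {}
            for atom in self.atoms:
                counts[atom.symbol] = counts.get(atom.symbol, 0) + 1
            return counts

    # Ethanol heavy-atom skeleton: C-C-O
    ethanol = Molecule(
        atoms=(Atom("C"), Atom("C"), Atom("O")),
        bonds=(Bond(0, 1, 1), Bond(1, 2, 1)),
    )
    print(ethanol.formula())       # {'C': 2, 'O': 1}
    ```

    Because every object is frozen, a "modified" molecule is always a new value, which is what makes this style safe to use from parallel code without locking.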

  1. VIDE: The Void IDentification and Examination toolkit

    Science.gov (United States)

    Sutter, P. M.; Lavaux, G.; Hamaus, N.; Pisani, A.; Wandelt, B. D.; Warren, M.; Villaescusa-Navarro, F.; Zivick, P.; Mao, Q.; Thompson, B. B.

    2015-03-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a substantially enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and performing a watershed transform to construct voids. Additionally, VIDE provides significant functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE uses the watershed levels to place voids in a hierarchical tree, outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysis tasks, such as loading and manipulating void catalogs and particle members, filtering, plotting, computing clustering statistics, stacking, comparing catalogs, and fitting density profiles. While centered around ZOBOV, the toolkit is designed to be as modular as possible and accommodate other void finders. VIDE has been in development for several years and has already been used to produce a wealth of results, which we summarize in this work to highlight the capabilities of the toolkit. VIDE is publicly available at http://bitbucket.org/cosmicvoids/vide_public and http://www.cosmicvoids.net.
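
    The Voronoi-based density estimate at the heart of ZOBOV/VIDE can be illustrated in two dimensions with SciPy; the sketch below is a conceptual toy, not the VIDE Python API, and the void "candidate" is simply the lowest-density tracer.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull, Voronoi

    # 2-D illustration of the Voronoi density estimate: each tracer's local
    # density is the inverse of its Voronoi cell area; voids grow from the
    # lowest-density cells.
    rng = np.random.default_rng(2)
    points = rng.random((300, 2))
    vor = Voronoi(points)

    densities = np.full(len(points), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:       # skip unbounded border cells
            continue
        cell = vor.vertices[region]
        densities[i] = 1.0 / ConvexHull(cell).volume   # "volume" is area in 2-D

    candidate = np.nanargmin(densities)            # lowest-density tracer
    print("void candidate at", points[candidate], "density", densities[candidate])
    ```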

  2. Network Science Research Laboratory (NSRL) Discrete Event Toolkit

    Science.gov (United States)

    2016-01-01

    ARL-TR-7579, January 2016. US Army Research Laboratory, Computational and Information Sciences Directorate. Network Science Research Laboratory (NSRL) Discrete Event Toolkit, by Theron Trout and Andrew J Toth.

  3. A Python Geospatial Language Toolkit

    Science.gov (United States)

    Fillmore, D.; Pletzer, A.; Galloy, M.

    2012-12-01

    The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership satellite (NPP) for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
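
    A toy version of the phrase-to-operation mapping described here fits in a few lines; the grammar, region names, and operations below are hypothetical and are not drawn from the actual module.

    ```python
    import re

    # Hypothetical regions as lon/lat polygons; areas are in square degrees.
    REGIONS = {
        "lake square": [(0, 0), (1, 0), (1, 1), (0, 1)],
        "delta triangle": [(0, 0), (2, 0), (0, 2)],
    }

    def shoelace_area(poly):
        """Polygon area via the shoelace formula."""
        s = 0.0
        for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
            s += x1 * y2 - x2 * y1
        return abs(s) / 2.0

    OPERATIONS = {"area": shoelace_area, "vertices": len}

    def interpret(phrase):
        """Map e.g. 'area of delta triangle' onto an operation and a region."""
        m = re.match(r"(\w+) of (.+)", phrase.strip().lower())
        if not m or m.group(1) not in OPERATIONS or m.group(2) not in REGIONS:
            raise ValueError(f"cannot interpret: {phrase!r}")
        return OPERATIONS[m.group(1)](REGIONS[m.group(2)])

    print(interpret("area of delta triangle"))    # 2.0
    print(interpret("vertices of lake square"))   # 4
    ```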

  4. The Wider Implications of Business-model Research

    DEFF Research Database (Denmark)

    Ritter, Thomas; Lettl, Christopher

    2018-01-01

    Business-model research has struggled to develop a clear footprint in the strategic management field. This introduction to the special issue on the wider implications of business-model research argues that part of this struggle relates to the application of five different perspectives on the term “business model,” which creates ambiguity about the conceptual boundaries of business models, the applied terminology, and the potential contributions of business-model research to strategic management literature. By explicitly distinguishing among these five perspectives and by aligning them into one overarching, comprehensive framework, this paper offers a foundation for consolidating business-model research. Furthermore, we explore the connections between business-model research and prominent theories in strategic management. We conclude that business-model research is not necessarily a “theory on its...

  5. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Full Text Available Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality check and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit is comprised of user-friendly tools for QC of sequencing data generated using Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate the QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.
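
    The core quality-filtering step can be illustrated with a few lines of plain Python; the sketch below is a toy stand-in for the Perl toolkit and assumes the standard Sanger/Illumina 1.8+ Phred+33 quality encoding.

    ```python
    import io

    # Toy FASTQ quality filter: keep reads whose mean Phred score is >= 20.
    def read_fastq(handle):
        """Yield (header, sequence, quality) triples from a FASTQ stream."""
        while True:
            header = handle.readline().rstrip()
            if not header:
                return
            seq = handle.readline().rstrip()
            handle.readline()                      # '+' separator line
            qual = handle.readline().rstrip()
            yield header, seq, qual

    def mean_phred(qual, offset=33):
        """Mean Phred score of a Phred+33-encoded quality string."""
        return sum(ord(c) - offset for c in qual) / len(qual)

    toy = io.StringIO(
        "@read1\nACGTACGT\n+\nIIIIIIII\n"
        "@read2\nACGTACGT\n+\n!!!!!!!!\n"
    )
    kept = [(h, s) for h, s, q in read_fastq(toy) if mean_phred(q) >= 20]
    print(kept)                                    # only read1 passes
    ```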

  6. Graph algorithms in the titan toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  7. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice which contains octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.
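
    The following toy Python sketch conveys what a turn-by-turn phase space tracker does; it is not based on the beamline or MXYZPTLK libraries, and the tune and octupole strength are arbitrary assumed values.

        import math

        def one_turn(x, xp, tune=0.205, k_oct=-1.0):
            """Linear rotation through the betatron phase advance, then a thin octupole-like kick."""
            c, s = math.cos(2 * math.pi * tune), math.sin(2 * math.pi * tune)
            x, xp = c * x + s * xp, -s * x + c * xp
            xp += k_oct * x**3   # cubic nonlinearity
            return x, xp

        def track(x0, xp0, turns=1000):
            pts = [(x0, xp0)]
            for _ in range(turns):
                pts.append(one_turn(*pts[-1]))
            return pts

        orbit = track(0.1, 0.0)
        x5, xp5 = orbit[5]   # crude periodicity probe near the fifth-order resonance
        print("distance from start after 5 turns:", math.hypot(x5 - 0.1, xp5))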

  8. Tips from the toolkit: 1 - know yourself.

    Science.gov (United States)

    Steer, Neville

    2010-01-01

    High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  9. OGSA Globus Toolkits evaluation activity at CERN

    CERN Document Server

    Chen, D; Foster, D; Kalyaev, V; Kryukov, A; Lamanna, M; Pose, V; Rocha, R; Wang, C

    2004-01-01

    An Open Grid Service Architecture (OGSA) Globus Toolkit 3 (GT3) evaluation group has been active at CERN since GT3 became available in an early beta version (Spring 2003). This activity focuses on the evaluation of the technology as promised by the OGSA/OGSI paradigm and on GT3 in particular. The goal is to study this new technology and its implications in order to provide useful input for the large grid initiatives active in the LHC Computing Grid (LCG) project. A particular effort has been devoted to investigating performance and deployment issues, having in mind the LCG requirements, in particular scalability and robustness.

  10. Livermore Big Artificial Neural Network Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-01

    LBANN is a toolkit that is designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key High Performance Computing features to accelerate neural network training. Specifically, it is optimized for low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library that is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  11. GENFIT - a generic track-fitting toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Johannes [Technische Universitaet Muenchen (Germany); Schlueter, Tobias [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2014-07-01

    GENFIT is an experiment-independent track-fitting toolkit, which combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation and alignment.
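
    For readers unfamiliar with the fitting step such a toolkit iterates along a track, the following generic one-dimensional Kalman update (textbook form, not the GENFIT implementation) shows how successive hits refine a state estimate and shrink its covariance.

        import numpy as np

        def kalman_update(state, cov, measurement, meas_var, H=np.array([[1.0]])):
            """Combine a state estimate with one measurement (update step only)."""
            S = H @ cov @ H.T + meas_var          # innovation covariance
            K = cov @ H.T @ np.linalg.inv(S)      # Kalman gain
            residual = measurement - H @ state
            new_state = state + K @ residual
            new_cov = (np.eye(len(state)) - K @ H) @ cov
            return new_state, new_cov

        state = np.array([0.0])        # initial position estimate
        cov = np.array([[1.0]])        # large initial uncertainty
        for hit in [0.9, 1.1, 1.0]:    # three noisy hits with variance 0.1
            state, cov = kalman_update(state, cov, np.array([hit]), np.array([[0.1]]))
        print(state, cov)              # estimate near 1.0 with reduced variance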

  12. National eHealth strategy toolkit

    CERN Document Server

    2012-01-01

    Worldwide, the application of information and communication technologies to support national health-care services is rapidly expanding and increasingly important. This is especially so at a time when all health systems face stringent economic challenges and greater demands to provide more and better care, especially to those most in need. The National eHealth Strategy Toolkit is an expert practical guide that provides governments, their ministries and stakeholders with a solid foundation and method for the development and implementation of a national eHealth vision, action plan and monitoring fram

  13. Communities and Spontaneous Urban Planning: A Toolkit for Urban ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    State-led urban planning is often absent, which creates unsustainable environments and hinders the integration of migrants. Communities' prospects of ... This toolkit is expected to be a viable alternative for planning urban expansion wherever it cannot be carried out through traditional means. The toolkit will be tested in ...

  14. Microsoft BizTalk ESB Toolkit 2.1

    CERN Document Server

    Benito, Andrés Del Río

    2013-01-01

    A practical guide into the architecture and features that make up the services and components of the ESB Toolkit. This book is for experienced BizTalk developers, administrators, and architects, as well as IT managers and BizTalk business analysts. Knowledge and experience with the Toolkit is not a requirement.

  15. Design-based learning in classrooms using playful digital toolkits

    NARCIS (Netherlands)

    Scheltenaar, K.J.; van der Poel, J.E.C.; Bekker, Tilde

    2015-01-01

    The goal of this paper is to explore how to implement Design Based Learning (DBL) with digital toolkits to teach 21st century skills in (Dutch) schools. It describes the outcomes of a literature study and two design case studies in which such a DBL approach with digital toolkits was iteratively

  16. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    Science.gov (United States)

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan, on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  17. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  18. Saharan Rock Art: Local Dynamics and Wider Perspectives

    Directory of Open Access Journals (Sweden)

    Marina Gallinaro

    2013-12-01

    Full Text Available Rock art is the best-known evidence of the fragile Saharan heritage. Thousands of engraved and painted artworks dot boulders and cliffs in open-air sites, as well as the rock walls of rockshelters and caves located in the main massifs. Since its pioneering discovery in the late 19th century, rock art has captured the imagination of travellers and scholars, representing for a long time the main aim of research in the area. The chronology, meaning and connections between the different recognized artistic provinces are still to be fully understood. The central massifs, and in particular the "cultural province" encompassing Tadrart Acacus and Tassili n’Ajjer, played and still play a key role in this scenario. Recent analytical and contextual analyses of rock art contexts seem to open new perspectives. Tadrart Acacus, for the richness and variability of its artworks, for the wealth of archaeological data known, and for its proximity to other important areas with rock art (Tassili n’Ajjer, Algerian Tadrart and Messak massifs), is an ideal context in which to analyze the artworks in their environmental and social-cultural context, and to define connections between local cultural dynamics and wider regional perspectives.

  19. Energy retrofit analysis toolkits for commercial buildings: A review

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Hong, Tianzhen; Piette, Mary Ann; Taylor-Lange, Sarah C.

    2015-01-01

    Retrofit analysis toolkits can be used to optimize energy or cost savings from retrofit strategies, accelerating the adoption of ECMs (energy conservation measures) in buildings. This paper provides an up-to-date review of the features and capabilities of 18 energy retrofit toolkits, including ECMs and the calculation engines. The fidelity of the calculation techniques, a driving component of retrofit toolkits, was evaluated. An evaluation of the issues that hinder effective retrofit analysis in terms of accessibility, usability, data requirements, and the application of efficiency measures provides valuable insights for advancing the field. From this review, several general observations emerged: (1) toolkits developed primarily in the private sector use empirically data-driven methods or benchmarking to provide ease of use, (2) almost all of the toolkits that used EnergyPlus or DOE-2 were freely accessible, but suffered from complexity, longer data input and simulation run time, (3) in general, there appeared to be a fine line between having too much detail, resulting in a long analysis time, or too little detail, which sacrificed modeling fidelity. These insights provide an opportunity to enhance the design and development of existing and new retrofit toolkits in the future. - Highlights: • Retrofit analysis toolkits can accelerate the adoption of energy efficiency measures. • A comprehensive review of 19 retrofit analysis toolkits was conducted. • Retrofit toolkits have diverse features, data requirements and computing methods. • Empirical data-driven, normative and detailed energy modeling methods are used. • Identified immediate areas for improvement for retrofit analysis toolkits.

  20. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing software packages require specifically formatted input files and generate output files of various types, yielding practical inconvenience. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference software packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT makes it convenient to run multiple local ancestry inference packages. In addition, we evaluated the performance of the supported software packages, mainly focusing on inference accuracy and the computational resources used. We provided a toolkit to facilitate the use of local ancestry inference software, especially for users with a limited bioinformatics background.

  1. The Best Ever Alarm System Toolkit

    International Nuclear Information System (INIS)

    Kasemir, Kay; Chen, Xihui; Danilova, Ekaterina N.

    2009-01-01

    Learning from our experience with the standard Experimental Physics and Industrial Control System (EPICS) alarm handler (ALH) as well as a similar intermediate approach based on script-generated operator screens, we developed the Best Ever Alarm System Toolkit (BEAST). It is based on Java and Eclipse on the Control System Studio (CSS) platform, using a relational database (RDB) to store the configuration and log actions. It employs a Java Message Service (JMS) for communication between the modular pieces of the toolkit, which include an Alarm Server to maintain the current alarm state, an arbitrary number of Alarm Client user interfaces (GUI), and tools to annunciate alarms or log alarm related actions. Web reports allow us to monitor the alarm system performance and spot deficiencies in the alarm configuration. The Alarm Client GUI not only gives the end users various ways to view alarms in tree and table, but also makes it easy to access the guidance information, the related operator displays and other CSS tools. It also allows online configuration to be simply modified from the GUI. Coupled with a good 'alarm philosophy' on how to provide useful alarms, we can finally improve the configuration to achieve an effective alarm system.

  2. Applications toolkit for accelerator control and analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1997-01-01

    The Advanced Photon Source (APS) has taken a unique approach to creating high-level software applications for accelerator operation and analysis. The approach is based on self-describing data, modular program toolkits, and scripts. Self-describing data provide a communication standard that aids the creation of modular program toolkits by allowing compliant programs to be used in essentially arbitrary combinations. These modular programs can be used as part of an arbitrary number of high-level applications. At APS, a group of about 70 data analysis, manipulation, and display tools is used in concert with about 20 control-system-specific tools to implement applications for commissioning and operations. High-level applications are created using scripts, which are relatively simple interpreted programs. The Tcl/Tk script language is used, allowing the creation of graphical user interfaces (GUIs) and a library of algorithms that are separate from the interface. This last factor allows greater automation of control by making it easy to take the human out of the loop. Applications of this methodology to operational tasks such as orbit correction, configuration management, and data review will be discussed.

  3. IN MY OPINION: Physics in the wider context

    Science.gov (United States)

    Morris, Andrew

    1999-11-01

    and progression opportunities for science specialists, whilst ensuring that the general public are scientifically literate. I think physics education has a serious contribution to make to all sections of society: the specialist, preparing for and progressing in a scientific/technological career; the skilled worker, analysing, understanding and innovating in any occupation; the citizen coping with increasing complexity in society; and the individual trying to understand the world into which they were born. To continue improving our educational systems and to assist each of these groups demands a grand alliance of people involved in physics education. Reflecting first on the wider context can help us choose appropriate points at which to intervene. Otherwise, educational improvement may be hampered, with valuable effort expended on positive reform actions rendered useless by constraints elsewhere in the system. How has the subject and its place in the curriculum evolved? What can be learned from previous curriculum innovations? What do public perceptions of physics tell us? The aim of the fifth Shaping the Future booklet is to encourage debate about where reform efforts should best be directed. Contributors will include Steve Adams, Michael Barnett, Sheila Carlton, John Berkeley, Martin Hollins, Marilyn Holyoake, Andrew Hunt, Roland Jackson, Jon Ogborn, Russell Stannard and Charles Thomas. A Discussion Meeting based on Physics in a wider context, at the ASE Annual Meeting, Leeds, promises to be lively. I hope you will come and express your views! If you would like to attend the meeting, to be held on 7 January 2000, and be sent a free copy of the manuscript for the 48-page booklet in advance, please contact: Ingrid Ebeyer, Post-16 Initiative, Institute of Physics, 76 Portland Place, London W1N 3DH (e-mail: 16-19project@iop.org)

  4. The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results

    Science.gov (United States)

    Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee

    2016-01-01

    Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. In this poster, I describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well. This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.
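
    As a hedged illustration of the kind of spectral index analysis such a toolkit automates (plain numpy on a synthetic spectrum, not the SPLAT API), an index can be defined as the ratio of median fluxes in two wavelength windows; the band limits below are arbitrary assumptions.

        import numpy as np

        def spectral_index(wave, flux, band_num, band_den):
            """Ratio of median flux in band_num to median flux in band_den (wavelengths in microns)."""
            num = np.median(flux[(wave >= band_num[0]) & (wave <= band_num[1])])
            den = np.median(flux[(wave >= band_den[0]) & (wave <= band_den[1])])
            return num / den

        # Synthetic low-resolution spectrum covering 0.9-2.4 microns.
        wave = np.linspace(0.9, 2.4, 300)
        flux = np.exp(-(wave - 1.27)**2 / 0.05) + 0.2
        print(spectral_index(wave, flux, (1.24, 1.30), (1.50, 1.56)))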

  5. Engaging wider publics with studying and protecting the ocean

    Science.gov (United States)

    Nauen, Cornelia E.

    2015-04-01

    The ocean is dying. The vast scientific literature diagnoses massive reductions in the biomass of fish and invertebrates from overfishing, increasing destruction of coral ecosystems in the tropics from climate change, extensive dead zones from eutrophication and collapse of marine bird populations from ingesting plastic. Even though Darwin already suspected as much, the scale is becoming apparent only from meta-analyses at regional or even global scales, as individual studies tend to focus on one fishery, one type of organism or one geographic location. In combination with deep-rooted perceptions of the vastness of the ocean, the changes are difficult to comprehend for specialists and the general public alike. Even though more than half of humanity is estimated to live in coastal zones as defined by some, urbanisation is removing about half from regular, more direct exposure. Yet there is much still to be explored, not only in the deep, little-studied parts. The ocean exercises great fascination on many people, heightened since the period of discovery and the mystery of far-flung places, but the days when Darwin's research results were regularly discussed in public spaces are gone. Rachel Carson's prize-winning and best-selling book "The Sea Around Us", some serialised chapters in magazines and condensations in "Reader's Digest" transported the poetic rendering of science again to a wider public. But compared to the diversity of scientific inquiry about the ocean and its importance for the life-support system Earth, there is much room for engaging ocean science in the broad sense with larger and diverse publics. Developing new narratives rooted in the best available sciences is among the most promising modes of connecting different areas of scientific inquiry and non-specialists alike. We have known at least since Poincaré's famous dictum that "the facts don't speak". However, contextualised information can capture the imagination of the many and thus also reveal unexpected connections.

  6. The Insight ToolKit Image Registration Framework

    Directory of Open Access Journals (Sweden)

    Brian eAvants

    2014-04-01

    Full Text Available Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations versus translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to more easily focus on design/comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, will provide a more extensive foundation against which to evaluate new work in image registration and also enable application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling.

  7. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
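
    The parameter-variation idea can be sketched in a few lines of Python (a conceptual toy, not the toolkit's actual interface or a Geant4 model): rerun the same simulation with one physics parameter perturbed and summarize the spread of an observable.

        import random
        import statistics

        def toy_simulation(cross_section_scale, n_events=20000, seed=0):
            """Fraction of particles interacting in a slab for a scaled interaction probability."""
            rng = random.Random(seed)
            p_interact = 0.05 * cross_section_scale
            hits = sum(1 for _ in range(n_events) if rng.random() < p_interact)
            return hits / n_events

        # Vary one assumed model parameter over +/-10% and collect the observable.
        variants = [toy_simulation(scale) for scale in (0.9, 1.0, 1.1)]
        print("observable per variant:", variants)
        print("spread (stdev):", statistics.stdev(variants))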

  8. A Cas9-based toolkit to program gene expression in Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Apel, Amanda Reider; d'Espaux, Leo; Wehrs, Maren

    2017-01-01

    of these parts via a web-based tool that automates the generation of DNA fragments for integration. Our system builds upon existing gene editing methods in the thoroughness with which the parts are standardized and characterized, the types and number of parts available and the ease with which our methodology...... can be used to perform genetic edits in yeast. We demonstrated the applicability of this toolkit by optimizing the expression of a challenging but industrially important enzyme, taxadiene synthase (TXS). This approach enabled us to diagnose an issue with TXS solubility, the resolution of which yielded...

  9. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  10. Sierra Toolkit Manual Version 4.48.

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Toolkit Team

    2018-03-01

    This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.

  11. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be either unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  12. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an ethnographic easy-to-use toolkit, Contextual design, by a computer firm in the initial stages of the development of a health care system....

  13. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-08-14

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  14. Toolkit for local decision makers aims to strengthen environmental sustainability

    CSIR Research Space (South Africa)

    Murambadoro, M

    2011-11-01

    Full Text Available Members of the South African Risk and Vulnerability Atlas were involved in a meeting aimed at the development of a toolkit towards improved integration of climate change into local government's integrated development planning (IDP) process....

  15. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing

    DEFF Research Database (Denmark)

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai

    2018-01-01

    engineering approaches for boosting known and discovering novel natural products. To facilitate genome editing for actinomycetes, we developed a CRISPR-Cas9 toolkit with high efficiency for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification......, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random-size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector...... construction process. The application of this toolkit was successfully demonstrated by perturbation of the genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes.

  16. Improving safety on rural local and tribal roads safety toolkit.

    Science.gov (United States)

    2014-08-01

    Rural roadway safety is an important issue for communities throughout the country and presents a challenge for state, local, and Tribal agencies. The Improving Safety on Rural Local and Tribal Roads Safety Toolkit was created to help rural local ...

  17. A User Interface Toolkit for a Small Screen Device.

    OpenAIRE

    UOTILA, ALEKSI

    2000-01-01

    The appearance of different kinds of networked mobile devices and network appliances creates special requirements for user interfaces that are not met by existing widget based user interface creation toolkits. This thesis studies the problem domain of user interface creation toolkits for portable network connected devices. The portable nature of these devices places great restrictions on the user interface capabilities. One main characteristic of the devices is that they have small screens co...

  18. WING/WORLD: An Open Experimental Toolkit for the Design and Deployment of IEEE 802.11-Based Wireless Mesh Networks Testbeds

    Directory of Open Access Journals (Sweden)

    Daniele Miorandi

    2010-01-01

    Full Text Available Wireless Mesh Networks represent an interesting instance of light-infrastructure wireless networks. Due to their flexibility and resiliency to network failures, wireless mesh networks are particularly suitable for incremental and rapid deployments of wireless access networks in both metropolitan and rural areas. This paper illustrates the design and development of an open toolkit aimed at supporting the design of different solutions for wireless mesh networking by enabling real evaluation, validation, and demonstration. The resulting testbed is based on off-the-shelf hardware components and open-source software and is focused on IEEE 802.11 commodity devices. The software toolkit is based on an “open” philosophy and aims at providing the scientific community with a tool for effective and reproducible performance analysis of WMNs. The paper describes the architecture of the toolkit, and its core functionalities, as well as its potential evolutions.

  19. Management of cancer pain: 1. Wider implications of orthodox analgesics

    Directory of Open Access Journals (Sweden)

    Lee SK

    2014-01-01

    significantly affect the cancer process itself. More futuristically, several ion channels are being targeted with novel analgesics, but many of these are also involved in primary and/or secondary tumorigenesis. Further studies are needed to elucidate possible cellular and molecular effects of orthodox analgesics and their possible long-term impact, both positive and negative, and thus enable the best possible clinical gain for cancer patients. Keywords: NSAIDs, cannabinoids, opioids, GABA-ergic drugs, GABA mimetics, ion channels

  20. VaST: A variability search toolkit

    Science.gov (United States)

    Sokolovsky, K. V.; Lebedev, A. A.

    2018-01-01

    Variability Search Toolkit (VaST) is a software package designed to find variable objects in a series of sky images. It can be run from a script or interactively using its graphical interface. VaST relies on source list matching as opposed to image subtraction. SExtractor is used to generate source lists and perform aperture or PSF-fitting photometry (with PSFEx). Variability indices that characterize scatter and smoothness of a lightcurve are computed for all objects. Candidate variables are identified as objects having high variability index values compared to other objects of similar brightness. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and handle complex image distortions. The software has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5 m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates. About 1800 variable stars have been discovered with VaST. It is used as a transient detection engine in the New Milky Way (NMW) nova patrol. The code is written in C and can be easily compiled on the majority of UNIX-like systems. VaST is free software available at http://scan.sai.msu.ru/vast/.
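
    A minimal numpy sketch of the underlying selection principle (generic, not VaST code): compute a scatter-based variability index for every lightcurve and flag objects whose scatter is large compared with the bulk of objects; a real tool compares against objects of similar brightness rather than a single global baseline.

        import numpy as np

        def flag_variables(lightcurves, n_sigma=3.0):
            """lightcurves: dict of name -> array of magnitudes measured on many images."""
            stds = np.array([np.std(m) for m in lightcurves.values()])
            baseline, scatter = np.median(stds), np.std(stds)
            return [name for name, s in zip(lightcurves, stds)
                    if s > baseline + n_sigma * scatter]

        rng = np.random.default_rng(1)
        data = {f"star{i}": 15.0 + rng.normal(0, 0.02, 50) for i in range(20)}
        data["star_var"] = 15.0 + 0.3 * np.sin(np.linspace(0, 6, 50)) + rng.normal(0, 0.02, 50)
        print(flag_variables(data))   # expected to include 'star_var'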

  1. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real-time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces and induced superconducting currents, as well as a plot of the magnetic field lines, are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry, are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  2. Security Assessment Simulation Toolkit (SAST) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, Office Of Naval Research (ONR) National Center For Advanced Secure Systems Research (NCASSR) and Office Of Secretary Of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  3. BioWarehouse: a bioinformatics database warehouse toolkit

    Directory of Open Access Journals (Sweden)

    Stringer-Calvert David WJ

    2006-03-01

    Full Text Available Abstract Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion BioWarehouse embodies significant progress on the

  4. BioWarehouse: a bioinformatics database warehouse toolkit.

    Science.gov (United States)

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.
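
    The warehouse idea, and the example query about enzyme activities lacking sequences, can be mimicked in miniature with Python's built-in sqlite3 module (a stand-in illustration; the table names and schema below are invented and are not the BioWarehouse schema).

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY, name TEXT);
            CREATE TABLE protein_sequence (accession TEXT PRIMARY KEY, ec_number TEXT);
        """)
        conn.executemany("INSERT INTO enzyme_activity VALUES (?, ?)",
                         [("1.1.1.1", "alcohol dehydrogenase"), ("9.9.9.9", "orphan activity")])
        conn.executemany("INSERT INTO protein_sequence VALUES (?, ?)",
                         [("P00330", "1.1.1.1")])

        # Which enzyme activities have no sequence anywhere in the warehouse?
        rows = conn.execute("""
            SELECT a.ec_number, a.name
            FROM enzyme_activity a
            LEFT JOIN protein_sequence s ON s.ec_number = a.ec_number
            WHERE s.accession IS NULL
        """).fetchall()
        print(rows)   # [('9.9.9.9', 'orphan activity')]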

  5. A qualitative study of clinic and community member perspectives on intervention toolkits: "Unless the toolkit is used it won't help solve the problem".

    Science.gov (United States)

    Davis, Melinda M; Howk, Sonya; Spurlock, Margaret; McGinnis, Paul B; Cohen, Deborah J; Fagnan, Lyle J

    2017-07-18

    Intervention toolkits are common products of grant-funded research in public health and primary care settings. Toolkits are designed to address the knowledge translation gap by speeding implementation and dissemination of research into practice. However, few studies describe characteristics of effective intervention toolkits and their implementation. Therefore, we conducted this study to explore what clinic and community-based users want in intervention toolkits and to identify the factors that support application in practice. In this qualitative descriptive study, we conducted focus groups and interviews with a purposive sample of community health coalition members, public health experts, and primary care professionals between November 2010 and January 2012. The transdisciplinary research team used thematic analysis to identify themes and a cross-case comparative analysis to explore variation by participant role and toolkit experience. Ninety-six participants representing primary care (n = 54, 56%) and community settings (n = 42, 44%) participated in 18 sessions (13 focus groups, five key informant interviews). Participants ranged from those naïve through expert in toolkit development; many reported limited application of toolkits in actual practice. Participants wanted toolkits targeted at the right audience and demonstrated to be effective. Well-organized toolkits, often with a quick-start guide, with tools that were easy to tailor and apply, were desired. Irrespective of perceived quality, participants experienced with practice change emphasized that leadership, staff buy-in, and facilitative support were essential for intervention toolkits to be translated into changes in clinic or public health practice. Given the emphasis on toolkits in supporting implementation and dissemination of research and clinical guidelines, studies are warranted to determine when and how toolkits are used. Funders, policy makers, researchers, and leaders in primary care and

  6. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    Science.gov (United States)

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  7. An open source toolkit for medical imaging de-identification

    International Nuclear Information System (INIS)

    Rodriguez Gonzalez, David; Carpenter, Trevor; Wardlaw, Joanna; Hemert, Jano I. van

    2010-01-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users. (orig.)
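
    A minimal sketch of header de-identification, assuming the third-party pydicom library (this is not the toolkit described above, and the list of attributes to blank is an illustrative subset only):

        from pydicom.dataset import Dataset

        IDENTIFYING_KEYWORDS = ["PatientName", "PatientID", "PatientBirthDate",
                                "InstitutionName", "ReferringPhysicianName"]

        def deidentify(ds, replacement="ANONYMIZED"):
            """Blank out common identifying attributes and drop private tags."""
            for keyword in IDENTIFYING_KEYWORDS:
                if hasattr(ds, keyword):
                    setattr(ds, keyword, replacement)
            ds.remove_private_tags()
            return ds

        # Build a small header in memory instead of reading a DICOM file from disk.
        ds = Dataset()
        ds.PatientName = "Doe^Jane"
        ds.PatientID = "12345"
        ds.Modality = "MR"           # non-identifying attribute, left untouched
        deidentify(ds)
        print(ds.PatientName, ds.PatientID, ds.Modality)   # ANONYMIZED ANONYMIZED MR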

  8. The self-describing data sets file protocol and Toolkit

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.

    1995-01-01

    The Self-Describing Data Sets (SDDS) file protocol continues to be used extensively in commissioning the Advanced Photon Source (APS) accelerator complex. SDDS protocol has proved useful primarily due to the existence of the SDDS Toolkit, a growing set of about 60 generic commandline programs that read and/or write SDDS files. The SDDS Toolkit is also used extensively for simulation postprocessing, giving physicists a single environment for experiment and simulation. With the Toolkit, new SDDS data is displayed and subjected to complex processing without developing new programs. Data from EPICS, lab instruments, simulation, and other sources are easily integrated. Because the SDDS tools are commandline-based, data processing scripts are readily written using the user's preferred shell language. Since users work within a UNIX shell rather than an application-specific shell or GUI, they may add SDDS-compliant programs and scripts to their personal toolkits without restriction or complication. The SDDS Toolkit has been run under UNIX on SUN OS4, HP-UX, and LINUX. Application of SDDS to accelerator operation is being pursued using Tcl/Tk to provide a GUI

  9. A Generalized Software Toolkit for Portable GPU-Enabled Chemistry Acceleration in CFD Applications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Current combustor design simulations aimed at reducing greenhouse gas emissions and improving fuel-lean combustion have entailed using large amounts of dedicated CPU...

  10. Dosimetry applications in GATE Monte Carlo toolkit.

    Science.gov (United States)

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment.

  11. Validation of Power Output for the WIND Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    King, J.; Clifton, A.; Hodge, B. M.

    2014-09-01

    Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.

  12. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  13. The ECVET toolkit customization for the nuclear energy sector

    International Nuclear Information System (INIS)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von

    2015-01-01

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), developed in the last six years (2009-2014) a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. In order to observe the road map for the ECVET implementation, the toolkit customization for nuclear energy sector is required. This article describes the outcomes of the toolkit customization, based on ECVET approach, for nuclear qualifications design. The process of the toolkit customization took into account the fact that nuclear qualifications are mostly of higher levels (five and above) of the European Qualifications Framework.

  14. The ECVET toolkit customization for the nuclear energy sector

    Energy Technology Data Exchange (ETDEWEB)

    Ceclan, Mihail; Ramos, Cesar Chenel; Estorff, Ulrike von [European Commission, Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport

    2015-04-15

    As part of its support to the introduction of ECVET in the nuclear energy sector, the Institute for Energy and Transport (IET) of the Joint Research Centre (JRC), European Commission (EC), through the ECVET Team of the European Human Resources Observatory for the Nuclear energy sector (EHRO-N), has over the past six years (2009-2014) developed a sectorial approach and a road map for ECVET implementation in the nuclear energy sector. Following this road map requires customizing the ECVET toolkit for the nuclear energy sector. This article describes the outcomes of that customization, based on the ECVET approach, for the design of nuclear qualifications. The customization process took into account the fact that nuclear qualifications sit mostly at the higher levels (five and above) of the European Qualifications Framework.

  15. Outage Risk Assessment and Management (ORAM) thermal-hydraulics toolkit

    International Nuclear Information System (INIS)

    Denny, V.E.; Wassel, A.T.; Issacci, F.; Pal Kalra, S.

    2004-01-01

    A PC-based thermal-hydraulic toolkit has been developed to support outage optimization, management, and risk assessment. This mechanistic toolkit incorporates simple models of key thermal-hydraulic processes that occur during an outage, such as recovery from or mitigation of outage upsets, including heat-up of water pools following loss of shutdown cooling, inadvertent drain-down of the RCS, boil-off of coolant inventory, heat-up of the uncovered core, and reflux cooling. This paper lists the key toolkit elements, briefly describes their technical basis, and presents illustrative results for RCS transient behavior during reflux cooling, peak clad temperatures for an uncovered core, and RCS response to loss of shutdown cooling. (author)
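
    At their simplest, the pool heat-up and boil-off models in such a toolkit reduce to an energy balance between decay heat and the sensible and latent heat of the coolant inventory. The sketch below is a generic back-of-the-envelope illustration of that balance in Python; it is not the ORAM code, and every number in it is hypothetical.

```python
# Generic energy-balance sketch for pool heat-up and boil-off after loss
# of shutdown cooling. NOT the ORAM toolkit; all inputs are hypothetical.
CP_WATER = 4186.0   # J/(kg*K), specific heat of liquid water
H_VAP = 2.257e6     # J/kg, latent heat of vaporization near 1 atm

def hours_to_saturation(mass_kg, t_initial_c, decay_heat_w, t_sat_c=100.0):
    """Hours to heat the pool from its initial temperature to saturation."""
    return mass_kg * CP_WATER * (t_sat_c - t_initial_c) / decay_heat_w / 3600.0

def boiloff_rate_kg_per_hr(decay_heat_w):
    """Steady boil-off rate once the pool sits at saturation."""
    return decay_heat_w / H_VAP * 3600.0

# Example: 300 t of water at 40 C with 2 MW of decay heat.
print(hours_to_saturation(3.0e5, 40.0, 2.0e6))   # roughly 10 hours
print(boiloff_rate_kg_per_hr(2.0e6))             # roughly 3200 kg/h
```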

  16. Development of a Multimedia Toolkit for Engineering Graphics Education

    Directory of Open Access Journals (Sweden)

    Moudar Zgoul

    2009-09-01

    Full Text Available This paper focuses upon the development of a multimedia toolkit to support the teaching of an Engineering Graphics course. The project used different elements for the toolkit: animations, videos and presentations, which were then integrated into a dedicated internet website. The purpose of using these elements is to assist students in building and practicing the needed engineering skills at their own pace as part of an e-Learning solution. Furthermore, this kit allows students to repeat and view the processes and techniques of graphical construction and visualization as much as needed, allowing them to follow and practice on their own.

  17. User's manual for the two-dimensional transputer graphics toolkit

    Science.gov (United States)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer-based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for simulation visualization. It supports multiple windows, double-buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  18. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    Full Text Available of such a toolkit facilitates and increases productivity during subsequent tool development: “develop once and use many times”. The concept of an extendible toolkit lends itself naturally to the open-source philosophy, where the toolkit user-base develops...

  19. phylo-node: A molecular phylogenetic toolkit using Node.js.

    Science.gov (United States)

    O'Halloran, Damien M

    2017-01-01

    Node.js is an open-source and cross-platform environment that provides a JavaScript codebase for back-end server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses, but no comparable toolkits have been available for conducting comprehensive molecular phylogenetic analysis with Node.js. To address this problem, I developed phylo-node, a stable and scalable toolkit built with Node.js that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options that provides tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and also provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, and a customized piping module to support the production of diverse pipelines. phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node.

  20. J-TEXT-EPICS: An EPICS toolkit attempted to improve productivity

    International Nuclear Information System (INIS)

    Zheng, Wei; Zhang, Ming; Zhang, Jing; Zhuang, Ge

    2013-01-01

    Highlights: • Tokamak control applications can be developed in a very short period with J-TEXT-EPICS. • J-TEXT-EPICS enables users to build control applications with device-oriented functions. • J-TEXT-EPICS is fully compatible with the EPICS Channel Access protocol. • J-TEXT-EPICS can be easily extended by plug-ins and drivers. -- Abstract: The Joint Texas Experimental Tokamak (J-TEXT) team has developed a new software toolkit for building Experimental Physics and Industrial Control System (EPICS) control applications called J-TEXT-EPICS. It aims to improve the development efficiency of control applications. With device-oriented features, it can be used to set or obtain the configuration or status of a device as well as invoke methods on a device. With its modularized design, its functions can be easily extended. J-TEXT-EPICS is completely compatible with the original EPICS Channel Access protocol and can be integrated into existing EPICS control systems smoothly. It is fully implemented in C#, so it benefits from the abundant resources of the .NET Framework. The J-TEXT control system is built with this toolkit. This paper presents the design and implementation of J-TEXT-EPICS as well as its application in the J-TEXT control system.
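
    J-TEXT-EPICS itself is a C# library, but the Channel Access operations it wraps are the same gets, puts, and monitors exposed by any CA client. Purely to make the protocol concrete, here is a minimal sketch using the separate Python pyepics package; it assumes a running IOC, and the PV names are hypothetical, not taken from the J-TEXT control system.

```python
# Minimal Channel Access sketch with the pyepics package (not J-TEXT-EPICS,
# which is a C# toolkit). Assumes an IOC is serving these hypothetical PVs.
from epics import PV

setpoint = PV("JTEXT:PF1:CURRENT_SP")   # hypothetical process variable names
readback = PV("JTEXT:PF1:CURRENT_RB")

setpoint.put(1500.0, wait=True)          # write a setpoint over Channel Access
print("readback:", readback.get())       # read the device status back

# Monitor updates; device-oriented toolkits such as J-TEXT-EPICS bundle many
# such PVs behind one device object so users call methods instead of raw puts.
readback.add_callback(lambda pvname=None, value=None, **kw:
                      print(pvname, "->", value))
```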

  1. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g., fueling fuel cell vehicles) or stationary applications (e.g., back-up power) is a relatively new area for QRA compared with traditional industrial production and use, and as a result few QRA tools contain models that have been developed and validated for small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
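
    As a purely illustrative sketch of what integrating probabilistic and deterministic components can look like in the simplest case, the snippet below multiplies scenario frequencies from a probabilistic model by consequence estimates from a deterministic model to rank scenarios. It is not the toolkit described in the paper, and all values are hypothetical.

```python
# Minimal QRA-style ranking: probabilistic frequencies combined with
# deterministic consequence estimates. Hypothetical values throughout.
scenarios = [
    # (name, leak frequency per year, ignition probability, consequence score)
    ("small leak, jet fire",    1e-3, 0.05, 10.0),
    ("medium leak, flash fire", 2e-4, 0.10, 50.0),
    ("large leak, explosion",   1e-5, 0.30, 500.0),
]

risk = [(name, freq * p_ign * cons) for name, freq, p_ign, cons in scenarios]
for name, value in sorted(risk, key=lambda item: -item[1]):
    print(f"{name:25s} expected consequence per year = {value:.2e}")
```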

  2. The Early Astronomy Toolkit was Universal

    Science.gov (United States)

    Schaefer, Bradley E.

    2018-01-01

    From historical, anthropological, and archaeological records, we can reconstruct the general properties of the earliest astronomy for many cultures worldwide, and they all share many similar characteristics. The 'Early Astronomy Toolkit' (EAT) has the Earth being flat, and the heavens as a dome overhead populated by gods/heroes that rule Nature. The skies provided omens in a wide variety of manners, with eclipses, comets, and meteors always being evil and bad. Constellations were ubiquitous pictures of gods, heroes, animals, and everyday items; all for story telling. The calendars were all luni-solar, with no year counts and months only named by seasonal cues (including solstice observations and heliacal risings) with vague intercalation. Time of day came only from the sun's altitude/azimuth, while time at night came from star risings. Graves are oriented astronomically, and each culture has deep traditions of quartering the horizon. The most complicated astronomical tools were just a few sticks and stones. This is a higher level description and summary of the astronomy of all ancient cultures.This basic EAT was universal up until the Greeks, Mesopotamians, and Chinese broke out around 500 BC and afterwards. Outside the Eurasian milieu, with few exceptions (for example, planetary position measures in Mexico), this EAT represents astronomy for the rest of the world up until around 1600 AD. The EAT is present in these many cultures with virtually no variations or extensions. This universality must arise either from multiple independent inventions or by migration/diffusion. The probability of any culture independently inventing all 19 items in the EAT is low, but any such calculation has all the usual problems. Still, we realize that it is virtually impossible for many cultures to independently develop all 19 items in the EAT, so there must be a substantial fraction of migration of the early astronomical concepts. Further, the utter lack, as far as I know, of any

  3. Practical computational toolkits for dendrimers and dendrons structure design

    Science.gov (United States)

    Martinho, Nuno; Silva, Liana C.; Florindo, Helena F.; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty in using in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools rely on automated assembly of simpler dendrimers or on the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and for storing the structures in databases. The second toolkit assembles complex-topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease of use in prototyping dendrimer structures, and the second toolkit proved especially relevant for dendrimers of high complexity and size.
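
    The first toolkit described above builds dendrimer SMILES by applying SMARTS-encoded reactions to monomer SMILES with RDKit. As a rough illustration of that general idea (not the authors' code), the sketch below applies an amide-coupling reaction SMARTS to a hypothetical diacid core and diamine branching unit.

```python
# Illustrative RDKit sketch: applying a SMARTS-encoded coupling reaction to
# monomer SMILES. Not the toolkit described above; monomers are hypothetical.
from rdkit import Chem
from rdkit.Chem import AllChem

# Carboxylic acid + primary amine -> amide.
rxn = AllChem.ReactionFromSmarts(
    "[C:1](=[O:2])[OH].[NX3;H2:3]>>[C:1](=[O:2])[N:3]")

core = Chem.MolFromSmiles("OC(=O)CCC(=O)O")   # hypothetical diacid "core"
monomer = Chem.MolFromSmiles("NCCN")          # hypothetical diamine monomer

products = rxn.RunReactants((core, monomer))
product = products[0][0]
Chem.SanitizeMol(product)
print(Chem.MolToSmiles(product))              # SMILES of the first coupling
```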

  4. An Ethical Toolkit for Food Companies: Reflection on its Use

    NARCIS (Netherlands)

    Deblonde, M.K.; Graaff, R.; Brom, F.W.A.

    2007-01-01

    Nowadays many debates are going on that relate to the agricultural and food sector. It looks as if present technological and organizational developments within the agricultural and food sector are badly geared to societal needs and expectations. In this article we briefly present a toolkit for moral

  5. Evaluating Teaching Development Activities in Higher Education: A Toolkit

    Science.gov (United States)

    Kneale, Pauline; Winter, Jennie; Turner, Rebecca; Spowart, Lucy; Hughes, Jane; McKenna, Colleen; Muneer, Reema

    2016-01-01

    This toolkit is developed as a resource for providers of teaching-related continuing professional development (CPD) in higher education (HE). It focuses on capturing the longer-term value and impact of CPD for teachers and learners, and moving away from immediate satisfaction measures. It is informed by the literature on evaluating higher…

  6. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control...

  7. Report of the Los Alamos accelerator automation application toolkit workshop

    International Nuclear Information System (INIS)

    Clout, P.; Daneels, A.

    1990-01-01

    A 5 day workshop was held in November 1988 at Los Alamos National Laboratory to address the viability of providing a toolkit optimized for building accelerator control systems. The workshop arose from work started independently at Los Alamos and CERN. This paper presents the discussion and the results of the meeting. (orig.)

  8. Designing a Portable and Low Cost Home Energy Management Toolkit

    NARCIS (Netherlands)

    Keyson, D.V.; Al Mahmud, A.; De Hoogh, M.; Luxen, R.

    2013-01-01

    In this paper we describe the design of a home energy and comfort management system. The system has three components such as a smart plug with a wireless module, a residential gateway and a mobile app. The combined system is called a home energy management and comfort toolkit. The design is inspired

  9. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    ... relevant to reducing air pollution from oil and natural gas production and processing. The Department of... environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to.... technologies. The Toolkit will support the President's National Export Initiative by fostering export...

  10. Toolkit for healthcare facility design evaluation - some case studies

    CSIR Research Space (South Africa)

    De Jager, Peta

    2007-10-01

    Full Text Available themes in approach. Further study is indicated, but preliminary research shows that, whilst these toolkits can be applied to the South African context, there are compelling reasons for them to be adapted. This paper briefly outlines these three case...

  11. Toolkit for healthcare facility design evaluation - some case studies.

    CSIR Research Space (South Africa)

    De Jager, Peta

    2007-10-01

    Full Text Available themes in approach. Further study is indicated, but preliminary research shows that, whilst these toolkits can be applied to the South African context, there are compelling reasons for them to be adapted. This paper briefly outlines these three case...

  12. Measuring acceptance of an assistive social robot: a suggested toolkit

    NARCIS (Netherlands)

    Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B.

    2009-01-01

    The human robot interaction community is multidisciplinary by nature and has members from social science to engineering backgrounds. In this paper we aim to provide human robot developers with a straightforward toolkit to evaluate users' acceptance of assistive social robots they are designing or

  13. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, Roelf J.

    1997-01-01

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  14. The infancy of particle accelerators life and work of Rolf Widerøe

    CERN Document Server

    1994-01-01

    The following autobiographical account of Rolf Wideröe's life and work is based on manuscripts and letters written by himself, most of them especially for this report. Data from audio and video recordings with his illustrations and from my notes taken during a series of meetings between the two of us were also included. Rolf Wideröe gave me access to many of his publications and to other documents from which I have extracted further information. I have compiled, edited and, where necessary, put the texts in chronological order. These were then corrected and supplemented by Rolf Wideröe during the course of several readings. The English translation was also checked by Wideröe and we were able to add some improvements and corrections. This account therefore stands as an authorised biography and is written in the first person. Mrs. Wideröe's accurate memory was of great assistance. The emphasis has been on Rolf Wideröe's life story and the first developments which led to modern particle accelerators. Techni...

  15. Commercial Building Energy Saver: An energy retrofit analysis toolkit

    International Nuclear Information System (INIS)

    Hong, Tianzhen; Piette, Mary Ann; Chen, Yixing; Lee, Sang Hoon; Taylor-Lange, Sarah C.; Zhang, Rongpeng; Sun, Kaiyu; Price, Phillip

    2015-01-01

    Highlights: • Commercial Building Energy Saver is a powerful toolkit for energy retrofit analysis. • CBES provides benchmarking, load shape analysis, and model-based retrofit assessment. • CBES covers 7 building types, 6 vintages, 16 climates, and 100 energy measures. • CBES includes a web app, API, and a database of energy efficiency performance. • CBES API can be extended and integrated with third party energy software tools. - Abstract: Small commercial buildings in the United States consume 47% of the total primary energy of the buildings sector. Retrofitting small and medium commercial buildings poses a huge challenge for owners because they usually lack the expertise and resources to identify and evaluate cost-effective energy retrofit strategies. This paper presents the Commercial Building Energy Saver (CBES), an energy retrofit analysis toolkit, which calculates the energy use of a building, identifies and evaluates retrofit measures in terms of energy savings, energy cost savings and payback. The CBES Toolkit includes a web app (APP) for end users and the CBES Application Programming Interface (API) for integrating CBES with other energy software tools. The toolkit provides a rich set of features including: (1) Energy Benchmarking providing an Energy Star score, (2) Load Shape Analysis to identify potential building operation improvements, (3) Preliminary Retrofit Analysis which uses a custom developed pre-simulated database and, (4) Detailed Retrofit Analysis which utilizes real-time EnergyPlus simulations. CBES includes 100 configurable energy conservation measures (ECMs) that encompass IAQ, technical performance and cost data, for assessing 7 different prototype buildings in 16 climate zones in California and 6 vintages. A case study of a small office building demonstrates the use of the toolkit for retrofit analysis. The development of CBES provides a new contribution to the field by providing a straightforward and uncomplicated decision

  16. ISRNA: an integrative online toolkit for short reads from high-throughput sequencing data.

    Science.gov (United States)

    Luo, Guan-Zheng; Yang, Wei; Ma, Ying-Ke; Wang, Xiu-Jie

    2014-02-01

    Integrative Short Reads NAvigator (ISRNA) is an online toolkit for analyzing high-throughput small RNA sequencing data. Besides the high-speed genome mapping function, ISRNA provides statistics for genomic location, length distribution and nucleotide composition bias analysis of sequence reads. Number of reads mapped to known microRNAs and other classes of short non-coding RNAs, coverage of short reads on genes, expression abundance of sequence reads as well as some other analysis functions are also supported. The versatile search functions enable users to select sequence reads according to their sub-sequences, expression abundance, genomic location, relationship to genes, etc. A specialized genome browser is integrated to visualize the genomic distribution of short reads. ISRNA also supports management and comparison among multiple datasets. ISRNA is implemented in Java/C++/Perl/MySQL and can be freely accessed at http://omicslab.genetics.ac.cn/ISRNA/.

  17. IChem: A Versatile Toolkit for Detecting, Comparing, and Predicting Protein-Ligand Interactions.

    Science.gov (United States)

    Da Silva, Franck; Desaphy, Jeremy; Rognan, Didier

    2018-03-20

    Structure-based ligand design requires an exact description of the topology of molecular entities under scrutiny. IChem is a software package that reflects the many contributions of our research group in this area over the last decade. It facilitates and automates many tasks (e.g., ligand/cofactor atom typing, identification of key water molecules) usually left to the modeler's choice. It therefore permits the detection of molecular interactions between two molecules in a very precise and flexible manner. Moreover, IChem enables the conversion of intricate three-dimensional (3D) molecular objects into simple representations (fingerprints, graphs) that facilitate knowledge acquisition at very high throughput. The toolkit is an ideal companion for setting up and performing many structure-based design computations. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  18. Monitoring the grid with the Globus Toolkit MDS4

    International Nuclear Information System (INIS)

    Schopf, Jennifer M; Pearlman, Laura; Miller, Neill; Kesselman, Carl; Foster, Ian; D'Arcy, Mike; Chervenak, Ann

    2006-01-01

    The Globus Toolkit Monitoring and Discovery System (MDS4) defines and implements mechanisms for service and resource discovery and monitoring in distributed environments. MDS4 is distinguished from previous similar systems by its extensive use of interfaces and behaviors defined in the WS-Resource Framework and WS-Notification specifications, and by its deep integration into essentially every component of the Globus Toolkit. We describe the MDS4 architecture and the Web service interfaces and behaviors that allow users to discover resources and services, monitor resource and service states, receive updates on current status, and visualize monitoring results. We present two current deployments to provide insights into the functionality that can be achieved via the use of these mechanisms

  19. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    Science.gov (United States)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.

  20. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Full Text Available Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.

  1. SwingStates: adding state machines to the swing toolkit

    OpenAIRE

    Appert , Caroline; Beaudouin-Lafon , Michel

    2006-01-01

    International audience; This article describes SwingStates, a library that adds state machines to the Java Swing user interface toolkit. Unlike traditional approaches, which use callbacks or listeners to define interaction, state machines provide a powerful control structure and localize all of the interaction code in one place. SwingStates takes advantage of Java's inner classes, providing programmers with a natural syntax and making it easier to follow and debug the resulting code. SwingSta...

  2. A framework for a teaching toolkit in entrepreneurship education.

    Science.gov (United States)

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.

  3. Needs assessment: blueprint for a nurse graduate orientation employer toolkit.

    Science.gov (United States)

    Cylke, Katherine

    2012-01-01

    Southern Nevada nurse employers are resistant to hiring new graduate nurses (NGNs) because of their difficulties in making the transition into the workplace. At the same time, employers consider nurse residencies cost-prohibitive. Therefore, an alternative strategy was developed to assist employers with increasing the effectiveness of existing NGN orientation programs. A needs assessment of NGNs, employers, and nursing educators was completed, and the results were used to develop a toolkit for employers.

  4. Business plans--tips from the toolkit 6.

    Science.gov (United States)

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  5. XPIWIT--an XML pipeline wrapper for the Insight Toolkit.

    Science.gov (United States)

    Bartschat, Andreas; Hübner, Eduard; Reischl, Markus; Mikut, Ralf; Stegmaier, Johannes

    2016-01-15

    The Insight Toolkit offers plenty of features for multidimensional image analysis. Current implementations, however, often suffer either from a lack of flexibility due to hard-coded C++ pipelines for a certain task or from slow execution times, e.g. caused by inefficient implementations or multiple read/write operations for separate filter execution. We present an XML-based wrapper application for the Insight Toolkit that combines the performance of a pure C++ implementation with an easy-to-use graphical setup of dynamic image analysis pipelines. Created XML pipelines can be interpreted and executed by XPIWIT in console mode either locally or on large clusters. We successfully applied the software tool for the automated analysis of terabyte-scale, time-resolved 3D image data of zebrafish embryos. XPIWIT is implemented in C++ using the Insight Toolkit and the Qt SDK. It has been successfully compiled and tested under Windows and Unix-based systems. Software and documentation are distributed under the Apache 2.0 license and are publicly available for download at https://bitbucket.org/jstegmaier/xpiwit/downloads/. johannes.stegmaier@kit.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Numerical relativity in spherical coordinates with the Einstein Toolkit

    Science.gov (United States)

    Mewes, Vassilios; Zlochower, Yosef; Campanelli, Manuela; Ruchlin, Ian; Etienne, Zachariah B.; Baumgarte, Thomas W.

    2018-04-01

    Numerical relativity codes that do not make assumptions on spatial symmetries most commonly adopt Cartesian coordinates. While these coordinates have many attractive features, spherical coordinates are much better suited to take advantage of approximate symmetries in a number of astrophysical objects, including single stars, black holes, and accretion disks. While the appearance of coordinate singularities often spoils numerical relativity simulations in spherical coordinates, especially in the absence of any symmetry assumptions, it has recently been demonstrated that these problems can be avoided if the coordinate singularities are handled analytically. This is possible with the help of a reference-metric version of the Baumgarte-Shapiro-Shibata-Nakamura formulation together with a proper rescaling of tensorial quantities. In this paper we report on an implementation of this formalism in the Einstein Toolkit. We adapt the Einstein Toolkit infrastructure, originally designed for Cartesian coordinates, to handle spherical coordinates, by providing appropriate boundary conditions at both inner and outer boundaries. We perform numerical simulations for a disturbed Kerr black hole, extract the gravitational wave signal, and demonstrate that the noise in these signals is orders of magnitude smaller when computed on spherical grids rather than Cartesian grids. With the public release of our new Einstein Toolkit thorns, our methods for numerical relativity in spherical coordinates will become available to the entire numerical relativity community.
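
    For orientation, the reference metric in question is simply the flat metric of spherical coordinates, and the rescaling factors out the coordinate scale factors that otherwise make tensor components vanish or blow up at r = 0 and on the polar axis. The following is a schematic statement of the idea; conventions for the rescaling differ between codes, so the factors shown are indicative rather than the paper's exact definition.

```latex
% Flat reference metric of spherical coordinates and the associated scale
% factors; a schematic of the rescaling idea, not the paper's convention.
\[
d\hat{s}^{2} = dr^{2} + r^{2}\,d\theta^{2} + r^{2}\sin^{2}\theta\,d\varphi^{2},
\qquad
\hat{\gamma}_{ij} = \mathrm{diag}\!\left(1,\; r^{2},\; r^{2}\sin^{2}\theta\right).
\]
\[
s_{i} = \left(1,\; r,\; r\sin\theta\right),
\qquad
T_{ij} = s_{i}\, s_{j}\, t_{ij} \quad (\text{no summation over } i, j),
\]
% with the regular components t_{ij} evolved numerically, so the factors of
% r and sin(theta) responsible for coordinate singularities are treated
% analytically.
```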

  7. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    Science.gov (United States)

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
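
    To give a flavor of the workflow described above, a minimal run along the lines of the pypet tutorial looks roughly like the following; method and keyword names may differ slightly between pypet versions, so treat this as a sketch rather than a reference.

```python
# Minimal pypet-style run: add parameters, explore a grid, store everything
# in one HDF5 file. Sketched from the pypet tutorial; details vary by version.
from pypet import Environment, cartesian_product

def multiply(traj):
    """Toy simulation; the result is stored next to the exploring parameters."""
    traj.f_add_result('z', traj.x * traj.y, comment='product of x and y')

env = Environment(trajectory='example', filename='./hdf5/example.hdf5',
                  overwrite_file=True)
traj = env.trajectory
traj.f_add_parameter('x', 1.0, comment='first factor')
traj.f_add_parameter('y', 1.0, comment='second factor')
traj.f_explore(cartesian_product({'x': [1.0, 2.0, 3.0], 'y': [4.0, 5.0]}))

env.run(multiply)        # parameters and results end up in the same HDF5 file
env.disable_logging()
```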

  8. A Genetic Toolkit for Dissecting Dopamine Circuit Function in Drosophila

    Directory of Open Access Journals (Sweden)

    Tingting Xie

    2018-04-01

    Full Text Available Summary: The neuromodulator dopamine (DA) plays a key role in motor control, motivated behaviors, and higher-order cognitive processes. Dissecting how these DA neural networks tune the activity of local neural circuits to regulate behavior requires tools for manipulating small groups of DA neurons. To address this need, we assembled a genetic toolkit that allows for an exquisite level of control over the DA neural network in Drosophila. To further refine targeting of specific DA neurons, we also created reagents that allow for the conversion of any existing GAL4 line into Split GAL4 or GAL80 lines. We demonstrated how this toolkit can be used with recently developed computational methods to rapidly generate additional reagents for manipulating small subsets or individual DA neurons. Finally, we used the toolkit to reveal a dynamic interaction between a small subset of DA neurons and rearing conditions in a social space behavioral assay. The rapid analysis of how dopaminergic circuits regulate behavior is limited by the genetic tools available to target and manipulate small numbers of these neurons. Xie et al. present genetic tools in Drosophila that allow rational targeting of sparse dopaminergic neuronal subsets and selective knockdown of dopamine signaling. Keywords: dopamine, genetics, behavior, neural circuits, neuromodulation, Drosophila

  9. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2 km and at a 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
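
    The error metrics listed above are standard and reduce to a few lines of array arithmetic. The validation code itself is written in R; the Python sketch below only restates the definitions to make them concrete, and its percent error is computed as the bias relative to the observed mean, which may differ slightly from the report's exact convention.

```python
# Standard wind-speed validation metrics (bias, RMSE, centered RMSE, MAE,
# percent error). Illustrative only; the WIND Toolkit validation code is in R.
import numpy as np

def validation_metrics(model, obs):
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    err = model - obs
    return {
        "bias": err.mean(),
        "rmse": np.sqrt((err ** 2).mean()),
        "crmse": np.sqrt((((model - model.mean())
                           - (obs - obs.mean())) ** 2).mean()),
        "mae": np.abs(err).mean(),
        "percent_error": 100.0 * err.mean() / obs.mean(),
    }

# Toy example with hypothetical hourly wind speeds in m/s.
print(validation_metrics([7.1, 6.4, 8.0, 5.2], [6.8, 6.9, 7.5, 5.0]))
```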

  10. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Directory of Open Access Journals (Sweden)

    Robert Meyer

    2016-08-01

    Full Text Available pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  11. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  12. An interactive RADIANCE toolkit for customizable CT dose monitoring and reporting.

    Science.gov (United States)

    Cook, Tessa S; Sundaram, Anand; Boonn, William W; Kim, Woojin

    2013-08-01

    The need for tools to monitor imaging-related radiation has grown dramatically in recent years. RADIANCE, a freely available open-source dose-monitoring tool, was developed in response to the need for an informatics solution in this realm. A number of open-source as well as commercial solutions have since been developed to enable radiology practices to monitor radiation dose parameters for modalities ranging from computed tomography to radiography to fluoroscopy. However, it is not sufficient to simply collect this data; it is equally important to be able to review it in the appropriate context. Most of the currently available dose-monitoring solutions have some type of reporting capability, such as a real-time dashboard or a static report. Previous versions of RADIANCE have included a real-time dashboard with pre-set screens that plot effective dose estimates according to different criteria, as well as monthly scorecards to summarize dose estimates for individuals within a radiology practice. In this work, we present the RADIANCE toolkit, a customizable reporting solution that allows users to generate reports of interest to them, summarizing a variety of metrics that can be grouped according to useful parameters. The output of the toolkit can be used for real-time dose monitoring or scheduled reporting, such as to a quality assurance committee. Making dose parameter data more accessible and more meaningful to the user promotes dose reduction efforts such as regular protocol review and optimization, and ultimately improves patient care by decreasing unnecessary radiation exposure.

  13. The Wider Importance of Cadavers: Educational and Research Diversity from a Body Bequest Program

    Science.gov (United States)

    Cornwall, Jon; Stringer, Mark D.

    2009-01-01

    The debate surrounding the use of cadavers in teaching anatomy has focused almost exclusively on the pedagogic role of cadaver dissection in medical education. The aim of this study was to explore the wider aspects of a body bequest program for teaching and research into gross anatomy in a University setting. A retrospective audit was undertaken…

  14. Barriers and enablers in the implementation of a provider-based intervention to stimulate culturally appropriate hypertension education

    NARCIS (Netherlands)

    Beune, Erik J. A. J.; Haafkens, Joke A.; Bindels, Patrick J. E.

    2011-01-01

    Objective: To identify barriers and enablers influencing the implementation of an intervention to stimulate culturally appropriate hypertension education (CANE) among health care providers in primary care. Methods: The intervention was piloted in three Dutch health centers. It consists of a toolkit

  15. Web-based Toolkit for Dynamic Generation of Data Processors

    Science.gov (United States)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same processing activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in a next step, to select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data
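
    The mapping idea at the core of the proposed toolkit can be pictured as a small set of field-to-field rules, each with an optional transformation function, plus a row filter for the logical cases mentioned above. The sketch below is a hypothetical illustration of such a generated processor in Python; it is not the authors' system, and all field names and conversions are made up.

```python
# Hypothetical illustration of a "generated data processor": user-defined
# field mappings with transform functions and a row filter. Not the toolkit
# described above.
import csv
import io

def make_processor(mappings, row_filter=lambda row: True):
    """mappings: output_field -> (input_field, transform_function)."""
    def process(rows):
        for row in rows:
            if row_filter(row):
                yield {out: fn(row[src]) for out, (src, fn) in mappings.items()}
    return process

raw = io.StringIO("station,temp_f,flow_cfs\nA1,68.0,120\nA2,32.0,0\n")
rows = csv.DictReader(raw)

processor = make_processor(
    {"site":     ("station", str),
     "temp_c":   ("temp_f", lambda f: round((float(f) - 32) * 5 / 9, 2)),
     "flow_m3s": ("flow_cfs", lambda q: round(float(q) * 0.0283168, 3))},
    row_filter=lambda r: float(r["flow_cfs"]) > 0)    # simple filtering rule

for record in processor(rows):
    print(record)    # {'site': 'A1', 'temp_c': 20.0, 'flow_m3s': 3.398}
```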

  16. Brownfields to green fields: Realising wider benefits from practical contaminant phytomanagement strategies.

    Science.gov (United States)

    Cundy, A B; Bardos, R P; Puschenreiter, M; Mench, M; Bert, V; Friesl-Hanl, W; Müller, I; Li, X N; Weyens, N; Witters, N; Vangronsveld, J

    2016-12-15

    Gentle remediation options (GROs) are risk management strategies or technologies involving plant (phyto-), fungi (myco-), and/or bacteria-based methods that result in a net gain (or at least no gross reduction) in soil function as well as effective risk management. GRO strategies can be customised along contaminant linkages, and can generate a range of wider economic, environmental and societal benefits in contaminated land management (and in brownfields management more widely). The application of GROs as practical on-site remedial solutions is still limited however, particularly in Europe and at trace element (typically metal and metalloid) contaminated sites. This paper discusses challenges to the practical adoption of GROs in contaminated land management, and outlines the decision support tools and best practice guidance developed in the European Commission FP7-funded GREENLAND project aimed at overcoming these challenges. The GREENLAND guidance promotes a refocus from phytoremediation to wider GROs- or phyto-management based approaches which place realisation of wider benefits at the core of site design, and where gentle remediation technologies can be applied as part of integrated, mixed, site risk management solutions or as part of "holding strategies" for vacant sites. The combination of GROs with renewables, both in terms of biomass generation but also with green technologies such as wind and solar power, can provide a range of economic and other benefits and can potentially support the return of low-level contaminated sites to productive usage, while combining GROs with urban design and landscape architecture, and integrating GRO strategies with sustainable urban drainage systems and community gardens/parkland (particularly for health and leisure benefits), has large potential for triggering GRO application and in realising wider benefits in urban and suburban systems. Quantifying these wider benefits and value (above standard economic returns) will be

  17. Engineering control of bacterial cellulose production using a genetic toolkit and a new cellulose-producing strain

    Science.gov (United States)

    Florea, Michael; Hagemann, Henrik; Santosa, Gabriella; Micklem, Chris N.; Spencer-Milnes, Xenia; de Arroyo Garcia, Laura; Paschou, Despoina; Lazenbatt, Christopher; Kong, Deze; Chughtai, Haroon; Jensen, Kirsten; Freemont, Paul S.; Kitney, Richard; Reeve, Benjamin; Ellis, Tom

    2016-01-01

    Bacterial cellulose is a strong and ultrapure form of cellulose produced naturally by several species of the Acetobacteraceae. Its high strength, purity, and biocompatibility make it of great interest to materials science; however, precise control of its biosynthesis has remained a challenge for biotechnology. Here we isolate a strain of Komagataeibacter rhaeticus (K. rhaeticus iGEM) that can produce cellulose at high yields, grow in low-nitrogen conditions, and is highly resistant to toxic chemicals. We achieved external control over its bacterial cellulose production through development of a modular genetic toolkit that enables rational reprogramming of the cell. To further its use as an organism for biotechnology, we sequenced its genome and demonstrate genetic circuits that enable functionalization and patterning of heterologous gene expression within the cellulose matrix. This work lays the foundations for using genetic engineering to produce cellulose-based materials, with numerous applications in basic science, materials engineering, and biotechnology. PMID:27247386

  18. The Revolution Continues: Newly Discovered Systems Expand the CRISPR-Cas Toolkit.

    Science.gov (United States)

    Murugan, Karthik; Babu, Kesavan; Sundaresan, Ramya; Rajan, Rakhi; Sashital, Dipali G

    2017-10-05

    CRISPR-Cas systems defend prokaryotes against bacteriophages and mobile genetic elements and serve as the basis for revolutionary tools for genetic engineering. Class 2 CRISPR-Cas systems use single Cas endonucleases paired with guide RNAs to cleave complementary nucleic acid targets, enabling programmable sequence-specific targeting with minimal machinery. Recent discoveries of previously unidentified CRISPR-Cas systems have uncovered a deep reservoir of potential biotechnological tools beyond the well-characterized Type II Cas9 systems. Here we review the current mechanistic understanding of newly discovered single-protein Cas endonucleases. Comparison of these Cas effectors reveals substantial mechanistic diversity, underscoring the phylogenetic divergence of related CRISPR-Cas systems. This diversity has enabled further expansion of CRISPR-Cas biotechnological toolkits, with wide-ranging applications from genome editing to diagnostic tools based on various Cas endonuclease activities. These advances highlight the exciting prospects for future tools based on the continually expanding set of CRISPR-Cas systems. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. ImTK: an open source multi-center information management toolkit

    Science.gov (United States)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  20. The ethics of drug development and promotion: the need for a wider view.

    Science.gov (United States)

    Brody, Howard

    2012-11-01

    Ethical issues at the interface between the medical profession and the pharmaceutical industry have generally been approached from the vantage point of medical professionalism, with a focus on conflict of interest as the key ethical concern. Although conflicts of interest remain important, other ethical issues may be obscured unless a wider perspective is adopted. Besides medical professionalism, the ethics of the clinical therapeutic relationship, ethics of public health, and business ethics all provide additional insights.

  1. Earlinet database: new design and new products for a wider use of aerosol lidar data

    Science.gov (United States)

    Mona, Lucia; D'Amico, Giuseppe; Amato, Francesco; Linné, Holger; Baars, Holger; Wandinger, Ulla; Pappalardo, Gelsomina

    2018-04-01

    The EARLINET database is undergoing a complete reshaping to meet the wide demand for more intuitive products and to address the even wider demand arising from new initiatives such as Copernicus, the European Earth observation programme. The new design has been carried out in continuity with the past, to take advantage of the long-term database. In particular, the new structure will provide information suitable for synergy with other instruments, near-real-time (NRT) applications, validation and process studies, and climate applications.

  2. A GIS Software Toolkit for Monitoring Areal Snow Cover and Producing Daily Hydrologic Forecasts using NASA Satellite Imagery, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts. This toolkit will be...

  3. Developing a proxy version of the Adult social care outcome toolkit (ASCOT).

    Science.gov (United States)

    Rand, Stacey; Caiels, James; Collins, Grace; Forder, Julien

    2017-05-19

    Social care-related quality of life is a key outcome indicator used in the evaluation of social care interventions and policy. It is not, however, always possible to collect quality of life data by self-report even with adaptations for people with cognitive or communication impairments. A new proxy-report version of the Adult Social Care Outcomes Toolkit (ASCOT) measure of social care-related quality of life was developed to address the issues of wider inclusion of people with cognitive or communication difficulties who may otherwise be systematically excluded. The development of the proxy-report ASCOT questionnaire was informed by literature review and earlier work that identified the key issues and challenges associated with proxy-reported outcomes. To evaluate the acceptability and content validity of the ASCOT-Proxy, qualitative cognitive interviews were conducted with unpaid carers or care workers of people with cognitive or communication impairments. The proxy respondents were invited to 'think aloud' while completing the questionnaire. Follow-up probes were asked to elicit further detail of the respondent's comprehension of the format, layout and content of each item and also how they weighed up the options to formulate a response. A total of 25 unpaid carers and care workers participated in three iterative rounds of cognitive interviews. The findings indicate that the items were well-understood and the concepts were consistent with the item definitions for the standard self-completion version of ASCOT with minor modifications to the draft ASCOT-Proxy. The ASCOT-Proxy allows respondents to rate the proxy-proxy and proxy-patient perspectives, which improved the acceptability of proxy report. A new proxy-report version of ASCOT was developed with evidence of its qualitative content validity and acceptability. The ASCOT-Proxy is ready for empirical testing of its suitability for data collection as a self-completion and/or interview questionnaire, and also

  4. Pydpiper: A Flexible Toolkit for Constructing Novel Registration Pipelines

    Directory of Open Access Journals (Sweden)

    Miriam eFriedel

    2014-07-01

    Full Text Available Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available pipeline framework that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  5. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    Science.gov (United States)

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
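
    One of the innovations highlighted in both records above, elimination of duplicate stages, can be understood as keying each stage on its full command specification and scheduling repeats only once. The sketch below illustrates that idea generically; it is not Pydpiper's actual implementation, and the stage commands are hypothetical.

```python
# Generic illustration of duplicate-stage elimination in a pipeline
# framework: stages with identical command specifications are scheduled once.
# Not Pydpiper's actual implementation.
class Pipeline:
    def __init__(self):
        self._seen = {}      # command tuple -> stage id
        self.stages = []     # unique stages, in insertion order

    def add_stage(self, *command):
        key = tuple(command)
        if key in self._seen:            # duplicate: reuse the existing stage
            return self._seen[key]
        stage_id = len(self.stages)
        self._seen[key] = stage_id
        self.stages.append(key)
        return stage_id

p = Pipeline()
a = p.add_stage("blur", "img1.mnc", "--fwhm", "2")
b = p.add_stage("blur", "img1.mnc", "--fwhm", "2")   # identical specification
print(a == b, len(p.stages))                         # True 1
```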

  6. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    Science.gov (United States)

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.
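
    The sketch below illustrates the general positive-unlabeled (PU) learning idea behind transcriptome-scale CMR prediction: train on known positives plus provisional negatives, then retrain using only the most reliable negatives. It is a generic two-step PU scheme on synthetic data, written in Python with scikit-learn, and is not the PSOL algorithm implemented in PEA.

    ```python
    # Minimal positive-unlabeled (PU) learning sketch on synthetic data;
    # illustrative only, not PEA's Positive Samples Only Learning algorithm.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 10))
    y_true = (X[:, 0] + X[:, 1] > 0).astype(int)          # hidden ground truth

    # Only a subset of true positives is labelled; everything else is unlabelled.
    labeled_pos = np.where((y_true == 1) & (rng.random(2000) < 0.3))[0]
    unlabeled = np.setdiff1d(np.arange(2000), labeled_pos)

    # Step 1: treat unlabelled samples as provisional negatives.
    X_train = np.vstack([X[labeled_pos], X[unlabeled]])
    y_train = np.r_[np.ones(len(labeled_pos)), np.zeros(len(unlabeled))]
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

    # Step 2: keep only the most confidently negative unlabelled samples as
    # "reliable negatives" and retrain, a common PU refinement step.
    scores = clf.predict_proba(X[unlabeled])[:, 1]
    reliable_neg = unlabeled[scores < np.quantile(scores, 0.3)]
    X2 = np.vstack([X[labeled_pos], X[reliable_neg]])
    y2 = np.r_[np.ones(len(labeled_pos)), np.zeros(len(reliable_neg))]
    clf2 = RandomForestClassifier(n_estimators=200, random_state=0).fit(X2, y2)
    print("accuracy vs hidden labels:", (clf2.predict(X) == y_true).mean())
    ```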

  7. Developing Climate Resilience Toolkit Decision Support Training Section

    Science.gov (United States)

    Livezey, M. M.; Herring, D.; Keck, J.; Meyers, J. C.

    2014-12-01

    The Climate Resilience Toolkit (CRT) is a Federal government effort to address the U.S. President's Climate Action Plan and Executive Order for Climate Preparedness. The toolkit will provide access to tools and products useful for climate-sensitive decision making. To optimize the user experience, the toolkit will also provide access to training materials. The National Oceanic and Atmospheric Administration (NOAA) has been building a climate training capability for 15 years. The target audience for the training has historically been mainly NOAA staff with some modified training programs for external users and stakeholders. NOAA is now using this climate training capacity for the CRT. To organize the CRT training section, we collaborated with the Association of Climate Change Officers to determine the best strategy and identified four additional complementary skills needed for successful decision making: climate literacy, environmental literacy, risk assessment and management, and strategic execution and monitoring. Developing the climate literacy skills requires knowledge of climate variability and change, as well as an introduction to the suite of available products and services. For the development of an environmental literacy category, specific topics needed include knowledge of climate impacts on specific environmental systems. Climate risk assessment and management introduces a process for decision making and provides knowledge on communication of climate information and integration of climate information in planning processes. The strategic execution and monitoring category provides information on use of NOAA climate products, services, and partnership opportunities for decision making. In order to use the existing training modules, it was necessary to assess their level of complexity, catalog them, and develop guidance for users on a curriculum to take advantage of the training resources to enhance their learning experience. With the development of this CRT

  8. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services.

  9. RAVE-a Detector-independent vertex reconstruction toolkit

    International Nuclear Information System (INIS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-01-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available
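
    The toy example below contrasts the two estimation styles mentioned above on a simplified one-dimensional vertex problem: a linear (weighted least-squares) mean versus an iteratively re-weighted robust mean that down-weights an outlier track. It is a conceptual Python sketch, not the RAVE interface, and the track values are invented.

    ```python
    # Illustrative only: estimating a common vertex z-position from track
    # intercepts, first with a plain weighted mean (linear least squares),
    # then with an iteratively re-weighted robust (Tukey) mean.
    import numpy as np

    def weighted_mean(z, sigma):
        w = 1.0 / sigma**2
        return np.sum(w * z) / np.sum(w)

    def robust_mean(z, sigma, c=3.0, iters=10):
        est = weighted_mean(z, sigma)
        for _ in range(iters):
            r = (z - est) / sigma                                      # normalized residuals
            w = np.where(np.abs(r) < c, (1 - (r / c) ** 2) ** 2, 0.0)  # Tukey weights
            w /= sigma**2
            est = np.sum(w * z) / np.sum(w)
        return est

    rng = np.random.default_rng(1)
    z_true = 0.5
    z = z_true + rng.normal(0, 0.01, size=20)     # 20 well-measured tracks
    sigma = np.full(20, 0.01)
    z[0] = 5.0                                    # one mis-associated track
    print(weighted_mean(z, sigma), robust_mean(z, sigma))
    ```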

  10. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Science.gov (United States)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE1) is being developed, along with a standalone framework (VERTIGO2) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  11. RAVE-a Detector-independent vertex reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, Wolfgang [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at; Mitaroff, Winfried; Moser, Fabian [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)

    2007-10-21

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  12. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, W; Mitaroff, W; Moser, F; Pflugfelder, B; Riedel, H V [Austrian Academy of Sciences, Institute of High Energy Physics, A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at

    2008-07-15

    A detector-independent toolkit for vertex reconstruction (RAVE{sup 1}) is being developed, along with a standalone framework (VERTIGO{sup 2}) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  13. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    Science.gov (United States)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (VV) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The VV effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  14. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Quantitative techniques have been successfully employed in verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless...... sensor networks. The analysis aims to find out the probability of a network key being compromised at a specific time point, which results in fluctuations over time for a specific key update method called Leave-based key update. For such a problem, the use of current tools is limited in many ways...

  15. A simulation toolkit for electroluminescence assessment in rare event experiments

    CERN Document Server

    Oliveira, C A B; Veenhof, R; Biagi, S; Monteiro, C M B; Santos, J M F dos; Ferreira, A L; Veloso, J F C A

    2011-01-01

    A good understanding of electroluminescence is a prerequisite when optimising double-phase noble gas detectors for Dark Matter searches and high-pressure xenon TPCs for neutrinoless double beta decay detection. A simulation toolkit for calculating the emission of light through electron impact on neon, argon, krypton and xenon has been developed using the Magboltz and Garfield programs. Calculated excitation and electroluminescence efficiencies, electroluminescence yield and associated statistical fluctuations are presented as a function of electric field. Good agreement with experiment and with Monte Carlo simulations has been obtained.

  16. The Populist Toolkit: Finnish Populism in Action 2007–2016

    OpenAIRE

    Ylä-Anttila, Tuukka

    2017-01-01

    Populism has often been understood as a description of political parties and politicians, who have been labelled either populist or not. This dissertation argues that it is more useful to conceive of populism in action: as something that is done rather than something that is. I propose that the populist toolkit is a collection of cultural practices, which politicians and citizens use to make sense of and do politics, by claiming that ‘the people’ are opposed by a corrupt elite – a powerful cl...

  17. The Multiple-Patient Simulation Toolkit: Purpose, Process, and Pilot.

    Science.gov (United States)

    Beroz, Sabrina; Sullivan, Nancy; Kramasz, Vanessa; Morgan, Patricia

    Educating nursing students to safely care for multiple patients has become an important but challenging focus for nurse educators. New graduate nurses are expected to manage care for multiple patients in a complex and multifaceted health care system. With patient safety as a priority, multiple-patient assignments are necessary in order for nursing students to learn how to effectively prioritize and delegate care. The purpose of this project was the construction of an adaptable and flexible template for the development of multiple-patient simulations. Through use, the template evolved into a toolkit with the addition of an operational guide, a sample populated template, and a bibliography.

  18. Object Toolkit Version 4.3 User’s Manual

    Science.gov (United States)

    2016-12-31

    and with Nascap-2k. See the EPIC and Nascap-2k manuals for instructions. Most of the difficulties that users have encountered with Object Toolkit are... Importing Components From a NX I-DEAS TMG ASCII VUFF File: Users of the NX I-DEAS TMG thermal analysis program can import the ASCII... The meaning of these properties is discussed in the Nascap-2k User's Manual. [Figure: Detector Properties Dialog Box]

  19. A flexible open-source toolkit for lava flow simulations

    Science.gov (United States)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve the city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large amount of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the models included in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001) which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows including a corrective factor in order for the lava to overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The
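
    As a rough sketch of the probabilistic steepest-slope idea (not the toolkit's own code), the Python snippet below walks a flow path over a DEM, choosing each step among lower neighbours with probability proportional to the height drop and using a small corrective height hc to cross minor pits; the DEM and parameter values are invented for illustration.

    ```python
    # Conceptual probabilistic steepest-slope walk over a DEM (illustrative).
    import numpy as np

    def lava_path(dem, vent, hc=0.5, max_steps=500, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        r, c = vent
        path = [(r, c)]
        for _ in range(max_steps):
            # collect the 8 neighbours and their (corrected) height drops
            nbrs, drops = [], []
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                        drop = dem[r, c] + hc - dem[rr, cc]
                        if drop > 0:
                            nbrs.append((rr, cc))
                            drops.append(drop)
            if not nbrs:                       # trapped: the flow stops
                break
            p = np.array(drops) / sum(drops)   # steeper drops are more likely
            r, c = nbrs[rng.choice(len(nbrs), p=p)]
            path.append((r, c))
        return path

    dem = np.add.outer(np.linspace(50, 0, 60), np.zeros(60))  # a simple planar slope
    print(len(lava_path(dem, vent=(0, 30))))
    ```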

  20. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline (NREL)

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as being time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.

  1. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of System Verilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  2. Provider perceptions of an integrated primary care quality improvement strategy: The PPAQ toolkit.

    Science.gov (United States)

    Beehler, Gregory P; Lilienthal, Kaitlin R

    2017-02-01

    The Primary Care Behavioral Health (PCBH) model of integrated primary care is challenging to implement with high fidelity. The Primary Care Behavioral Health Provider Adherence Questionnaire (PPAQ) was designed to assess provider adherence to essential model components and has recently been adapted into a quality improvement toolkit. The aim of this pilot project was to gather preliminary feedback on providers' perceptions of the acceptability and utility of the PPAQ toolkit for making beneficial practice changes. Twelve mental health providers working in Department of Veterans Affairs integrated primary care clinics participated in semistructured interviews to gather quantitative and qualitative data. Descriptive statistics and qualitative content analysis were used to analyze data. Providers identified several positive features of the PPAQ toolkit organization and structure that resulted in high ratings of acceptability, while also identifying several toolkit components in need of modification to improve usability. Toolkit content was considered highly representative of the PCBH model and therefore could be used as a diagnostic self-assessment of model adherence. The toolkit was considered to be high in applicability to providers regardless of their degree of prior professional preparation or current clinical setting. Additionally, providers identified several system-level contextual factors that could impact the usefulness of the toolkit. These findings suggest that frontline mental health providers working in PCBH settings may be receptive to using an adherence-focused toolkit for ongoing quality improvement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Guest editors' introduction to the 4th issue of Experimental Software and Toolkits (EST-4)

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Kienle, H.M.; Mens, K.

    2014-01-01

    Experimental software and toolkits play a crucial role in computer science. Elsevier’s Science of Computer Programming special issues on Experimental Software and Toolkits (EST) provide a means for academic tool builders to get more visibility and credit for their work, by publishing a paper along

  4. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    Science.gov (United States)

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  5. Heart Failure: Self-care to Success: Development and evaluation of a program toolkit.

    Science.gov (United States)

    Bryant, Rebecca

    2017-08-17

    The Heart Failure: Self-care to Success toolkit was developed to assist NPs in empowering patients with heart failure (HF) to improve individual self-care behaviors. This article details the evolution of this toolkit for NPs, its effectiveness with patients with HF, and recommendations for future research and dissemination strategies.

  6. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    Science.gov (United States)

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  7. A Teacher Tablet Toolkit to meet the challenges posed by 21st century rural teaching and learning environments

    Directory of Open Access Journals (Sweden)

    Adèle Botha

    2015-11-01

    Full Text Available This article draws upon the experiences gained in participating in an Information and Communication Technology for Rural Education (ICT4RED) initiative, as part of a larger Technology for Rural Education project (TECH4RED) in Cofimvaba in the Eastern Cape Province of South Africa. The aim of this paper is to describe the conceptualisation, design and application of an innovative teacher professional development course for rural teachers, enabling them to use tablets to support teaching and learning in their classrooms. The course, as outcome, is presented as a Teacher Tablet Toolkit, designed to meet the challenges inherent to the 21st century rural technology enhanced teaching and learning environment. The paper documents and motivates design decisions, derived from literature and adapted through three iterations of a Design Science Research Process, to be incorporated in the ICT4RED Teacher Professional Development Course. The resulting course aims to equip participating teachers with a toolkit consisting of technology hardware, pragmatic pedagogical and technology knowledge and skills, and practice based experience. The significance of game design elements such as simulation and fun, technology in need rather than in case, adequate scaffolding and a clear learning path with interim learning goals is noted.

  8. A universal postprocessing toolkit for accelerator simulation and data analysis

    International Nuclear Information System (INIS)

    Borland, M.

    1998-01-01

    The Self-Describing Data Sets (SDDS) toolkit comprises about 70 generally-applicable programs sharing a common data protocol. At the Advanced Photon Source (APS), SDDS performs the vast majority of operational data collection and processing, most data display functions, and many control functions. In addition, a number of accelerator simulation codes use SDDS for all post-processing and data display. This has three principal advantages: first, simulation codes need not provide customized post-processing tools, thus simplifying development and maintenance. Second, users can enhance code capabilities without changing the code itself, by adding SDDS-based pre- and post-processing. Third, multiple codes can be used together more easily, by employing SDDS for data transfer and adaptation. Given its broad applicability, the SDDS file protocol is surprisingly simple, making it quite easy for simulations to generate SDDS-compliant data. This paper discusses the philosophy behind SDDS, contrasting it with some recent trends, and outlines the capabilities of the toolkit. The paper also gives examples of using SDDS for accelerator simulation
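
    The snippet below is a toy illustration of the self-describing idea only, not the real SDDS protocol or toolkit: the file carries its own column names and types, so a generic reader needs no code-specific knowledge of the layout.

    ```python
    # Toy self-describing table: a header line declares column names and types,
    # and a generic reader reconstructs typed rows from it.
    import io

    def write_table(fh, columns, rows):
        fh.write("&columns " + " ".join(f"{n}:{t}" for n, t in columns) + "\n")
        for row in rows:
            fh.write(" ".join(str(v) for v in row) + "\n")

    def read_table(fh):
        header = fh.readline().split()[1:]            # drop the "&columns" tag
        names, types = zip(*(h.split(":") for h in header))
        cast = {"double": float, "long": int, "string": str}
        rows = [tuple(cast[t](v) for t, v in zip(types, line.split()))
                for line in fh if line.strip()]
        return list(names), rows

    buf = io.StringIO()
    write_table(buf, [("s", "double"), ("betax", "double"), ("element", "string")],
                [(0.0, 12.3, "Q1"), (1.5, 9.8, "Q2")])
    buf.seek(0)
    print(read_table(buf))
    ```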

  9. Innovations in oral health: A toolkit for interprofessional education.

    Science.gov (United States)

    Dolce, Maria C; Parker, Jessica L; Werrlein, Debra T

    2017-05-01

    The integration of oral health competencies into non-dental health professions curricula can serve as an effective driver for interprofessional education (IPE). The purpose of this report is to describe a replicable oral-health-driven IPE model and corresponding online toolkit, both of which were developed as part of the Innovations in Oral Health (IOH): Technology, Instruction, Practice, and Service programme at Bouvé College of Health Sciences, Northeastern University, USA. Tooth decay is a largely preventable disease that is connected to overall health and wellness, and it affects the majority of adults and a fifth of children in the United States. To prepare all health professionals to address this problem, the IOH model couples programming from the online resource Smiles for Life: A National Oral Health Curriculum with experiential learning opportunities designed for undergraduate and graduate students that include simulation-learning (technology), hands-on workshops and didactic sessions (instruction), and opportunities for both cooperative education (practice) and community-based learning (service). The IOH Toolkit provides the means for others to replicate portions of the IOH model or to establish a large-scale IPE initiative that will support the creation of an interprofessional workforce-one equipped with oral health competencies and ready for collaborative practice.

  10. Determination of Equine Cytochrome c Backbone Amide Hydrogen/Deuterium Exchange Rates by Mass Spectrometry Using a Wider Time Window and Isotope Envelope.

    Science.gov (United States)

    Hamuro, Yoshitomo

    2017-03-01

    A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.
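
    A much-simplified Python sketch of the underlying kinetics is given below: in the EX2 limit a single amide's uptake follows D(t) = r(1 - exp(-kt)), and the rate k and a retention factor r can be fitted from a time course. The paper's method fits all amides of a peptide jointly against full isotope envelopes; this one-exchanger fit, on synthetic data, is only meant to make the idea concrete.

    ```python
    # Fit a single amide's exchange rate from a synthetic uptake time course.
    import numpy as np
    from scipy.optimize import curve_fit

    def uptake(t, k, retention=0.9):
        # 'retention' mimics incomplete deuterium retention (back-exchange)
        return retention * (1.0 - np.exp(-k * t))

    t = np.array([10, 30, 100, 300, 1000, 3000, 10000], dtype=float)  # seconds
    rng = np.random.default_rng(2)
    d_obs = uptake(t, k=2e-3) + rng.normal(0, 0.02, t.size)

    (k_fit, ret_fit), _ = curve_fit(lambda t, k, r: r * (1 - np.exp(-k * t)),
                                    t, d_obs, p0=[1e-3, 1.0])
    print(f"fitted rate k = {k_fit:.2e} 1/s, retention = {ret_fit:.2f}")
    ```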

  11. H12: Examination of safety assessment aims, procedures and results from a wider perspective

    International Nuclear Information System (INIS)

    Neall, F.B; Smith, P.A.

    2004-04-01

    Safety assessments (SAs) are a familiar tool for the evaluation of disposal concepts for radioactive waste. There is, however, often confusion in the wider community about the aims, methods and results used in SA. This report aims to present the H12 SA in a way that makes the assessment process clearer and the implications of the results more meaningful both to workers within the SA field and to a wider technical audience. The reasonableness of the assessment results, the quality of the models and databases and redundancy within the natural and engineered barrier system have been considered. A number of recent and somewhat older SAs that address a range of different waste types, host rocks and disposal concepts have been considered, and comparisons made to H12. A further aim is to put both doses and timescales in a more meaningful context. It has been necessary to: consider ways of demonstrating the meaningfulness of calculations that give results for many thousands of years in the future; provide a framework timescale as a context for SA results over long times; demonstrate the smallness of the risk associated with the doses by comparison with other radiological and non-radiological risks. The perception of risk, which is a critical issue for public acceptance of radioactive waste disposal and must be considered when seeking to present safety assessment results 'in perspective' to a wider audience, is also discussed. It is concluded that H12 is comparable in many ways to assessments carried out internationally. Some assumptions are somewhat arbitrary reflecting the generic stage of the Japanese programme, and are likely to become better founded in future exercises. Nevertheless, H12 provides a clear and well-founded message that it is feasible to site and construct a safe repository for HLW in Japan. (author)

  12. Advanced processing and simulation of MRS data using the FID appliance (FID-A)-An open source, MATLAB-based toolkit.

    Science.gov (United States)

    Simpson, Robin; Devenyi, Gabriel A; Jezzard, Peter; Hennessy, T Jay; Near, Jamie

    2017-01-01

    To introduce a new toolkit for simulation and processing of magnetic resonance spectroscopy (MRS) data, and to demonstrate some of its novel features. The FID appliance (FID-A) is an open-source, MATLAB-based software toolkit for simulation and processing of MRS data. The software is designed specifically for processing data with multiple dimensions (eg, multiple radiofrequency channels, averages, spectral editing dimensions). It is equipped with functions for importing data in the formats of most major MRI vendors (eg, Siemens, Philips, GE, Agilent) and for exporting data into the formats of several common processing software packages (eg, LCModel, jMRUI, Tarquin). This paper introduces the FID-A software toolkit and uses examples to demonstrate its novel features, namely 1) the use of a spectral registration algorithm to carry out useful processing routines automatically, 2) automatic detection and removal of motion-corrupted scans, and 3) the ability to perform several major aspects of the MRS computational workflow from a single piece of software. This latter feature is illustrated through both high-level processing of in vivo GABA-edited MEGA-PRESS MRS data, as well as detailed quantum mechanical simulations to generate an accurate LCModel basis set for analysis of the same data. All of the described processing steps resulted in a marked improvement in spectral quality compared with unprocessed data. Fitting of MEGA-PRESS data using a customized basis set resulted in improved fitting accuracy compared with a generic MEGA-PRESS basis set. The FID-A software toolkit enables high-level processing of MRS data and accurate simulation of in vivo MRS experiments. Magn Reson Med 77:23-33, 2017. © 2015 Wiley Periodicals, Inc.
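
    FID-A itself is written in MATLAB; the Python sketch below only illustrates the core idea of spectral registration, aligning a transient to a reference by fitting a frequency and phase shift in the time domain. The synthetic FID and fit settings are placeholders, not FID-A code.

    ```python
    # Align a frequency/phase-shifted synthetic FID to a reference by least squares.
    import numpy as np
    from scipy.optimize import least_squares

    t = np.arange(2048) / 2000.0                            # 2 kHz dwell, arbitrary
    ref = np.exp(-t / 0.1) * np.exp(2j * np.pi * 40 * t)    # synthetic reference FID
    moved = ref * np.exp(2j * np.pi * 3.0 * t + 1j * 0.4)   # +3 Hz drift, +0.4 rad phase

    def resid(p, fid, ref):
        f, phi = p
        corr = fid * np.exp(-2j * np.pi * f * t - 1j * phi)  # undo the trial shift
        d = corr - ref
        return np.concatenate([d.real, d.imag])

    fit = least_squares(resid, x0=[0.0, 0.0], args=(moved, ref))
    print("estimated drift: %.2f Hz, %.2f rad" % tuple(fit.x))
    ```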

  13. HBIM and augmented information: towards a wider user community of image and range-based reconstructions

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.; Brumana, R.; Oreni, D.; Previtali, M.; Roncoroni, F.

    2015-08-01

    This paper describes a procedure for the generation of a detailed HBIM which is then turned into a model for mobile apps based on augmented and virtual reality. Starting from laser point clouds, photogrammetric data and additional information, a geometric reconstruction with a high level of detail can be carried out by considering the basic requirements of BIM projects (parametric modelling, object relations, attributes). The work aims at demonstrating that a complex HBIM can be managed in portable devices to extract useful information not only for expert operators, but also towards a wider user community interested in cultural tourism.

  14. HBIM and augmented information: towards a wider user community of image and range-based reconstructions

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2015-08-01

    Full Text Available This paper describes a procedure for the generation of a detailed HBIM which is then turned into a model for mobile apps based on augmented and virtual reality. Starting from laser point clouds, photogrammetric data and additional information, a geometric reconstruction with a high level of detail can be carried out by considering the basic requirements of BIM projects (parametric modelling, object relations, attributes. The work aims at demonstrating that a complex HBIM can be managed in portable devices to extract useful information not only for expert operators, but also towards a wider user community interested in cultural tourism.

  15. Systemic Planning: Dealing with Complexity by a Wider Approach to Planning

    DEFF Research Database (Denmark)

    Leleur, Steen

    2005-01-01

    On the basis of a new book Systemic Planning this paper addresses systems thinking and complexity in a context of planning. Specifically, renewal of planning thinking on this background is set out as so-called systemic planning (SP). The principal concern of SP is to provide principles and methodology that can be helpful for planning under circumstances characterised by complexity and uncertainty. It is argued that, compared to conventional planning – referred to as systematic planning – there is a need for a wider, more systemic approach to planning that is better suited to current real......

  16. 76 FR 19380 - Notice of Entry Into Effect of MARPOL Annex V Wider Caribbean Region Special Area

    Science.gov (United States)

    2011-04-07

    ... Effect of MARPOL Annex V Wider Caribbean Region Special Area AGENCY: Coast Guard, DHS. ACTION: Notice. SUMMARY: The Coast Guard announces the date for the entry into effect of discharge requirements from ships in the Wider Caribbean Region (WCR) special area (SA) as specified in the International Convention...

  17. The Wider Impacts of Universities: Habermas on Learning Processes and Universities

    Directory of Open Access Journals (Sweden)

    Jesper Eckhardt Larsen

    2013-06-01

    Full Text Available The discourse of reform in higher education tends to focus narrowly on employability and the relationship between higher education and the labor market. Universities as research institutions are now considered solely in the dominant discourse of innovation. This way of conceiving universities is inspired by functionalist theory that focuses on the imperatives of a knowledge economy. Taking its departure from the theory of society developed by Jürgen Habermas, this paper seeks to provide a theoretical framework for an empirical comparative analysis on the wider societal impact of universities. It is argued that the wider impacts of higher education and research at universities must be seen in a more complex vision of modern societies. The paper is thus primarily a re-reading of Habermas’ critique of functionalist views of the university and an application of Habermas’ critique to current issues in the debates on higher education. A special discussion is devoted to issues of the self, in view of the current tendencies to regard all education from the standpoint of economic outputs.

  18. Making the most of cloud storage - a toolkit for exploitation by WLCG experiments

    Science.gov (United States)

    Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea

    2017-10-01

    Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.

  19. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
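
    LVT is a separate, stand-alone system; the short Python sketch below merely illustrates the kind of traditional accuracy-based measures it reports (bias, RMSE, unbiased RMSE, correlation) for a model time series against observations, using synthetic data.

    ```python
    # Common accuracy metrics for a model series vs. observations (illustrative).
    import numpy as np

    def accuracy_metrics(model, obs):
        d = model - obs
        return {"bias": d.mean(),
                "rmse": np.sqrt((d ** 2).mean()),
                "ubrmse": np.sqrt(((d - d.mean()) ** 2).mean()),  # unbiased RMSE
                "corr": np.corrcoef(model, obs)[0, 1]}

    rng = np.random.default_rng(3)
    obs = 0.25 + 0.05 * np.sin(np.linspace(0, 6 * np.pi, 365))   # e.g. soil moisture
    model = obs + 0.02 + rng.normal(0, 0.03, obs.size)           # biased, noisy model
    print(accuracy_metrics(model, obs))
    ```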

  20. SeqKit: A Cross-Platform and Ultrafast Toolkit for FASTA/Q File Manipulation.

    Directory of Open Access Journals (Sweden)

    Wei Shen

    Full Text Available FASTA and FASTQ are basic and ubiquitous formats for storing nucleotide and protein sequences. Common manipulations of FASTA/Q files include converting, searching, filtering, deduplication, splitting, shuffling, and sampling. Existing tools only implement some of these manipulations, and not particularly efficiently, and some are only available for certain operating systems. Furthermore, the complicated installation process of required packages and running environments can render these programs less user friendly. This paper describes a cross-platform ultrafast comprehensive toolkit for FASTA/Q processing. SeqKit provides executable binary files for all major operating systems, including Windows, Linux, and Mac OSX, and can be directly used without any dependencies or pre-configurations. SeqKit demonstrates competitive performance in execution time and memory usage compared to similar tools. The efficiency and usability of SeqKit enable researchers to rapidly accomplish common FASTA/Q file manipulations. SeqKit is open source and available on Github at https://github.com/shenwei356/seqkit.
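
    SeqKit is a compiled command-line tool; the plain-Python sketch below is only meant to illustrate two of the manipulations it covers, FASTA parsing and de-duplication by sequence, and is not how SeqKit is implemented or invoked.

    ```python
    # Minimal FASTA parsing and de-duplication by sequence (illustrative only).
    def read_fasta(path):
        name, seq = None, []
        with open(path) as fh:
            for line in fh:
                line = line.rstrip()
                if line.startswith(">"):
                    if name is not None:
                        yield name, "".join(seq)
                    name, seq = line[1:], []
                else:
                    seq.append(line)
            if name is not None:
                yield name, "".join(seq)

    def dedup_by_seq(records):
        seen = set()
        for name, seq in records:
            if seq not in seen:          # keep the first record per sequence
                seen.add(seq)
                yield name, seq

    # usage: for name, seq in dedup_by_seq(read_fasta("reads.fa")): ...
    ```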

  1. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  2. Implementing a user-driven online quality improvement toolkit for cancer care.

    Science.gov (United States)

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  3. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  4. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, B. M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCaa, J. [3TIER by VAisala, Seattle, WA (United States)

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  5. Prevention literacy: community-based advocacy for access and ownership of the HIV prevention toolkit.

    Science.gov (United States)

    Parker, Richard G; Perez-Brumer, Amaya; Garcia, Jonathan; Gavigan, Kelly; Ramirez, Ana; Milnor, Jack; Terto, Veriano

    2016-01-01

    Critical technological advances have yielded a toolkit of HIV prevention strategies. This literature review sought to provide contextual and historical reflection needed to bridge the conceptual gap between clinical efficacy and community effectiveness (i.e. knowledge and usage) of existing HIV prevention options, especially in resource-poor settings. Between January 2015 and October 2015, we reviewed scholarly and grey literatures to define treatment literacy and health literacy and assess the current need for literacy related to HIV prevention. The review included searches in electronic databases including MEDLINE, PsycINFO, PubMed, and Google Scholar. Permutations of the following search terms were used: "treatment literacy," "treatment education," "health literacy," and "prevention literacy." Through an iterative process of analyses and searches, titles and/or abstracts and reference lists of retrieved articles were reviewed for additional articles, and historical content analyses of grey literature and websites were additionally conducted. Treatment literacy was a well-established concept developed in the global South, which was later partially adopted by international agencies such as the World Health Organization. Treatment literacy emerged as more effective antiretroviral therapies became available. Developed from popular pedagogy and grassroots efforts during an intense struggle for treatment access, treatment literacy addressed the need to extend access to underserved communities and low-income settings that might otherwise be excluded from access. In contrast, prevention literacy is absent in the recent surge of new biomedical prevention strategies; prevention literacy was scarcely referenced and undertheorized in the available literature. Prevention efforts today include multimodal techniques, which jointly comprise a toolkit of biomedical, behavioural, and structural/environmental approaches. However, linkages to community advocacy and mobilization

  6. Integrating surgical robots into the next medical toolkit.

    Science.gov (United States)

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  7. Upgrading the safety toolkit: Initiatives of the accident analysis subgroup

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Chung, D.Y.

    1999-01-01

    Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental toward formal evaluation of computer models, improving the pedigree on high-use computer models, and development of the user-friendly Accident Analysis Guidebook (AAG). All of these improvements have strengthened the analytical toolkit for complying with DOE orders and standards shaping safety analysis reports (SARs) and related documentation. Major support for these objectives has been through DOE/DP-45

  8. Agent-based models in economics a toolkit

    CERN Document Server

    Fagiolo, Giorgio; Gallegati, Mauro; Richiardi, Matteo; Russo, Alberto

    2018-01-01

    In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, lab...
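
    To make the idea of heterogeneous interacting agents concrete, here is a deliberately minimal agent-based model (not taken from the book): agents start with equal wealth, meet at random, and one hands a unit to the other, so inequality emerges from the interaction rule alone.

    ```python
    # A minimal agent-based exchange model: inequality from random pairwise transfers.
    import random

    def exchange_model(n_agents=500, steps=50_000, seed=0):
        random.seed(seed)
        wealth = [1] * n_agents
        for _ in range(steps):
            i, j = random.randrange(n_agents), random.randrange(n_agents)
            if wealth[i] > 0 and i != j:
                wealth[i] -= 1        # agent i gives one unit to agent j
                wealth[j] += 1
        return wealth

    w = sorted(exchange_model())
    top10_share = sum(w[-len(w) // 10:]) / sum(w)
    print(f"share of wealth held by the richest 10%: {top10_share:.2f}")
    ```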

  9. TMVA - Toolkit for Multivariate Data Analysis with ROOT Users guide

    CERN Document Server

    Höcker, A; Tegenfeldt, F; Voss, H; Voss, K; Christov, A; Henrot-Versillé, S; Jachowski, M; Krasznahorkay, A; Mahalalel, Y; Prudent, X; Speckmayer, P

    2007-01-01

    Multivariate machine learning techniques for the classification of data from high-energy physics (HEP) experiments have become standard tools in most HEP analyses. The multivariate classifiers themselves have significantly evolved in recent years, also driven by developments in other areas inside and outside science. TMVA is a toolkit integrated in ROOT which hosts a large variety of multivariate classification algorithms. They range from rectangular cut optimisation (using a genetic algorithm) and likelihood estimators, over linear and non-linear discriminants (neural networks), to sophisticated recent developments like boosted decision trees and rule ensemble fitting. TMVA organises the simultaneous training, testing, and performance evaluation of all these classifiers with a user-friendly interface, and expedites the application of the trained classifiers to the analysis of data sets with unknown sample composition.
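
    TMVA is used from ROOT/C++ macros; the sketch below only mirrors the generic workflow it automates, training a boosted-decision-tree classifier on labelled signal and background events and evaluating it, using scikit-learn on synthetic data rather than the TMVA API.

    ```python
    # Generic BDT classification workflow on synthetic signal/background events.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    signal = rng.normal(loc=1.0, size=(5000, 4))       # 4 discriminating variables
    background = rng.normal(loc=0.0, size=(5000, 4))
    X = np.vstack([signal, background])
    y = np.r_[np.ones(5000), np.zeros(5000)]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X_tr, y_tr)
    print("ROC AUC:", roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1]))
    ```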

  10. The interactive learning toolkit: technology and the classroom

    Science.gov (United States)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  11. Water Security Toolkit User Manual: Version 1.3

    Science.gov (United States)

    User manual (data product/software). The Water Security Toolkit (WST) is a suite of tools that help provide the information necessary to make good decisions resulting in the minimization of further human exposure to contaminants, and the maximization of the effectiveness of intervention strategies. WST assists in the evaluation of multiple response actions in order to select the most beneficial consequence management strategy. It includes hydraulic and water quality modeling software and optimization methodologies to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove or destroy contaminants, (5) locations in the network to take grab samples to confirm contamination or cleanup, and (6) valves to close in order to isolate contaminated areas of the network.
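
    The sensor-placement problem in item (1) can be pictured with a toy greedy-coverage example like the one below; WST couples this kind of optimization to hydraulic and water-quality simulations of an actual network, so the sketch is purely illustrative and the junction names are invented.

    ```python
    # Greedy sensor placement: choose sensors that cover the most new scenarios.
    def greedy_placement(detects, budget):
        """detects: dict sensor -> set of contamination scenarios it would detect."""
        chosen, covered = [], set()
        for _ in range(budget):
            best = max(detects, key=lambda s: len(detects[s] - covered))
            gain = detects[best] - covered
            if not gain:
                break
            chosen.append(best)
            covered |= gain
        return chosen, covered

    detects = {"J1": {1, 2, 3}, "J2": {3, 4}, "J3": {5}, "J4": {2, 4, 5, 6}}
    print(greedy_placement(detects, budget=2))   # picks J4 then J1, covering 1-6
    ```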

  12. The Exoplanet Characterization ToolKit (ExoCTK)

    Science.gov (United States)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exist within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well maintained exoplanet characterization toolkit.

  13. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    Full Text Available This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK) for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and Toolbox processing capabilities of NLTK are introduced, showing ways in which it can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
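
    Toolbox lexicons are plain text files with backslash-coded fields. The paper works with them through NLTK's Toolbox support; the standalone Python sketch below only illustrates the underlying record format, using an invented two-entry lexicon.

    ```python
    # Parse a tiny backslash-coded (Toolbox-style) lexicon into field lists.
    import io
    import re
    from collections import defaultdict

    sample = """\\lx kaa
    \\ps N
    \\ge tree
    \\lx kaakaa
    \\ps V
    \\ge to chew
    """

    def parse_toolbox(fh, record_marker="lx"):
        entries, entry = [], None
        for line in fh:
            m = re.match(r"\\(\S+)\s*(.*)", line.strip())
            if not m:
                continue
            marker, value = m.groups()
            if marker == record_marker:          # a new record starts here
                entry = defaultdict(list)
                entries.append(entry)
            if entry is not None:
                entry[marker].append(value)
        return entries

    for e in parse_toolbox(io.StringIO(sample)):
        print(e["lx"][0], e["ge"][0])
    ```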

  14. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with opportunities for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  15. Organising to Enable Innovation

    DEFF Research Database (Denmark)

    Brink, Tove

    2016-01-01

    The purpose of this conceptual paper is to reveal how organising can enable innovation across organisational layers and organisational units. This approach calls for a cross-disciplinary literature review. The aim is to provide an integrated understanding of innovation in an organisational approach....... The findings reveal a continuous organising process between individual/team creativity and organisational structures/control to enable innovation at firm level. Organising provides a dynamic approach and contains the integrated reconstruction of creativity, structures and boundaries for enhanced balance...... of explorative and exploitative learning in uncertain environments. Shedding light on the cross-disciplinary theories to organise innovation provides a contribution at the firm level to enable innovation....

  16. Multi-Stack Persistent Scatterer Interferometry Analysis in Wider Athens, Greece

    Directory of Open Access Journals (Sweden)

    Ioannis Papoutsis

    2017-03-01

    Full Text Available The wider Athens metropolitan area serves as an interesting setting for conducting geodetic studies. On the one hand, it has complex regional geotectonic characteristics with several active and blind faults, one of which produced the deadly Mw 5.9 Athens earthquake in September 1999. On the other hand, the Greek capital is heavily urbanized, and construction activities have been taking place in the last few decades to address the city's needs for advanced infrastructure. This work focuses on estimating ground velocities for the wider Athens area over a period spanning two decades, with extended spatial coverage, increased spatial sampling of the measurements, and high precision. The aim is to deliver to the community a reference geodetic database containing consistent and robust velocity estimates to support further studies for modeling and multi-hazard assessment. The analysis employs advanced persistent scatterer interferometry methods, covering Athens with both ascending and descending ERS-1, ERS-2 and Envisat Synthetic Aperture Radar data, forming six independent interferometric stacks. A methodology is developed and applied to exploit track diversity for decomposing the actual surface velocity field into its vertical and horizontal components and for coping with the post-processing of the multi-track big data. Results of the time series analysis reveal that a large area containing the Kifisia municipality experienced non-linear motion; while it had been subsiding in the period 1992–1995 (−12 mm/year), the same area has been uplifting since 2005 (+4 mm/year). This behavior is speculated to have its origin in regional water extraction activities, which, when halted, led to a physical restoration phase of the municipality. In addition, a closer look at the area affected by the 1999 earthquake shows that there were zones of counter-force horizontal movement prior to the event. Further analysis is suggested to investigate the source and tectonic
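
    The decomposition step exploiting track diversity can be pictured with a few lines of linear algebra: each track provides one line-of-sight (LOS) projection of the surface velocity, and combining ascending and descending geometries lets the vertical and east-west components be solved for, with the north-south contribution commonly neglected. The sketch below is a conceptual illustration, not the authors' processing chain; the incidence angles, headings, and sign conventions are simplified placeholders.

```python
# Conceptual sketch of LOS velocity decomposition using ascending/descending tracks.
# Not the authors' code; angles and sign conventions are simplified placeholders,
# and the north-south component is assumed negligible.
import numpy as np

def decompose_los(v_los, incidence_deg, heading_deg):
    """Solve v_los = A @ [v_up, v_east] for one pixel by least squares."""
    inc = np.radians(incidence_deg)
    head = np.radians(heading_deg)
    # Simplified LOS projection: up component cos(inc), east component -sin(inc)*cos(heading)
    A = np.column_stack([np.cos(inc), -np.sin(inc) * np.cos(head)])
    sol, *_ = np.linalg.lstsq(A, v_los, rcond=None)
    return sol  # [v_up, v_east] in the same units as v_los (e.g. mm/year)

# Example: one pixel seen from an ascending and a descending track (made-up values)
v_los = np.array([-10.2, -7.8])          # mm/year along each line of sight
incidence = np.array([23.0, 23.0])       # ERS/Envisat-like incidence angles
heading = np.array([-13.0, 193.0])       # approximate ascending/descending headings
print(decompose_los(v_los, incidence, heading))
```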

  17. The Nuclear Energy Advanced Modeling and Simulation Safeguards and Separations Reprocessing Plant Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    McCaskey, Alex [ORNL; Billings, Jay Jay [ORNL; de Almeida, Valmor F [ORNL

    2011-08-01

    This report details the progress made in the development of the Reprocessing Plant Toolkit (RPTk) for the DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. RPTk is an ongoing development effort intended to provide users with an extensible, integrated, and scalable software framework for the modeling and simulation of spent nuclear fuel reprocessing plants by enabling the insertion and coupling of user-developed physicochemical modules of variable fidelity. The NEAMS Safeguards and Separations IPSC (SafeSeps) and the Enabling Computational Technologies (ECT) supporting program element have partnered to release an initial version of the RPTk with a focus on software usability and utility. RPTk implements a data flow architecture that is the source of the system's extensibility and scalability. Data flows through physicochemical modules sequentially, with each module importing data, evolving it, and exporting the updated data to the next downstream module. This is accomplished through various architectural abstractions designed to give RPTk true plug-and-play capabilities. A simple application of this architecture, as well as RPTk data flow and evolution, is demonstrated in Section 6 with an application consisting of two coupled physicochemical modules. The remaining sections describe this ongoing work in full, from system vision and design inception to full implementation. Section 3 describes the relevant software development processes used by the RPTk development team. These processes allow the team to manage system complexity and ensure stakeholder satisfaction. This section also details the work done on the RPTk "black box" and "white box" models, with a special focus on the separation of concerns between the RPTk user interface and application runtime. Sections 4 and 5 discuss the application runtime component in more detail, and describe the dependencies, behavior, and rigorous testing of its constituent components.
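
    The data-flow pattern described above can be pictured with a short sketch: modules are interchangeable callables that import a data record, evolve it, and export it to the next module downstream. This is an illustration of the architectural idea only, not RPTk's actual interfaces; the module names and quantities are invented.

```python
# Illustration of the sequential data-flow pattern described above (not RPTk's API):
# plug-and-play physicochemical modules each import a data dictionary, evolve it,
# and export it downstream. Module names and values are hypothetical.
from typing import Callable, Dict, List

Data = Dict[str, float]
Module = Callable[[Data], Data]

def dissolver(data: Data) -> Data:
    data["dissolved_fraction"] = 0.95          # toy stand-in for a dissolution model
    return data

def solvent_extraction(data: Data) -> Data:
    data["u_recovery"] = 0.99 * data["dissolved_fraction"]
    return data

def run_pipeline(modules: List[Module], feed: Data) -> Data:
    # Swapping in a module of different fidelity changes only the module list,
    # not the framework that moves data downstream.
    for module in modules:
        feed = module(feed)
    return feed

print(run_pipeline([dissolver, solvent_extraction], {"feed_mass_kg": 100.0}))
```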

  18. A cost effective and high fidelity fluoroscopy simulator using the Image-Guided Surgery Toolkit (IGSTK)

    Science.gov (United States)

    Gong, Ren Hui; Jenkins, Brad; Sze, Raymond W.; Yaniv, Ziv

    2014-03-01

    The skills required for obtaining informative x-ray fluoroscopy images are currently acquired while trainees provide clinical care. As a consequence, trainees and patients are exposed to higher doses of radiation. Use of simulation has the potential to reduce this radiation exposure by enabling trainees to improve their skills in a safe environment prior to treating patients. We describe a low-cost, high-fidelity fluoroscopy simulation system. Our system enables operators to practice their skills using the clinical device and simulated x-rays of a virtual patient. The patient is represented using a set of temporal Computed Tomography (CT) images, corresponding to the underlying dynamic processes. Simulated x-ray images, digitally reconstructed radiographs (DRRs), are generated from the CTs using ray-casting with customizable machine-specific imaging parameters. To establish the spatial relationship between the CT and the fluoroscopy device, the CT is virtually attached to a patient phantom and a web camera is used to track the phantom's pose. The camera is mounted on the fluoroscope's intensifier and the relationship between it and the x-ray source is obtained via calibration. To control image acquisition, the operator moves the fluoroscope as in normal operation mode. Control of zoom, collimation and image save is done using a keypad mounted alongside the device's control panel. Implementation is based on the Image-Guided Surgery Toolkit (IGSTK) and uses the graphics processing unit (GPU) for accelerated image generation. Our system was evaluated by 11 clinicians and was found to be sufficiently realistic for training purposes.
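
    The spatial relationships described above amount to a chain of rigid transforms: the fixed camera-to-source calibration, the tracked phantom pose from the web camera, and the fixed CT-to-phantom attachment together place the CT volume in the x-ray source frame for DRR ray casting. The sketch below illustrates the composition with placeholder values; it is not the authors' IGSTK-based implementation.

```python
# Illustrative pose-chain composition for the simulator described above (placeholder
# values, not the authors' code): 4x4 homogeneous transforms are multiplied to obtain
# the CT pose in the x-ray source frame used for DRR generation.
import numpy as np

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# In the real system these come from calibration and live camera tracking.
source_T_camera = make_transform(np.eye(3), np.array([0.0, 0.0, -300.0]))   # calibration
camera_T_phantom = make_transform(np.eye(3), np.array([50.0, 20.0, 400.0])) # tracked pose
phantom_T_ct = make_transform(np.eye(3), np.array([-10.0, 0.0, 0.0]))       # fixed attachment

# CT pose in the x-ray source frame, used to cast rays for the simulated x-ray image.
source_T_ct = source_T_camera @ camera_T_phantom @ phantom_T_ct
print(source_T_ct)
```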

  19. Mocapy++ - a toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  20. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  1. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  2. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    Science.gov (United States)

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two-committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Field tests of a participatory ergonomics toolkit for Total Worker Health

    Science.gov (United States)

    Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2018-01-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and team-work skills of participants. PMID:28166897

  4. Evidence-based Metrics Toolkit for Measuring Safety and Efficiency in Human-Automation Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — APRIL 2016 NOTE: Principal Investigator moved to Rice University in mid-2015. Project continues at Rice with the same title (Evidence-based Metrics Toolkit for...

  5. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years, along with hourly power forecasts.
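
    For readers who want to sample the data set, NREL exposes the WIND Toolkit through its HSDS service, which can be read with the h5pyd package. The sketch below follows NREL's published access pattern, but the domain path, dataset names, and grid indices should be treated as assumptions to verify against the current documentation; an NREL API key and h5pyd configuration are also required.

```python
# Minimal sketch of reading a WIND Toolkit time series via NREL's HSDS service.
# The domain path and dataset name follow NREL's published examples but are
# assumptions to verify; grid indices are arbitrary placeholders, and h5pyd must
# be configured with an API endpoint and key beforehand.
import h5pyd

with h5pyd.File("/nrel/wtk-us.h5", "r") as f:
    wind_speed = f["windspeed_100m"]        # 5-minute wind speed at 100 m hub height
    # Slice a single grid cell for the first week (7 days * 24 h * 12 samples/h).
    series = wind_speed[0:2016, 800, 1500]
    print(series.mean(), "m/s average over the first week at this grid point")
```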

  6. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    Science.gov (United States)

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were complemented with profile measurements made by a profilometer.

  7. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    Science.gov (United States)

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals, and between 35% and 60% of new graduates change workplace during the first year. Critical to organizational success, first-line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first-line nursing managers. The toolkit includes a framework built from the conceptual building blocks of planning tools, manager interventions, retention and recruitment, and professional practice models. The development of the toolkit involved conducting a review of the literature for best practices in nursing human resource planning, using a mixed-methods approach to data collection (a survey and extensive interviews of managers), and completing a comprehensive scan of human resource practices in the participating organizations. This paper will provide an overview of the process used to develop the toolkit, a description of the toolkit contents, and a reflection on the outcomes of the project.

  8. Enabling Persistent Peace After Negotiated Settlements

    Science.gov (United States)

    2016-12-01

    [Search-result snippet; only citation fragments are recoverable:] Miroslav Feix, "Game Theory Toolkit and Workbook for Defense Analysis Students" (master's thesis, Naval...).

  9. The Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, Tina; Slaug, Bjørn; Brandt, Åse

    2010-01-01

    This study addresses development of a content valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients and their home environments. The instrument was translated...... from the original Swedish version of the Housing Enabler, and adapted according to accessibility norms and guidelines for housing design in Sweden, Denmark, Finland and Iceland. This iterative process involved occupational therapists, architects, building engineers and professional translators......, resulting in the Nordic Housing Enabler. For reliability testing, the sampling strategy and data collection procedures used were the same in all countries. Twenty voluntary occupational therapists, pair-wise but independently from each other, collected data from 106 cases by means of the Nordic Housing...

  10. Pilot project as enabler?

    DEFF Research Database (Denmark)

    Neisig, Margit; Glimø, Helle; Holm, Catrine Granzow

    This article deals with a systemic perspective on transition. The field of study addressed is a pilot project as an enabler of transition in a highly complex polycentric context. From a Luhmannian systemic approach, a framework is created to understand and address barriers to change that occur when using...... pilot projects as enablers of transition. Aspects of how to create trust and deal with distrust during a transition are addressed. The transition in focus is the concept of New Public Management and how it is applied in the management of the Employment Service in Denmark. The transition regards...

  11. The Montage Image Mosaic Toolkit As A Visualization Engine.

    Science.gov (United States)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has since 2003 been used to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts of large images and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: (1) creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created; (2) integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to JavaScript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed; and (3) creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453. JS
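
    A typical way to exercise the mosaic-then-visualize functionality described above is to drive the Montage command-line programs from a short script. The sketch below follows the standard Montage tutorial sequence; the program names are real Montage modules, but the exact options, in particular the mViewer stretch arguments, are assumptions to check against the Montage documentation, and 'rawdir' and 'projdir' are placeholder directories.

```python
# Sketch of the classic Montage mosaic chain driven from Python. Program names
# follow the standard Montage tutorial; option strings (especially for mViewer)
# are assumptions to verify against the documentation. 'rawdir' holds the input
# FITS images and 'projdir' receives the reprojected ones.
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["mImgtbl", "rawdir", "images.tbl"])                       # catalogue input images
run(["mMakeHdr", "images.tbl", "template.hdr"])                # build a common header
run(["mProjExec", "-p", "rawdir", "images.tbl",
     "template.hdr", "projdir", "stats.tbl"])                  # reproject to that grid
run(["mImgtbl", "projdir", "pimages.tbl"])
run(["mAdd", "-p", "projdir", "pimages.tbl",
     "template.hdr", "mosaic.fits"])                           # co-add into a mosaic
run(["mViewer", "-gray", "mosaic.fits", "-1s", "max",
     "gaussian-log", "-out", "mosaic.png"])                    # adaptive-stretch PNG
```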

  12. Socio-technical and organizational challenges to wider e-Health implementation.

    Science.gov (United States)

    Vitacca, M; Mazzù, M; Scalvini, S

    2009-01-01

    Recent advances in information communication technology allow contact with patients at home through e-Health services (telemedicine, in particular). We provide insights into the state of the art of e-Health and telemedicine for possible wider future clinical use. Telemedicine opportunities are summarized as i) home telenursing, ii) electronic transfer to specialists and hospitals, iii) teleconsulting between general practitioners and specialists and iv) call centre activities and online health. At present, a priority action of the EU is the Initiative on telemedicine (TM) for chronic disease management, such as home health monitoring, and the future Vision for Europe 2020 is based on the development of Integrated Telemedicine Services. There are pros and cons in e-Health and telemedicine. Benefits can be classified as benefits for i) citizens, patients and caregivers and ii) health care provider organizations. Institutions and individuals that play key roles in the future of e-Health are doctors, patients and hospitals, while the whole system should be improved at three crucial levels: 1) organizational, 2) regulatory and 3) technological. Quality, access and efficiency are the general key issues for the success of e-Health and telemedicine implementation. The real technology is the human resources available within the organizations. For e-Health and telemedicine to grow, it will be necessary to investigate their long-term efficacy, cost effectiveness, possible improvement in quality of life and impact on public health burden.

  13. Is a wider angle of the membranous urethra associated with incontinence after radical prostatectomy?

    Science.gov (United States)

    Soljanik, Irina; Bauer, Ricarda M; Becker, Armin J; Stief, Christian G; Gozzi, Christian; Solyanik, Olga; Brocker, Kerstin A; Kirchhoff, Sonja M

    2014-12-01

    To investigate whether differences in the anatomy and dynamics of the pelvic floor (PF) in patients after radical prostatectomy (RP) depicted on magnetic resonance imaging (MRI) are associated with continence status. In this prospectively designed study, 24 patients with post-prostatectomy stress urinary incontinence were enrolled. Additionally, 10 continent patients after RP were matched for age, body mass index and perioperative parameters. All patients underwent continence assessment and MRI (TrueFISP sequence; TR 4.57 ms; TE 2.29 ms; slice thickness 7 mm; FOV 270 mm) 12 months after RP. Images were analyzed for membranous urethra length (MUL), angle of the membranous urethra (AMU), severity of periurethral/urethral fibrosis, lifting of the levator ani muscle, lowering of the posterior bladder wall (BPW), bladder neck (BN) and external urinary sphincter (EUS), and symphyseal rotation of these structures during the Valsalva maneuver and voiding. Compared to continent controls, incontinent patients showed a significantly wider AMU during voiding (p = 0.002) and more pronounced lowering of the BN and EUS (p urethra as a result of anchoring of the BN and EUS in the PF appears to be an important functional factor with an essential impact on continence after RP. Functional MRI seems to be a helpful imaging tool for morphologic and dynamic evaluation of the PF.

  14. Ambient air quality at the wider area of an industrial mining facility at Stratoni, Chalkidiki, Greece.

    Science.gov (United States)

    Gaidajis, Georgios; Angelakoglou, Komninos; Gazea, Emmy

    2012-01-01

    To assess ambient air quality in the wider area of a mining-industrial facility in Chalkidiki, Greece, particulate matter with an aerodynamic diameter of 10 μm or less (PM10) and its content in characteristic elements, i.e., As, Cd, Cu, Fe, Mn, Pb, Zn, were monitored for a period of three years (2008-2010). Gravimetric air samplers were employed for the particulate matter sampling at three sampling stations located in the immediate vicinity of the industrial facility and at a neighbouring residential site. Monitoring data indicated that the 3-year median PM10 concentrations were 23.3 μg/m³ at the residential site close to the facility and 28.7 μg/m³ at the site within the facility, indicating a minimal influence from the industrial activities on the air quality of the neighbouring residential area. Both annual average and median PM10 concentration levels were below the indicative European standards, whereas similar spatial and temporal variation was observed for the PM10 constituents. The average Pb concentrations measured for the three sampling sites were 0.2, 0.146 and 0.174 μg/m³, respectively, well below the indicative limit of 0.5 μg/m³. The quantitative and qualitative comparison of PM10 concentrations and its elemental constituents for the three sampling stations did not indicate any direct influence of the mining-industrial activities on the air quality of the Stratoni residential area.

  15. Enabling distributed collaborative science

    DEFF Research Database (Denmark)

    Hudson, T.; Sonnenwald, Diane H.; Maglaughlin, K.

    2000-01-01

    To enable collaboration over distance, a collaborative environment that uses a specialized scientific instrument called a nanoManipulator is evaluated. The nanoManipulator incorporates visualization and force feedback technology to allow scientists to see, feel, and modify biological samples bein...

  16. The Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, T.; Nygren, C.; Slaug, B.

    2014-01-01

    This study addresses development of a content-valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients, and their home environments. The instrument was transla......This study addresses development of a content-valid cross-Nordic version of the Housing Enabler and investigation of its inter-rater reliability when used in occupational therapy rating situations, involving occupational therapists, clients, and their home environments. The instrument...... was translated from the original Swedish version of the Housing Enabler, and adapted according to accessibility norms and guidelines for housing design in Sweden, Denmark, Finland, and Iceland. This iterative process involved occupational therapists, architects, building engineers, and professional translators......, resulting in the Nordic Housing Enabler. For reliability testing, the sampling strategy and data collection procedures used were the same in all countries. Twenty voluntary occupational therapists, pair-wise but independently of each other, collected data from 106 cases by means of the Nordic Housing...

  17. Web-Altairis: An Internet-Enabled Ground System

    Science.gov (United States)

    Miller, Phil; Coleman, Jason; Gemoets, Darren; Hughes, Kevin

    2000-01-01

    This paper describes Web-Altairis, an Internet-enabled ground system software package funded by the Advanced Automation and Architectures Branch (Code 588) of NASA's Goddard Space Flight Center. Web-Altairis supports the trend towards "lights out" ground systems, where the control center is unattended and problems are resolved by remote operators. This client/server software runs on most popular platforms and provides for remote data visualization using the rich functionality of the VisAGE toolkit. Web-Altairis also supports satellite commanding over the Internet. This paper describes the structure of Web-Altairis and VisAGE, the underlying technologies, the provisions for security, and our experiences in developing and testing the software.

  18. A Gateway MultiSite recombination cloning toolkit.

    Directory of Open Access Journals (Sweden)

    Lena K Petersen

    Full Text Available The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two, three, and four fragment Gateway MultiSite recombination cloning whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org).

  19. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible, one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis script part. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware, which seamlessly integrates git repository management systems such as GitHub or GitLab, Docker, and Jupyter, helping with a) sharing the results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch, and could start contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.

  20. Using the Model Coupling Toolkit to couple earth system models

    Science.gov (United States)

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

    Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. Therefore there is a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system would entail model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library, ROMS, SWAN and COAMPS models, methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. Methods presented in this paper are clearly applicable for coupling of other types of models. © 2008 Elsevier Ltd. All rights reserved.
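
    The coupling pattern described above, components advancing independently and exchanging fields only at synchronization intervals, can be illustrated with a toy example. The sketch below is conceptual only; it does not use the MCT API (MCT itself is a Fortran library), and the toy ocean/atmosphere physics and all numbers are invented.

```python
# Conceptual illustration of interval-based model coupling (not the MCT API):
# two toy components integrate their own state and exchange fields every
# synchronization interval. All physics and numbers are made up.
class ToyOcean:
    def __init__(self):
        self.sst = 15.0                       # sea-surface temperature, degC
    def advance(self, dt, wind_stress):
        self.sst += 0.01 * wind_stress * dt   # toy response to atmospheric forcing
        return self.sst

class ToyAtmosphere:
    def __init__(self):
        self.wind_stress = 0.1                # N/m^2
    def advance(self, dt, sst):
        self.wind_stress = 0.1 + 0.001 * (sst - 15.0)  # toy response to ocean state
        return self.wind_stress

ocean, atmos = ToyOcean(), ToyAtmosphere()
t, t_end, sync = 0.0, 3600.0, 600.0           # couple every 10 minutes for 1 hour
while t < t_end:
    # In a real coupled system each component runs on its own processor set and
    # the coupler handles distributed-memory transfer and regridding of fields.
    sst = ocean.advance(sync, atmos.wind_stress)
    atmos.advance(sync, sst)
    t += sync
print(ocean.sst, atmos.wind_stress)
```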

  1. VariVis: a visualisation toolkit for variation databases

    Directory of Open Access Journals (Sweden)

    Smith Timothy D

    2008-04-01

    Full Text Available Background: With the completion of the Human Genome Project and recent advancements in mutation detection technologies, the volume of data available on genetic variations has risen considerably. These data are stored in online variation databases and provide important clues to the cause of diseases and potential side effects or resistance to drugs. However, the data presentation techniques employed by most of these databases make them difficult to use and understand. Results: Here we present a visualisation toolkit that can be employed by online variation databases to generate graphical models of gene sequence with corresponding variations and their consequences. The VariVis software package can run on any web server capable of executing Perl CGI scripts and can interface with numerous Database Management Systems and "flat-file" data files. VariVis produces two easily understandable graphical depictions of any gene sequence and matches these with variant data. While developed with the goal of improving the utility of human variation databases, the VariVis package can be used in any variation database to enhance utilisation of, and access to, critical information.

  2. Toolkit for data reduction to tuples for the ATLAS experiment

    International Nuclear Information System (INIS)

    Snyder, Scott; Krasznahorkay, Attila

    2012-01-01

    The final step in a HEP data-processing chain is usually to reduce the data to a ‘tuple’ form which can be efficiently read by interactive analysis tools such as ROOT. Often, this is implemented independently by each group analyzing the data, leading to duplicated effort and needless divergence in the format of the reduced data. ATLAS has implemented a common toolkit for performing this processing step. By using tools from this package, physics analysis groups can produce tuples customized for a particular analysis but which are still consistent in format and vocabulary with those produced by other physics groups. The package is designed so that almost all the code is independent of the specific form used to store the tuple. The code that does depend on this is grouped into a set of small backend packages. While the ROOT backend is the most used, backends also exist for HDF5 and for specialized databases. By now, the majority of ATLAS analyses rely on this package, and it is an important contributor to the ability of ATLAS to rapidly analyze physics data.

  3. First responder tracking and visualization for command and control toolkit

    Science.gov (United States)

    Woodley, Robert; Petrov, Plamen; Meisinger, Roger

    2010-04-01

    In order for First Responder Command and Control personnel to visualize incidents at urban building locations, DHS sponsored a small business research program to develop a tool to visualize 3D building interiors and movement of First Responders on site. 21st Century Systems, Inc. (21CSI), has developed a toolkit called Hierarchical Grid Referenced Normalized Display (HiGRND). HiGRND utilizes three components to provide a full spectrum of visualization tools to the First Responder. First, HiGRND visualizes the structure in 3D. Utilities in the 3D environment allow the user to switch between views (2D floor plans, 3D spatial, evacuation routes, etc.) and manually edit fast changing environments. HiGRND accepts CAD drawings and 3D digital objects and renders these in the 3D space. Second, HiGRND has a First Responder tracker that uses the transponder signals from First Responders to locate them in the virtual space. We use the movements of the First Responder to map the interior of structures. Finally, HiGRND can turn 2D blueprints into 3D objects. The 3D extruder extracts walls, symbols, and text from scanned blueprints to create the 3D mesh of the building. HiGRND increases the situational awareness of First Responders and allows them to make better, faster decisions in critical urban situations.

  4. Toward a VPH/Physiome ToolKit.

    Science.gov (United States)

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative.Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive.In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow.Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  5. Rapid parameterization of small molecules using the Force Field Toolkit.

    Science.gov (United States)

    Mayne, Christopher G; Saam, Jan; Schulten, Klaus; Tajkhorshid, Emad; Gumbart, James C

    2013-12-15

    The inability to rapidly generate accurate and robust parameters for novel chemical matter continues to severely limit the application of molecular dynamics simulations to many biological systems of interest, especially in fields such as drug discovery. Although the release of generalized versions of common classical force fields, for example, General Amber Force Field and CHARMM General Force Field, have posited guidelines for parameterization of small molecules, many technical challenges remain that have hampered their wide-scale extension. The Force Field Toolkit (ffTK), described herein, minimizes common barriers to ligand parameterization through algorithm and method development, automation of tedious and error-prone tasks, and graphical user interface design. Distributed as a VMD plugin, ffTK facilitates the traversal of a clear and organized workflow resulting in a complete set of CHARMM-compatible parameters. A variety of tools are provided to generate quantum mechanical target data, setup multidimensional optimization routines, and analyze parameter performance. Parameters developed for a small test set of molecules using ffTK were comparable to existing CGenFF parameters in their ability to reproduce experimentally measured values for pure-solvent properties (<15% error from experiment) and free energy of solvation (±0.5 kcal/mol from experiment). Copyright © 2013 Wiley Periodicals, Inc.

  6. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    Energy Technology Data Exchange (ETDEWEB)

    Arguello, Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
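
    The interplay between the TMO search and the PRM reliability simulation can be sketched as an evolutionary loop: propose candidate designs, score them with a reliability simulation, and refine the best performers. The code below is an illustration of that workflow only, not the Microgrid Design Toolkit's models; the design variables, dispatch proxy, and all numbers are invented.

```python
# Conceptual sketch of a design search coupled to a reliability simulation
# (illustration only, not MDT's TMO/PRM models). A candidate design is a tuple
# (pv_kw, battery_kwh, diesel_kw); the "simulation" is a crude toy.
import random

random.seed(1)

def simulate_reliability(design, hours=1000):
    """Toy stand-in for a reliability model: fraction of hours load is served."""
    pv_kw, battery_kwh, diesel_kw = design
    served = 0
    for _ in range(hours):
        load = random.uniform(50, 150)                      # kW demand
        pv = pv_kw * max(0.0, random.gauss(0.3, 0.2))       # kW available from PV
        supply = pv + diesel_kw + battery_kwh / 10.0        # crude dispatch proxy
        served += supply >= load
    return served / hours

def evolve(generations=20, pop_size=10):
    pop = [(random.uniform(0, 200), random.uniform(0, 400), random.uniform(0, 150))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=simulate_reliability, reverse=True)
        parents = scored[: pop_size // 2]
        # Mutate the best designs to form the next generation.
        pop = parents + [tuple(max(0.0, x + random.gauss(0, 10))
                               for x in random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=simulate_reliability)

print(evolve())
```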

  7. A cosmology forecast toolkit — CosmoLib

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhiqi, E-mail: zqhuang@cita.utoronto.ca [CEA, Institut de Physique Théorique, Orme des Merisiers, Saint-Aubin, 91191 Gif-sur-Yvette Cédex (France)

    2012-06-01

    The package CosmoLib is a combination of a cosmological Boltzmann code and a simulation toolkit to forecast the constraints on cosmological parameters from future observations. In this paper we describe the released linear-order part of the package. We discuss the stability and performance of the Boltzmann code, which is written in Newtonian gauge and includes dark energy perturbations. In CosmoLib the integrator that computes the CMB angular power spectrum is optimized for an l-by-l brute-force integration, which is useful for studying inflationary models predicting sharp features in the primordial power spectrum of metric fluctuations. As an application, CosmoLib is used to study the axion monodromy inflation model that predicts cosine oscillations in the primordial power spectrum. In contrast to the previous studies by Aich et al. and Meerburg et al., we found no detection or hint of the oscillations. We pointed out that the CAMB code modified by Aich et al. does not have sufficient numerical accuracy. CosmoLib and its documentation are available at http://www.cita.utoronto.ca/∼zqhuang/CosmoLib.

  8. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    Science.gov (United States)

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. To describe characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use and assess whether such resources were beneficial for users. Deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  9. Wieder, wider, weiden: casos de parodia y autoparodia en la narrativa Roberto Bolaño

    Directory of Open Access Journals (Sweden)

    Felipe Adrián Ríos Baeza

    2014-07-01

    Full Text Available Although the term parody has been studied and marked as a lost genre of antiquity, contemporary literature turns it into a creative procedure thanks to one of its most interesting facets: reiteration. In a novel that is decisive for his overall literary project, Estrella distante (1996), the Chilean writer Roberto Bolaño (1953-2003) appears to be encoding his work with this key. That is, the novel would not only narrate, in textual terms, the story of a Chilean Air Force pilot named Carlos Wieder who is at once an artist and a murderer; in transtextual terms, and according to the explanation of Bibiano O'Ryan (one of the characters), the surname Wieder would be associated with a certain recurrence in Bolaño's literature: that of parody as its very axis, which is simultaneously mockery and reiteration. Wieder, wider, weiden: to say again, to say against, and to say perversely. The phonetic association is also a creative association. For Bolaño, the space of parody allows him to reiterate, refute, and pervert certain acts committed by his characters, all of them placed in a kind of "eternal return" that makes them, again and again, kill, write, make love, read, speak, think. A hypothesis to be tested is proposed here: only in retelling, in the repetition of a text in another context, in parody, can the emphases and underlinings that Bolaño wishes to make conspicuous in his literary project really be appreciated.

  10. Spatially enabled land administration

    DEFF Research Database (Denmark)

    Enemark, Stig

    2006-01-01

    enabling of land administration systems managing tenure, valuation, planning, and development will allow the information generated by these activities to be much more useful. Also, the services available to private and public sectors and to community organisations should commensurably improve. Knowledge....... In other words: Good governance and sustainable development are not attainable without sound land administration or, more broadly, sound land management. The paper presents a land management vision that incorporates the benefits of ICT-enabled land administration functions. The idea is that spatial...... the communication between administrative systems and also establish more reliable data due to the use of the original data instead of copies. In Denmark, such governmental guidelines for a service-oriented IT architecture in support of e-government have recently been adopted. Finally, the paper presents the role of FIG...

  11. Nordic Housing Enabler

    DEFF Research Database (Denmark)

    Helle, Tina; Brandt, Åse

    Development and reliability testing of the Nordic Housing Enabler – an instrument for accessibility assessment of the physical housing. Tina Helle & Åse Brandt, University of Lund, Health Sciences, Faculty of Medicine (SE), and University College Northern Jutland, Occupational Therapy department (DK); Danish Centre for Assistive Technology. For decades, accessibility to the physical housing environment for people with functional limitations has been of interest politically, professionally and for the users. Guidelines and norms on accessible housing design have gradually been developed......, however, the built environment shows serious deficits when it comes to accessibility. This study addresses development of a content valid cross-Nordic version of the Housing Enabler and investigation of inter-rater reliability, when used in occupational therapy practice. The instrument was translated from...

  12. Role of the anesthesiologist in the wider governance of healthcare and health economics.

    Science.gov (United States)

    Martin, Janet; Cheng, Davy

    2013-09-01

    Healthcare resources will always be limited, and as a result, difficult decisions must be made about how to allocate limited resources across unlimited demands in order to maximize health gains per resource expended. Governments and hospitals now facing severe financial deficits recognize that re-engagement of physicians is central to their ability to contain runaway healthcare costs. Health economic analysis provides tools and techniques to assess which investments in healthcare provide good value for money vs which options should be forgone. Robust decision-making in healthcare requires objective consideration of evidence in order to balance clinical and economic benefits vs risks. Surveys of the literature reveal very few economic analyses related to anesthesia and perioperative medicine despite increasing recognition of the need. Now is an opportune time for anesthesiologists to become familiar with the tools and methodologies of health economics in order to facilitate and lead robust decision-making in quality-based procedures. For most technologies used in anesthesia and perioperative medicine, the responsibility to determine cost-effectiveness falls to those tasked with the governance and stewardship of limited resources for unlimited demands using best evidence plus economics at the local, regional, and national levels. Applicable cost-effectiveness, cost-utility, and cost-benefit analyses in health economics are reviewed in this article, with clinical examples from anesthesia. Anesthesiologists can make a difference in the wider governance of healthcare and health economics if we advance our knowledge and skills beyond the technical to address the "other" dimensions of decision-making, most notably the economic aspects of a value-based healthcare system.

  13. Visions, beliefs, and transformation: exploring cross-sector and transboundary dynamics in the wider Mekong region

    Directory of Open Access Journals (Sweden)

    Alex Smajgl

    2015-06-01

    Full Text Available Policy and investment decisions in highly connected, developing regions can have implications that extend beyond their initial objectives of national development and poverty reduction. Local-level decisions that aim to promote trajectories toward desirable futures are often transformative, unexpectedly altering factors that are determined at higher regional levels. The converse also applies. The ability to realize desirable local futures diminishes if decision-making processes are not coordinated with other influential governance and decision levels. Providing effective support across multiple levels of decision making in a connected, transformative environment requires (a) identification and articulation of desired outcomes at the relevant levels of decision making, (b) improved understanding of complex cross-scale interactions that link to potentially transforming decisions, and (c) learning among decision makers and decision influencers. Research implemented through multiple participatory modalities can facilitate such relevant system learning to contribute to sustainable adaptation pathways. We test application of a systematic policy engagement framework, the Challenge and Reconstruct Learning or ChaRL framework, on a set of interdependent development decisions in the Mekong region. The analysis presented here is focused on the implementations of the ChaRL process in the Nam Ngum River Basin, Lao People's Democratic Republic and the Tonle Sap Lake and environs, Cambodia to exemplify what cross-scale and cross-sectoral insights were generated to inform decision-making processes in the wider Mekong region. The participatory process described aligns the facilitated development of scenarios articulating shared future visions at local and regional levels with agent-based simulations and facilitates learning by contrasting desired outcomes with likely, potentially maladaptive outcomes.

  14. Enabling Wind Power Nationwide

    Energy Technology Data Exchange (ETDEWEB)

    Jose Zayas, Michael Derby, Patrick Gilman and Shreyas Ananthan,

    2015-05-01

    Leveraging this experience, the U.S. Department of Energy’s (DOE’s) Wind and Water Power Technologies Office has evaluated the potential for wind power to generate electricity in all 50 states. This report analyzes and quantifies the geographic expansion that could be enabled by accessing higher above ground heights for wind turbines and considers the means by which this new potential could be responsibly developed.

  15. Use of Remote Sensing Data to Enhance NWS Storm Damage Toolkit

    Science.gov (United States)

    Jedlove, Gary J.; Molthan, Andrew L.; White, Kris; Burks, Jason; Stellman, Keith; Smith, Mathew

    2012-01-01

    In the wake of a natural disaster such as a tornado, the National Weather Service (NWS) is required to provide a very detailed and timely storm damage assessment to local, state and federal homeland security officials. The Post-Storm Data Acquisition (PSDA) procedure involves the acquisition and assembly of highly perishable data necessary for accurate post-event analysis and potential integration into a geographic information system (GIS) available to its end users and associated decision makers. Information gained from the process also enables the NWS to increase its knowledge of extreme events, learn how to better use existing equipment, improve NWS warning programs, and provide accurate storm intensity and damage information to the news media and academia. To help collect and manage all of this information, forecasters in NWS Southern Region are currently developing a Storm Damage Assessment Toolkit (SDAT), which incorporates GIS-capable phones and laptops into the PSDA process by tagging damage photography, location, and storm damage details with GPS coordinates for aggregation within the GIS database. However, this tool alone does not fully integrate radar and ground-based storm damage reports, nor does it help to identify undetected storm damage regions. In many cases, information on storm damage location (beginning and ending points, swath width, etc.) from ground surveys is incomplete or difficult to obtain. Geographic factors (terrain and limited roads in rural areas), manpower limitations, and other logistical constraints often prevent the gathering of a comprehensive picture of tornado or hail damage, and may allow damage regions to go undetected. Molthan et al. (2011) have shown that high resolution satellite data can provide additional valuable information on storm damage tracks to augment this database. This paper presents initial development to integrate satellite-derived damage track information into the SDAT for near real-time use by forecasters.

  16. A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines

    Directory of Open Access Journals (Sweden)

    Cieślik Marcin

    2011-02-01

    Full Text Available Abstract Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing the user to tune the trade-off between parallelism and lazy evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain-specific data containers (e.g., for biomolecular sequences, alignments, and structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and
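
    The dataflow idea described above can be illustrated with a minimal, generic Python sketch: two user-written functions are chained into a two-stage pipeline and evaluated over a pool of worker processes. This is not PaPy's actual API (which should be taken from the project documentation); the function names and sample data are hypothetical.

      # Generic dataflow-style pipeline sketch (hypothetical; not the PaPy API).
      from multiprocessing import Pool

      def parse_record(line):
          # Stage 1: turn a raw tab-separated line into a (name, sequence) tuple.
          name, seq = line.strip().split("\t")
          return name, seq

      def gc_content(item):
          # Stage 2: compute the GC fraction of one parsed record.
          name, seq = item
          return name, sum(base in "GCgc" for base in seq) / max(len(seq), 1)

      if __name__ == "__main__":
          lines = ["seqA\tGGCCAT", "seqB\tATATAT"]
          with Pool(processes=2) as pool:
              parsed = pool.map(parse_record, lines)   # first map stage
              results = pool.map(gc_content, parsed)   # second map stage
          print(dict(results))                         # e.g. {'seqA': 0.666..., 'seqB': 0.0}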

  17. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    Science.gov (United States)

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    Directory of Open Access Journals (Sweden)

    Hutchison Geoffrey R

    2008-12-01

    Full Text Available Abstract Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit.
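
    As a brief, hedged illustration of the common interface described above, the sketch below parses a SMILES string with the OpenBabel wrapper and hands the same molecule to the RDKit wrapper; the module and method names follow the usage shown in the Cinfony paper and should be verified against the current documentation.

      # Sketch of combining toolkits through Cinfony's common interface
      # (names follow the Cinfony paper; verify against current docs).
      from cinfony import pybel, rdk

      mol = pybel.readstring("smi", "CCO")   # parse ethanol with OpenBabel
      print(mol.molwt)                       # descriptor computed by OpenBabel
      rdmol = rdk.Molecule(mol)              # convert to an RDKit-backed molecule
      print(rdmol.write("smi"))              # write SMILES back out via the RDKit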

  19. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    Directory of Open Access Journals (Sweden)

    Morley Chris

    2008-03-01

    Full Text Available Abstract Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
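
    The following short sketch shows the kind of Pybel usage the abstract describes: reading molecules from SMILES, computing simple descriptors and fingerprints, and writing a file. The calls follow the published Pybel API; the import path differs in recent Open Babel releases, and the output file name is hypothetical.

      # Minimal Pybel sketch (import may be `from openbabel import pybel` in newer releases).
      import pybel

      mol = pybel.readstring("smi", "c1ccccc1O")      # phenol from a SMILES string
      print(mol.molwt, mol.formula)                   # simple descriptors
      other = pybel.readstring("smi", "c1ccccc1N")    # aniline, for comparison
      print(mol.calcfp() | other.calcfp())            # Tanimoto similarity of the two fingerprints
      mol.write("sdf", "phenol.sdf", overwrite=True)  # write to a (hypothetical) file path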

  20. The development of an artificial organic networks toolkit for LabVIEW.

    Science.gov (United States)

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent use of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  1. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  2. Multimethod evaluation of the VA's peer-to-peer Toolkit for patient-centered medical home implementation.

    Science.gov (United States)

    Luck, Jeff; Bowman, Candice; York, Laura; Midboe, Amanda; Taylor, Thomas; Gale, Randall; Asch, Steven

    2014-07-01

    Effective implementation of the patient-centered medical home (PCMH) in primary care practices requires training and other resources, such as online toolkits, to share strategies and materials. The Veterans Health Administration (VA) developed an online Toolkit of user-sourced tools to support teams implementing its Patient Aligned Care Team (PACT) medical home model. To present findings from an evaluation of the PACT Toolkit, including use, variation across facilities, effect of social marketing, and factors influencing use. The Toolkit is an online repository of ready-to-use tools created by VA clinic staff that physicians, nurses, and other team members may share, download, and adopt in order to more effectively implement PCMH principles and improve local performance on VA metrics. Multimethod evaluation using: (1) website usage analytics, (2) an online survey of the PACT community of practice's use of the Toolkit, and (3) key informant interviews. Survey respondents were PACT team members and coaches (n = 544) at 136 VA facilities. Interview respondents were Toolkit users and non-users (n = 32). For survey data, multivariable logistic models were used to predict Toolkit awareness and use. Interviews and open-text survey comments were coded using a "common themes" framework. The Consolidated Framework for Implementation Research (CFIR) guided data collection and analyses. The Toolkit was used by 6,745 staff in the first 19 months of availability. Among members of the target audience, 80 % had heard of the Toolkit, and of those, 70 % had visited the website. Tools had been implemented at 65 % of facilities. Qualitative findings revealed a range of user perspectives from enthusiastic support to lack of sufficient time to browse the Toolkit. An online Toolkit to support PCMH implementation was used at VA facilities nationwide. Other complex health care organizations may benefit from adopting similar online peer-to-peer resource libraries.

  3. Continental Subduction: Mass Fluxes and Interactions with the Wider Earth System

    Science.gov (United States)

    Cuthbert, S. J.

    2011-12-01

    Substantial parts of ultra-high pressure (UHP) terrains probably represent subducted passive continental margins (PCM). This contribution reviews and synthesises research on processes operating in such systems and their implications for the wider Earth system. PCM sediments are large repositories of volatiles including hydrates, nitrogen species, carbonates and hydrocarbons. Sediments and upper/mid-crustal basement are rich in incompatible elements and are fertile for melting. Lower crust may be more mafic and refractory. Juvenile rift-related mafic rocks also have the potential to generate substantial volumes of granitoid melts, especially if they have been hydrated. Exposed UHP terrains demonstrate the return of continental crust from mantle depths, show evidence for substantial fluxes of aqueous fluid, anatexis and, in entrained orogenic peridotites, metasomatism of mantle rocks by crust-derived C-O-H fluids. However, substantial bodies of continental material may never return to the surface as coherent masses of rock, but remain sequestered in the mantle where they melt or become entrained in the deeper mantle circulation. Hence, during subduction, PCMs become partitioned by a range of mechanisms. Mechanical partitioning strips away weaker sediment and middle/upper crust, which circulate back up the subduction channel, while denser, stronger transitional pro-crust and lower crust may "stall" near the base of the lithosphere or be irreversibly subducted to join the global mantle circulation. Under certain conditions sediment and upper crustal basement may reach depths for UHPM. Further partitioning takes place by anatexis, which either aids stripping and exhumation of the more melt-prone rock-masses through mechanical softening, or separates melt from residuum so that melt escapes and is accreted to the upper plate leading to "undercrusting", late-orogenic magmatism and further refinement of the crust. Melt that traverses sections of mantle will interact with

  4. EnableATIS strategy assessment.

    Science.gov (United States)

    2014-02-01

    Enabling Advanced Traveler Information Systems (EnableATIS) is the traveler information component of the Dynamic Mobility Application (DMA) program. The objective of the EnableATIS effort is to foster transformative traveler information application...

  5. Enabling Digital Literacy

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne

    2010-01-01

    There are some tensions between high-level policy definitions of “digital literacy” and actual teaching practice. We need to find workable definitions of digital literacy; obtain a better understanding of what digital literacy might look like in practice; and identify pedagogical approaches, which...... support teachers in designing digital literacy learning. We suggest that frameworks such as Problem Based Learning (PBL) are approaches that enable digital literacy learning because they provide good settings for engaging with digital literacy. We illustrate this through analysis of a case. Furthermore......, these operate on a meso-level mediating between high-level concepts of digital literacy and classroom practice....

  6. CtOS Enabler

    OpenAIRE

    Crespo Cepeda, Rodrigo; El Yamri El Khatibi, Meriem; Carrera García, Juan Manuel

    2015-01-01

    Smart Cities are, without a doubt, the near future of technology, one we move closer to every day, as can be seen in the abundance of mobile devices among the population, which computerise everyday life through the use of geolocation and information. With CtOS Enabler we aim to bring these two areas together, creating a usage standard that encompasses all Smart City systems and makes it easier for developers of such software to create new tools. ...

  7. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    Science.gov (United States)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated, optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. Finally, the software is managed in accordance with the CMMI (Capability Maturity Model

  8. Water Security Toolkit User Manual Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2 Copyright © 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below). In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py

  9. Healthcare and wider societal implications of stillbirth: a population-based cost-of-illness study.

    Science.gov (United States)

    Campbell, H E; Kurinczuk, J J; Heazell, Aep; Leal, J; Rivero-Arias, O

    2018-01-01

    To extend previous work and estimate health and social care costs, litigation costs, funeral-related costs, and productivity losses associated with stillbirth in the UK. A population-based cost-of-illness study using a synthesis of secondary data. The National Health Service (NHS) and wider society in the UK. Stillbirths occurring within a 12-month period and subsequent events occurring over the following 2 years. Costs were estimated using published data on events, resource use, and unit costs. Mean health and social care costs, litigation costs, funeral-related costs, and productivity costs for 2 years, reported for a single stillbirth and at a national level. Mean health and social care costs per stillbirth were £4191. Additionally, funeral-related costs were £559, and workplace absence (parents and healthcare professionals) was estimated to cost £3829 per stillbirth. For the UK, the annual health and social care costs were estimated at £13.6 million, and total productivity losses amounted to £706.1 million (98% of this cost was attributable to the loss of the life of the baby). The figures for total productivity losses were sensitive to the perspective adopted about the loss of life of the baby. This work expands the current intelligence on the costs of stillbirth beyond the health service to costs for parents and society, and yet these additional findings must still be regarded as conservative estimates of the true economic costs. The costs of stillbirth are significant, affecting the health service, parents, professionals, and society. Why and how was the study carried out? The personal, social, and emotional consequences of stillbirth are profound. Placing a monetary value on such consequences is emotive, yet necessary, when deciding how best to invest limited healthcare resources. We estimated the average costs associated with a single stillbirth and the costs for all stillbirths occurring in the UK over a 1-year period. What were the main

  10. Smart Grid Enabled EVSE

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-01-12

    The combined team of GE Global Research, Federal Express, National Renewable Energy Laboratory, and Consolidated Edison has successfully achieved the established goals contained within the Department of Energy’s Smart Grid Capable Electric Vehicle Supply Equipment funding opportunity. The final program product, shown charging two vehicles in Figure 1, reduces by nearly 50% the total installed system cost of the electric vehicle supply equipment (EVSE) as well as enabling a host of new Smart Grid enabled features. These include bi-directional communications, load control, utility message exchange and transaction management information. Using the new charging system, Utilities or energy service providers will now be able to monitor transportation related electrical loads on their distribution networks, send load control commands or preferences to individual systems, and then see measured responses. Installation owners will be able to authorize usage of the stations, monitor operations, and optimally control their electricity consumption. These features and cost reductions have been developed through a total system design solution.

  11. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  12. Using features of local densities, statistics and HMM toolkit (HTK) for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    Full Text Available This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model (HMM) Toolkit (HTK) without explicit segmentation. The first phase is preprocessing, where the data are introduced into the system after quality enhancements. Then, a set of characteristics (local-density features and statistical features) is extracted using a sliding-window technique. Subsequently, the resulting feature vectors are fed to the Hidden Markov Model Toolkit (HTK). The simple “Arabic-Numbers” database and the IFN/ENIT database are used to evaluate the performance of this system. Keywords: Hidden Markov Model (HMM) Toolkit (HTK), Sliding windows
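
    To make the sliding-window step concrete, here is a generic Python sketch that slides a fixed-width window across a binarized text-line image and emits one small feature vector (ink density, vertical centroid, profile spread) per position; it is illustrative only and does not reproduce the paper's exact local-density and statistical features.

      # Generic sliding-window feature extraction (illustrative; not the paper's exact features).
      import numpy as np

      def sliding_window_features(image, width=8, step=4):
          rows, cols = image.shape
          feats = []
          for x in range(0, cols - width + 1, step):
              win = image[:, x:x + width].astype(float)
              density = win.mean()                      # overall ink density in the window
              profile = win.mean(axis=1)                # per-row density profile
              centroid = (profile * np.arange(rows)).sum() / max(profile.sum(), 1e-9)
              feats.append([density, centroid / rows, profile.std()])
          return np.array(feats)

      frames = sliding_window_features(np.random.randint(0, 2, (32, 128)))
      print(frames.shape)   # (number of window positions, number of features)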

  13. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  14. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.
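
    The rewrite kept most of VTK's public programming interface intact, so a classic rendering pipeline still looks like the minimal Python sketch below (source, mapper, actor, renderer, render window); it assumes the vtk Python package is installed, and recent releases select the modern OpenGL2 backend automatically.

      # Minimal VTK rendering pipeline in Python (unchanged public API).
      import vtk

      sphere = vtk.vtkSphereSource()            # geometry source
      sphere.SetThetaResolution(64)
      sphere.SetPhiResolution(64)

      mapper = vtk.vtkPolyDataMapper()          # maps polydata to graphics primitives
      mapper.SetInputConnection(sphere.GetOutputPort())

      actor = vtk.vtkActor()
      actor.SetMapper(mapper)

      renderer = vtk.vtkRenderer()
      renderer.AddActor(actor)

      window = vtk.vtkRenderWindow()
      window.AddRenderer(renderer)
      window.Render()                           # draws the scene using the rewritten backend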

  15. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  16. Development of a Human Physiologically Based Pharmacokinetic (PBPK) Toolkit for Environmental Pollutants

    Directory of Open Access Journals (Sweden)

    Patricia Ruiz

    2011-10-01

    Full Text Available Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that provides useful information for the protection of the public.
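
    For readers unfamiliar with what such models compute, the toy sketch below integrates a single-compartment kinetic model with SciPy to show the general idea of simulating an internal dose over time; it is a deliberate simplification with hypothetical parameter values, not one of the ATSDR toolkit's recoded multi-compartment PBPK models.

      # Toy one-compartment kinetic model (illustrative only; hypothetical parameters).
      import numpy as np
      from scipy.integrate import odeint

      def one_compartment(c, t, dose_rate, k_elim, volume):
          # dC/dt = intake per unit volume minus first-order elimination
          return dose_rate / volume - k_elim * c

      t = np.linspace(0.0, 48.0, 200)              # hours
      conc = odeint(one_compartment, 0.0, t,
                    args=(1.0, 0.2, 42.0))         # mg/h, 1/h, L (hypothetical values)
      print(conc[-1, 0])                           # approaches dose_rate / (k_elim * volume)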

  17. A survey exploring National Health Service ePrescribing Toolkit use and perceived usefulness amongst English hospitals

    Directory of Open Access Journals (Sweden)

    Kathrin Cresswell

    2017-06-01

    Conclusions: Interactive elements, and lessons learned from early adopter sites that had accumulated experience of implementing systems, were viewed as the most helpful aspects of the ePrescribing Toolkit. The Toolkit now needs to be further developed to facilitate the continuing implementation/optimisation of ePrescribing and other health information technology across the NHS.

  18. Transition Toolkit 3.0: Meeting the Educational Needs of Youth Exposed to the Juvenile Justice System. Third Edition

    Science.gov (United States)

    Clark, Heather Griller; Mathur, Sarup; Brock, Leslie; O'Cummings, Mindee; Milligan, DeAngela

    2016-01-01

    The third edition of the National Technical Assistance Center for the Education of Neglected or Delinquent Children and Youth's (NDTAC's) "Transition Toolkit" provides updated information on existing policies, practices, strategies, and resources for transition that build on field experience and research. The "Toolkit" offers…

  19. Facilitating wider application of progesterone RIA for improving livestock production in developing countries

    International Nuclear Information System (INIS)

    Oswin Perera, B.M.B.

    2000-01-01

    Full text: Research and development programmes supported by the Joint FAO/IAEA Division on improving livestock production in developing countries have identified three major biological constraints: feeding, breeding management and diseases. Proper breeding management is important in order to achieve optimum economic benefits (through products such as milk, meat and offspring) from an animal during its lifespan. This requires early attainment of puberty, short intervals from calving to conception, high conception rates and low number of matings or artificial inseminations (AIs) per conception. The use of radioimmunoassay (RIA) for measuring progesterone in milk of dairy animals or in blood of meat animals, together with recording of data on reproductive events and production parameters, is an indispensable tool that provides information both on problems in breeding management by farmers as well as deficiencies in the AI services provided to them by government, co-operative or private organizations. This allows appropriate strategies and interventions to be adopted to overcome these limitations. Progesterone RIA can also detect animals that have not conceived by AI within 21 days after mating (early non-pregnancy diagnosis or N-PD), and alert farmers to the need to have these animals closely observed for oestrus and re-inseminated at the appropriate time. In order to ensure the sustained use of RIA technology for progesterone measurement in developing Member States, the IAEA has been engaged in the development and transfer of simple, robust and cheap methods of RIA. The system currently being used is based on a direct (non-extraction) method, using a 125I-progesterone tracer and a solid-phase separation method (antibody coated tubes). In order to ensure wider availability (and lower cost) of the two key reagents required for the assay, the IAEA has initiated a programme to assist Member States to develop the capability to produce these in selected regional or

  20. Enabling graphene nanoelectronics.

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Wei; Ohta, Taisuke; Biedermann, Laura Butler; Gutierrez, Carlos; Nolen, C. M.; Howell, Stephen Wayne; Beechem Iii, Thomas Edwin; McCarty, Kevin F.; Ross, Anthony Joseph, III

    2011-09-01

    Recent work has shown that graphene, a 2D electronic material amenable to the planar semiconductor fabrication processing, possesses tunable electronic material properties potentially far superior to metals and other standard semiconductors. Despite its phenomenal electronic properties, focused research is still required to develop techniques for depositing and synthesizing graphene over large areas, thereby enabling the reproducible mass-fabrication of graphene-based devices. To address these issues, we combined an array of growth approaches and characterization resources to investigate several innovative and synergistic approaches for the synthesis of high quality graphene films on technologically relevant substrate (SiC and metals). Our work focused on developing the fundamental scientific understanding necessary to generate large-area graphene films that exhibit highly uniform electronic properties and record carrier mobility, as well as developing techniques to transfer graphene onto other substrates.

  1. The Indigenous Experience of Work in a Health Research Organisation: Are There Wider Inferences?

    Directory of Open Access Journals (Sweden)

    Sharon Chirgwin

    2017-08-01

    Full Text Available The purpose of this study was to identify the factors that positively and negatively impacted on the employment experiences and trajectories of Indigenous Australians who are currently or were formerly employed by a research organisation in both remote and urban settings. The study design was an embedded mixed-methods approach. The first phase quantified staff uptake, continued employment, and attrition. Then interviews were conducted with 42 former and 51 current Indigenous staff members to obtain qualitative data. The results showed that the quality of supervision, the work flexibility to enable employees to respond to family and community priorities, and training and other forms of career support were all identified as important factors in the workplace. The most common reasons for leaving were that research projects ended, or to pursue a career change or further study. The authors use the findings to make recommendations pertinent to policy formation for both government and organisations seeking to attract and nurture Indigenous staff.

  2. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study.

    Science.gov (United States)

    Lee, Lisa; Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-10-01

    To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Questionnaire-based survey of attendees at a national ePrescribing symposium. 2013 National ePrescribing Symposium in London, UK. Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals' experiences (n = 45; 64.3%) were considered the most useful types of content. There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning.

  3. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    Science.gov (United States)

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t(56) = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current

  4. A Hybrid Communications Network Simulation-Independent Toolkit

    National Research Council Canada - National Science Library

    Dines, David M

    2008-01-01

    .... Evolving a grand design of the enabling network will require a flexible evaluation platform to try and select the right combination of network strategies and protocols in the realms of topology control and routing...

  5. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    Science.gov (United States)

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  6. EasyInterface: A toolkit for rapid development of GUIs for research prototype tools

    OpenAIRE

    Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf

    2017-01-01

    In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses researchers' need to make their research prototype tools available to the community, and to integrate them into a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command line and its output goes to the standard output, then in a few minutes one can m...
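
    The design rests on wrapping tools that already run from a command line and print to standard output; the generic Python sketch below shows that idea (capture a tool's stdout so a GUI layer can display it). It is only an illustration of the principle, with a stand-in command, and is not EasyInterface's own configuration mechanism.

      # Generic illustration: run a command-line tool and capture its standard output
      # (stand-in command; EasyInterface itself uses its own server-side configuration).
      import subprocess

      def run_tool(executable, *args):
          """Run a CLI tool and return whatever it printed to stdout."""
          result = subprocess.run([executable, *args],
                                  capture_output=True, text=True, check=True)
          return result.stdout

      if __name__ == "__main__":
          print(run_tool("echo", "analysis finished"))   # stand-in for a research prototype tool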

  7. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  8. The IGUANA interactive graphics toolkit with examples from CMS and D0

    International Nuclear Information System (INIS)

    Alverson, G.; Osborne, I.; Taylor, L.; Tuura, L.

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. The authors describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. The authors demonstrate the use of IGUANA with several applications built for CMS and D0

  9. Improving the fundamentals of care for older people in the acute hospital setting: facilitating practice improvement using a Knowledge Translation Toolkit.

    Science.gov (United States)

    Wiechula, Rick; Kitson, Alison; Marcoionni, Danni; Page, Tammy; Zeitz, Kathryn; Silverston, Heidi

    2009-12-01

    with sufficient flexibility to meet the individual needs of the teams. Conclusions  The range of tools in the KT Toolkit were found to be helpful, but not all tools needed to be used to achieve successful results. Facilitation of the teams was a central feature of the KT Toolkit and allowed clinicians to retain control of their projects; however, finding the balance between structuring the process and enabling teams to maintain ownership and control was an ongoing challenge. Clinicians may not have the requisite skills and experience in basic standard setting, audit and evaluation and it was therefore important to address this throughout the project. In time this builds capacity throughout the organisation. Identifying evidence to support practice is a challenge to clinicians. Evidence-based guidelines often lack specificity and were found to be difficult to assimilate easily into everyday practice. Evidence to inform practice needs to be provided in a variety of forms and formats that allow clinicians to easily identify the source of the evidence and then develop local standards specific to their needs. The work that began with this project will continue - all teams felt that the work was only starting rather than concluding. This created momentum, motivation and greater ownership of improvements at local level. © 2009 The Authors. Journal Compilation © Blackwell Publishing Asia Pty Ltd.

  10. Grid-Enabled Measures

    Science.gov (United States)

    Moser, Richard P.; Hesse, Bradford W.; Shaikh, Abdul R.; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-01-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment —a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute with two overarching goals: (1) Promote the use of standardized measures, which are tied to theoretically based constructs; and (2) Facilitate the ability to share harmonized data resulting from the use of standardized measures. This is done by creating an online venue connected to the Cancer Biomedical Informatics Grid (caBIG®) where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting and viewing meta-data about the measures and associated constructs. This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories— for data sharing). PMID:21521586

  11. Enabling distributed petascale science

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Bharathi, Shishir; Bresnahan, John

    2007-01-01

    Petascale science is an end-to-end endeavour, involving not only the creation of massive datasets at supercomputers or experimental facilities, but the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities. The new SciDAC Center for Enabling Distributed Petascale Science (CEDPS) is developing tools to support this end-to-end process. These tools include data placement services for the reliable, high-performance, secure, and policy-driven placement of data within a distributed science environment; tools and techniques for the construction, operation, and provisioning of scalable science services; and tools for the detection and diagnosis of failures in end-to-end data placement and distributed application hosting configurations. In each area, we build on a strong base of existing technology and have made useful progress in the first year of the project. For example, we have recently achieved order-of-magnitude improvements in transfer times (for lots of small files) and implemented asynchronous data staging capabilities; demonstrated dynamic deployment of complex application stacks for the STAR experiment; and designed and deployed end-to-end troubleshooting services. We look forward to working with SciDAC application and technology projects to realize the promise of petascale science

  12. Enabling immersive simulation.

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Josh (University of California Santa Cruz, Santa Cruz, CA); Mateas, Michael (University of California Santa Cruz, Santa Cruz, CA); Hart, Derek H.; Whetzel, Jonathan; Basilico, Justin Derrick; Glickman, Matthew R.; Abbott, Robert G.

    2009-02-01

    The object of the 'Enabling Immersive Simulation for Complex Systems Analysis and Training' LDRD has been to research, design, and engineer a capability to develop simulations which (1) provide a rich, immersive interface for participation by real humans (exploiting existing high-performance game-engine technology wherever possible), and (2) can leverage Sandia's substantial investment in high-fidelity physical and cognitive models implemented in the Umbra simulation framework. We report here on these efforts. First, we describe the integration of Sandia's Umbra modular simulation framework with the open-source Delta3D game engine. Next, we report on Umbra's integration with Sandia's Cognitive Foundry, specifically to provide for learning behaviors for 'virtual teammates' directly from observed human behavior. Finally, we describe the integration of Delta3D with the ABL behavior engine, and report on research into establishing the theoretical framework that will be required to make use of tools like ABL to scale up to increasingly rich and realistic virtual characters.

  13. Displays enabling mobile multimedia

    Science.gov (United States)

    Kimmel, Jyrki

    2007-02-01

    With the rapid advances in telecommunications networks, mobile multimedia delivery to handsets is now a reality. While a truly immersive multimedia experience is still far ahead in the mobile world, significant advances have been made in the constituent audio-visual technologies to make this become possible. One of the critical components in multimedia delivery is the mobile handset display. While such alternatives as headset-style near-to-eye displays, autostereoscopic displays, mini-projectors, and roll-out flexible displays can deliver either a larger virtual screen size than the pocketable dimensions of the mobile device can offer, or an added degree of immersion by adding the illusion of the third dimension in the viewing experience, there are still challenges in the full deployment of such displays in real-life mobile communication terminals. Meanwhile, direct-view display technologies have developed steadily, and can provide a development platform for an even better viewing experience for multimedia in the near future. The paper presents an overview of the mobile display technology space with an emphasis on the advances and potential in developing direct-view displays further to meet the goal of enabling multimedia in the mobile domain.

  14. Enabling cleanup technology transfer

    International Nuclear Information System (INIS)

    Ditmars, J. D.

    2002-01-01

    Technology transfer in the environmental restoration, or cleanup, area has been challenging. While there is little doubt that innovative technologies are needed to reduce the times, risks, and costs associated with the cleanup of federal sites, particularly those of the Departments of Energy (DOE) and Defense, the use of such technologies in actual cleanups has been relatively limited. There are, of course, many reasons why technologies do not reach the implementation phase or do not get transferred from developing entities to the user community. For example, many past cleanup contracts provided few incentives for performance that would compel a contractor to seek improvement via technology applications. While performance-based contracts are becoming more common, they alone will not drive increased technology applications. This paper focuses on some applications of cleanup methodologies and technologies that have been successful and are illustrative of a more general principle. The principle is at once obvious and not widely practiced. It is that, with few exceptions, innovative cleanup technologies are rarely implemented successfully alone but rather are implemented in the context of enabling processes and methodologies. And, since cleanup is conducted in a regulatory environment, the stage is better set for technology transfer when the context includes substantive interactions with the relevant stakeholders. Examples of this principle are drawn from Argonne National Laboratory's experiences in Adaptive Sampling and Analysis Programs (ASAPs), Precise Excavation, and the DOE Technology Connection (TechCon) Program. The lessons learned may be applicable to the continuing challenges posed by the cleanup and long-term stewardship of radioactive contaminants and unexploded ordnance (UXO) at federal sites

  15. A Teacher Tablet Toolkit to meet the challenges posed by 21st ...

    African Journals Online (AJOL)

    The course, as an outcome, is presented as a Teacher Tablet Toolkit, designed to meet the challenges inherent in the 21st-century rural technology-enhanced teaching and learning environment. The paper documents and motivates design decisions, derived from literature and adapted through three iterations of a Design ...

  16. College Access and Success for Students Experiencing Homelessness: A Toolkit for Educators and Service Providers

    Science.gov (United States)

    Dukes, Christina

    2013-01-01

    This toolkit serves as a comprehensive resource on the issue of higher education access and success for homeless students, including information on understanding homeless students, assisting homeless students in choosing a school, helping homeless students pay for application-related expenses, assisting homeless students in finding financial aid…

  17. University of Central Florida and the American Association of State Colleges and Universities: Blended Learning Toolkit

    Science.gov (United States)

    EDUCAUSE, 2014

    2014-01-01

    The Blended Learning Toolkit supports the course redesign approach, and interest in its openly available clearinghouse of online tools, strategies, curricula, and other materials to support the adoption of blended learning continues to grow. When the resource originally launched in July 2011, 20 AASCU [American Association of State Colleges and…

  18. Development of an Online Toolkit for Measuring Commercial Building Energy Efficiency Performance -- Scoping Study

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Na

    2013-03-13

    This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.

  19. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM
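
    As a minimal illustration of the uncertainty-propagation module's underlying idea, the sketch below pushes two uncertain inputs through a stand-in response function by Monte Carlo sampling and reports the propagated mean and 95th percentile; the response function, distributions, and values are hypothetical, and the actual toolkit couples such sampling to full engineering simulation codes.

      # Minimal Monte Carlo uncertainty propagation (illustrative; hypothetical model and inputs).
      import numpy as np

      def response(k, q):
          # Stand-in for an engineering simulation output, e.g. a peak-temperature surrogate.
          return 300.0 + q / k

      rng = np.random.default_rng(seed=0)
      k = rng.normal(5.0, 0.5, size=10_000)      # uncertain input 1
      q = rng.normal(100.0, 10.0, size=10_000)   # uncertain input 2
      out = response(k, q)
      print(out.mean(), np.percentile(out, 95))  # propagated mean and 95th percentile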

  20. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    Science.gov (United States)

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  1. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    Science.gov (United States)

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children…

  2. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.

  3. The MOLGENIS toolkit : rapid prototyping of biosoftware at the push of a button

    NARCIS (Netherlands)

    Swertz, Morris A.; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K.; Kanterakis, Alexandros; Roos, Erik T.; Lops, Joris; Thorisson, Gudmundur A.; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J.; de Brock, Engbert O.; Jansen, Ritsert C.; Parkinson, Helen

    2010-01-01

    Background: There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly

  4. Cyber security awareness toolkit for national security: an approach to South Africa's cyber security policy implementation

    CSIR Research Space (South Africa)

    Phahlamohlaka, LJ

    2011-05-01

    Full Text Available The aim of this paper is to propose an approach that South Africa could follow in implementing its proposed cyber security policy. The paper proposes a Cyber Security Awareness Toolkit that is underpinned by key National Security imperatives...

  5. Urban Teacher Academy Project Toolkit: A Guide to Developing High School Teaching Career Academies.

    Science.gov (United States)

    Berrigan, Anne; Schwartz, Shirley

    There is an urgent need not only to attract more people into the teaching profession but also to build a more diverse, highly qualified, and culturally sensitive teaching force that can meet the needs of a rapidly changing school-age population. This Toolkit takes best practices from high school teacher academies around the United States and…

  6. Between structures and norms : Assessing tax increment financing for the Dutch spatial planning toolkit

    NARCIS (Netherlands)

    Root, Liz; Van Der Krabben, Erwin; Spit, Tejo

    2015-01-01

    The aim of the paper is to assess the institutional (mis)fit of tax increment financing for the Dutch spatial planning financial toolkit. By applying an institutionally oriented assessment framework, we analyse the interconnectivity of Dutch municipal finance and spatial planning structures and

  7. The Special Educator's Toolkit: Everything You Need to Organize, Manage, and Monitor Your Classroom

    Science.gov (United States)

    Golden, Cindy

    2012-01-01

    Overwhelmed special educators: Reduce your stress and support student success with this practical toolkit for whole-classroom organization. A lifesaver for special educators in any K-12 setting, this book-and-CD set will help teachers expertly manage everything, from schedules and paperwork to student supports and behavior plans. Cindy Golden, a…

  8. DUL Radio: A light-weight, wireless toolkit for sketching in hardware

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2011-01-01

    -mobile prototyping where fast reaction is needed (e.g. in controlling sound). The target audiences include designers, students, artists etc. with minimal programming and hardware skills. This presentation covers our motivations for creating the toolkit, specifications, test results, comparison to related products...

  9. The Student Writing Toolkit: Enhancing Undergraduate Teaching of Scientific Writing in the Biological Sciences

    Science.gov (United States)

    Dirrigl, Frank J., Jr.; Noe, Mark

    2014-01-01

    Teaching scientific writing in biology classes is challenging for both students and instructors. This article offers and reviews several useful "toolkit" items that improve student writing. These include sentence and paper-length templates, funnelling and compartmentalisation, and preparing compendiums of corrections. In addition,…

  10. Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications

    NARCIS (Netherlands)

    Gonsalves, Atish; Ternier, Stefaan; De Vries, Fred; Specht, Marcus

    2013-01-01

    Gonsalves, A., Ternier, S., De Vries, F., & Specht, M. (2012, 16-18 October). Serious games at the UNHCR with ARLearn, a toolkit for mobile and virtual reality applications. Presentation given at the 11th World Conference on Mobile and Contextual Learning (mLearn 2012), Helsinki, Finland.

  11. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    Science.gov (United States)

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

    A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant, change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Use of Remote Sensing Data to Enhance the National Weather Service (NWS) Storm Damage Toolkit

    Science.gov (United States)

    Jedlovec, Gary; Molthan, Andrew; White, Kris; Burks, Jason; Stellman, Keith; Smith, Matthew

    2012-01-01

    SPoRT is improving the use of near real-time satellite data in response to severe weather events and other disasters, supported through NASA's Applied Sciences Program. Interagency collaboration is planned to support NOAA's Damage Assessment Toolkit, with spinoff opportunities to support other entities such as USGS and FEMA.

  13. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Science.gov (United States)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance scalability have affected the take-up of this programming model. Significant progress has since been made in hardware and software technologies; as a result, the performance of parallel programs using compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer-aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work demonstrates not only the great potential of using the toolkit to quickly parallelize serial programs, but also the good performance achievable on up to 300 processors for hybrid message-passing and directive-based parallelizations.

  14. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  15. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including to GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta and event- data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon-propagation; particle identification; hit finding, track finding and fitting; electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  16. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30s format is well-suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website; the duration of their session on the website; and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access to and use of the U.S. Climate Resilience Toolkit.

  17. Making Schools the Model for Healthier Environments Toolkit: What It Is

    Science.gov (United States)

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  18. Geodetic, Geologic and Seismic Interdisciplinary Research of Tectonically Caused Movements in the Wider Area of the City of Zagreb

    Science.gov (United States)

    Dapo, A.; Pribicevic, B.; Herak, M.; Prelogovic, E.

    2012-04-01

    Since the last great earthquake in 1880, which shook the Zagreb area with intensity IX° MCS, tectonic movements and models of the numerous Zagreb faults have been a focal point for Croatian geologists, seismologists and, in the last 15 years, also geodetic scientists, all of whom have been working within their scientific branches to bring to light the tectonic mechanisms in the wider Zagreb area. Because this is a tectonically very active area, and Zagreb is the capital city of Croatia with a very high population density, it is of the utmost importance to understand those mechanisms and, based on that understanding, to find the best possible measures for protecting people and property. The best results will certainly be achieved through an interdisciplinary approach. That is why this paper presents the first interdisciplinary results from geodetic, geological and seismological research and their contribution to the collective knowledge about tectonic movements in the wider area of the City of Zagreb.

  19. Designing an Educator Toolkit for the Mobile Learning Age

    Science.gov (United States)

    Burden, Kevin; Kearney, Matthew

    2018-01-01

    Mobile technologies have been described as 'boundary' objects which enable teachers and learners to transcend many of the barriers such as rigid schedules and spaces which have hitherto characterised traditional forms of education. However, educators need to better understand how to design learning scenarios which genuinely exploit the unique…

  20. IOC-UNEP regional workshop to review priorities for marine pollution monitoring, research, control and abatement in the wider Caribbean

    International Nuclear Information System (INIS)

    1989-01-01

    The IOC-UNEP Regional Workshop to Review Priorities for Marine Pollution Monitoring, Research, Control and Abatement in the Wider Caribbean Region (San Jose, 24-30 August 1989) examined a possible general framework for a regionally co-ordinated comprehensive joint IOC/UNEP programme for marine pollution assessment and control in the Wider Caribbean region (CEPPOL). The overall objective of CEPPOL is to establish a regionally co-ordinated comprehensive joint IOC/UNEP Marine Pollution Assessment and Control Programme catering to the immediate and long-term requirements of the Cartagena Convention as well as the requirements of the member States of IOCARIBE. The specific objectives of the programmes are: (i) To organize and carry out a regionally co-ordinated marine pollution monitoring and research programme concentrating on contaminants and pollutants affecting the quality of the marine and coastal environment, as well as the human health in the Wider Caribbean and to interpret/assess the results of the programme as part of the scientific basis for the region; (ii) To generate information on the sources, levels, amounts, trends and effects of marine pollution within the Wider Caribbean region as an additional component of the scientific basis upon which the formulation of proposals for preventive and remedial actions can be based; (iii) To formulate proposals for technical, administrative and legal pollution control, abatement, and preventive measures and to assist the Governments in the region in implementing and evaluating their effectiveness; and (iv) To strengthen and , when necessary, to develop/establish the capabilities of national institutions to carry out marine pollution monitoring and research, as well as to formulate and apply pollution control and abatement measures

  1. Phase 1 Development Report for the SESSA Toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Knowlton, Robert G.; Melton, Brad J; Anderson, Robert J.

    2014-09-01

    operation of the SESSA toolkit, in order to give the user enough information to start using it. SESSA is currently a prototype system, and this documentation covers the initial release of the toolkit. Funding for SESSA was provided by the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). The authors acknowledge this funding support and thank Mr. Garold Warner of DFSC, who served as the Project Manager. Individuals who worked on the design, functional attributes, algorithm development, system architecture, and software programming include Robert Knowlton, Brad Melton, Robert Anderson, and Wendy Amai.

  2. A Relativist's Toolkit, The Mathematics of Black-Hole Mechanics

    International Nuclear Information System (INIS)

    Whiting, B

    2004-01-01

    This new textbook is intended for students familiar with general relativity at the introductory level of Bernard Schutz's book A First Course in General Relativity (1985 Cambridge: Cambridge University Press) and not yet accomplished at the advanced level of Robert Wald's book General Relativity (1984 Chicago, IL: University of Chicago Press), upon which it nevertheless draws rather heavily. What is distinctively new in this book is that it is a real toolkit, and yet it is not short of detailed applications. As such, it is a helpful book to recommend to students making the transition for which it is intended. The idea of a new textbook on general relativity usually delights me, as the field is still changing rapidly. New perspectives find new ways to present old things to new students. They also have totally new things to present to us all, based on the interests of the current research from which they have grown. This new book presents a wealth of useful tools to students in just five, well integrated chapters, starting with a quick review of the fundamentals and ending with an extensive application of general relativity to black hole spacetimes. In his own words, Eric Poisson has striven to present interesting topics and common techniques not adequately covered in readily available existing texts. This has certainly been accomplished, in a synthesis extracted from many sources. Congruences of geodesics, a staple analytical tool, occupy a whole chapter, and in greater depth and clarity than can be found elsewhere. A thorough, and lengthy, presentation on hypersurfaces, including a careful treatment of the null case, carries the author's unique perspective. This treatment of hypersurfaces is put to practical use in the chapter on Lagrangian and Hamiltonian formulations, which also leans on recent quasilocal energy discussions and includes an elegant treatment of the Bondi-Sachs mass in a unified context. Many of us have become familiar with the careful, well
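
    For readers new to the congruence machinery the review highlights, the central result developed there for a timelike geodesic congruence is Raychaudhuri's equation for the expansion θ (quoted here from standard general relativity references rather than from the book itself):

```latex
\frac{d\theta}{d\tau} = -\frac{1}{3}\theta^{2}
  - \sigma_{\alpha\beta}\,\sigma^{\alpha\beta}
  + \omega_{\alpha\beta}\,\omega^{\alpha\beta}
  - R_{\alpha\beta}\,u^{\alpha}u^{\beta}
```

    where σ and ω are the shear and rotation of the congruence and u is its tangent field; the focusing of nearby geodesics encoded here underlies much of the black-hole mechanics the book goes on to treat.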

  3. Disease Prevention in the Age of Convergence - the Need for a Wider, Long Ranging and Collaborative Vision

    Directory of Open Access Journals (Sweden)

    Susan L. Prescott

    2014-01-01

    Full Text Available It is time to bring our imagination, creativity and passion to the fore in solving the global challenges of our age. Our global health crisis and the pandemic of noncommunicable diseases (NCDs) are clearly rooted in complex modern societal and environmental changes, many of which have effects on developing immune and metabolic responses. They are intimately related to wider environmental challenges. And it is unsurprising that many NCDs share similar risk factors and that many are associated with a rising predisposition for inflammation. Allergy is one of the earliest signs of environmental impact on these biological pathways, and may also offer an early barometer to assess the effects of early interventions. There is dawning awareness of how changing microbial diversity, nutritional patterns, sedentary indoor behaviours and modern pollutants adversely affect early metabolic and immune development, but there is still much to understand about the complexity of these interactions. Even when we do harness the science and technology, these will not provide solutions unless we also address the wider social, cultural and economic determinants of health - addressing the interconnections between human health and the health of our environment. Now more than ever, we need a wider vision and a greater sense of collective responsibility. We need long-range approaches that aim for lifelong benefits of a 'healthier start to life', and stronger cross-sectoral collaborations to prevent disease. We need to give both our hearts and our minds to solving these global issues.

  4. EMMA: An Extensible Mammalian Modular Assembly Toolkit for the Rapid Design and Production of Diverse Expression Vectors.

    Science.gov (United States)

    Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi

    2017-07-21

    Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step golden-gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon-based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.

  5. Effects of a Short Video-Based Resident-as-Teacher Training Toolkit on Resident Teaching.

    Science.gov (United States)

    Ricciotti, Hope A; Freret, Taylor S; Aluko, Ashley; McKeon, Bri Anne; Haviland, Miriam J; Newman, Lori R

    2017-10-01

    To pilot a short video-based resident-as-teacher training toolkit and assess its effect on resident teaching skills in clinical settings. A video-based resident-as-teacher training toolkit was previously developed by educational experts at Beth Israel Deaconess Medical Center, Harvard Medical School. Residents were recruited from two academic hospitals, watched two videos from the toolkit ("Clinical Teaching Skills" and "Effective Clinical Supervision"), and completed an accompanying self-study guide. A novel assessment instrument for evaluating the effect of the toolkit on teaching was created through a modified Delphi process. Before and after the intervention, residents were observed leading a clinical teaching encounter and scored using the 15-item assessment instrument. The primary outcome of interest was the change in the number of skills exhibited, which was assessed using the Wilcoxon signed-rank test. Twenty-eight residents from two academic hospitals were enrolled, and 20 (71%) completed all phases of the study. More than one third of residents who volunteered to participate reported no prior formal teacher training. After completing two training modules, residents demonstrated a significant increase in the median number of teaching skills exhibited in a clinical teaching encounter, from 7.5 (interquartile range 6.5-9.5) to 10.0 (interquartile range 9.0-11.5; P<.001). Of the 15 teaching skills assessed, there were significant improvements in asking for the learner's perspective (P=.01), providing feedback (P=.005), and encouraging questions (P=.046). Using a resident-as-teacher video-based toolkit was associated with improvements in teaching skills in residents from multiple specialties.
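
    The primary analysis reported above is a Wilcoxon signed-rank test on paired pre/post counts of teaching skills. The short sketch below shows that analysis in outline using scipy; the numbers are purely illustrative and are not the study's data.

```python
# Illustrative only: paired pre/post counts of teaching skills for ten
# hypothetical residents, compared with the Wilcoxon signed-rank test used
# in the study (scipy implementation).
from scipy.stats import wilcoxon

pre  = [7, 8, 6, 9, 7, 8, 7, 10, 6, 8]        # skills exhibited before training
post = [10, 11, 9, 10, 9, 11, 10, 12, 9, 10]  # skills exhibited after training

stat, p_value = wilcoxon(pre, post)
print(f"W = {stat:.1f}, p = {p_value:.4f}")
```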

  6. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    Science.gov (United States)

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.
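
    PAGANI itself is a CPU-GPU toolbox built for networks with roughly 200,000 nodes. As a toy-scale illustration of the graph metrics named in the record (modularity, betweenness centrality), here is a sketch using networkx on a small synthetic small-world graph; it is not the toolkit's own code and does not reflect its performance.

```python
# Toy illustration of graph metrics mentioned in the record (betweenness
# centrality, modularity) using networkx on a small synthetic graph; PAGANI
# computes these on voxel-wise brain networks with a CPU-GPU backend.
import networkx as nx
from networkx.algorithms import community

G = nx.watts_strogatz_graph(n=200, k=6, p=0.1, seed=42)  # small-world toy graph

betweenness = nx.betweenness_centrality(G)
communities = community.greedy_modularity_communities(G)
Q = community.modularity(G, communities)

hubs = sorted(betweenness, key=betweenness.get, reverse=True)[:5]
print(f"modularity Q = {Q:.3f}, top hub nodes: {hubs}")
```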

  7. SlideToolkit: an assistive toolset for the histological quantification of whole slide images.

    Directory of Open Access Journals (Sweden)

    Bastiaan G L Nelissen

    Full Text Available The demand for accurate and reproducible phenotyping of a disease trait increases with the rising number of biobanks and genome-wide association studies. Detailed analysis of histology is a powerful way of phenotyping human tissues. Nonetheless, purely visual assessment of histological slides is time-consuming and liable to sampling variation and optical illusions and thereby observer variation, and external validation may be cumbersome. Therefore, within our own biobank, computerized quantification of digitized histological slides is often preferred as a more precise and reproducible, and sometimes more sensitive approach. Relatively few free toolkits are, however, available for fully digitized microscopic slides, usually known as whole slide images. To meet this need, we developed the slideToolkit as a fast method to handle large quantities of low-contrast whole slide images using advanced cell-detecting algorithms. The slideToolkit has been developed for modern personal computers and high-performance clusters (HPCs) and is available as an open-source project on github.com. We here illustrate the power of slideToolkit by a repeated measurement of 303 digital slides containing CD3-stained (DAB) abdominal aortic aneurysm tissue from a tissue biobank. Our workflow consists of four consecutive steps. In the first step (acquisition), whole slide images are collected and converted to TIFF files. In the second step (preparation), files are organized. The third step (tiles) creates multiple manageable tiles to count. In the fourth step (analysis), tissue is analyzed and results are stored in a data set. Using this method, two consecutive measurements of 303 slides showed an intraclass correlation of 0.99. In conclusion, slideToolkit provides a free, powerful and versatile collection of tools for automated feature analysis of whole slide images to create reproducible and meaningful phenotypic data sets.
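
    The four-step workflow (acquisition, preparation, tiles, analysis) can be sketched in a few lines. The following generic numpy illustration shows only the tiling-and-counting idea; slideToolkit itself operates on gigapixel whole slide images with dedicated cell-detection algorithms, so this is a structural sketch, not its implementation.

```python
# Generic sketch of the "tiles" and "analysis" steps: split a synthetic
# stained-image array into manageable tiles and count positive pixels per
# tile. This only illustrates the structure of the workflow described above.
import numpy as np

def tile_and_count(image, tile_size=256, threshold=0.7):
    rows, cols = image.shape
    counts = {}
    for r in range(0, rows, tile_size):
        for c in range(0, cols, tile_size):
            tile = image[r:r + tile_size, c:c + tile_size]
            counts[(r, c)] = int((tile > threshold).sum())
    return counts

rng = np.random.default_rng(0)
slide = rng.random((1024, 1024))      # stand-in for one slide image
counts = tile_and_count(slide)
print(f"{len(counts)} tiles, total positive pixels: {sum(counts.values())}")
```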

  8. Innovations and Challenges of Implementing a Glucose Gel Toolkit for Neonatal Hypoglycemia.

    Science.gov (United States)

    Hammer, Denise; Pohl, Carla; Jacobs, Peggy J; Kaufman, Susan; Drury, Brenda

    2018-05-24

    Transient neonatal hypoglycemia occurs most commonly in newborns who are small for gestational age, large for gestational age, infants of diabetic mothers, and late preterm infants. An exact blood glucose value has not been determined for neonatal hypoglycemia, and it is important to note that poor neurologic outcomes can occur if hypoglycemia is left untreated. Interventions that separate mothers and newborns, as well as use of formula to treat hypoglycemia, have the potential to disrupt exclusive breastfeeding. The objective was to determine whether implementation of a toolkit that supports staff in adopting a practice change for the management of newborns at risk for hypoglycemia, including the use of 40% glucose gel, in an obstetric unit with a level 2 nursery will decrease admissions to the Intermediate Care Nursery and increase exclusive breastfeeding. This descriptive study used a retrospective chart review for pre/postimplementation of the Management of Newborns at Risk for Hypoglycemia Toolkit (Toolkit) using a convenience sample of at-risk newborns in the first 2 days of life to evaluate the proposed outcomes. Following implementation of the Toolkit, at-risk newborns had a clinically but not statistically significant 6.5% increase in exclusive breastfeeding and a clinically but not statistically significant 5% decrease in admissions to the Intermediate Care Nursery. The Toolkit was designed for ease of staff use and to improve outcomes for the at-risk newborn. Future research includes replication at other level 2 and level 1 obstetric centers and investigation into the number of 40% glucose gel doses that can safely be administered.

  9. FOILFEST :community enabled security.

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Judy Hennessey; Johnson, Curtis Martin; Whitley, John B.; Drayer, Darryl Donald; Cummings, John C., Jr. (.,; .)

    2005-09-01

    The Advanced Concepts Group of Sandia National Laboratories hosted a workshop, "FOILFest: Community Enabled Security", on July 18-21, 2005, in Albuquerque, NM. This was a far-reaching look into the future of physical protection consisting of a series of structured brainstorming sessions focused on preventing and foiling attacks on public places and soft targets such as airports, shopping malls, hotels, and public events. These facilities are difficult to protect using traditional security devices since they could easily be pushed out of business through the addition of arduous and expensive security measures. The idea behind this Fest was to explore how the public, which is vital to the function of these institutions, can be leveraged as part of a physical protection system. The workshop considered procedures, space design, and approaches for building community through technology. The workshop explored ways to make the "good guys" in public places feel safe and be vigilant while making potential perpetrators of harm feel exposed and convinced that they will not succeed. Participants in the Fest included operators of public places, social scientists, technology experts, representatives of government agencies including DHS and the intelligence community, writers and media experts. Many innovative ideas were explored during the fest with most of the time spent on airports, including consideration of the local airport, the Albuquerque Sunport. Some provocative ideas included: (1) sniffers installed in passage areas like revolving doors and escalators, (2) a "jumbotron" showing current camera shots in the public space, (3) transparent portal screeners allowing viewing of the screening, (4) a layered open/funnel/open/funnel design where open spaces are used to encourage a sense of "communitas" and take advantage of citizen "sensing" and funnels are technological

  10. Model-based Kinematics Generation for Modular Mechatronic Toolkits

    DEFF Research Database (Denmark)

    Bordignon, Mirko; Schultz, Ulrik Pagh; Støy, Kasper

    2011-01-01

    Modular robots are mechatronic devices that enable the construction of highly versatile and flexible robotic systems whose mechanical structure can be dynamically modified. The key feature that enables this dynamic modification is the capability of the individual modules to connect to each other...... in multiple ways and thus generate a number of different mechanical systems, in contrast with the monolithic, fixed structure of conventional robots. The mechatronic flexibility, however, complicates the development of models and programming abstractions for modular robots, since manually describing...... the Modular Mechatronics Modelling Language (M3L). M3L is a domain-specific language, which can model the kinematic structure of individual robot modules and declaratively describe their possible interconnections, rather than requiring the user to enumerate them in their entirety. From this description, the M...
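
    As a rough, language-shifted illustration of the declarative idea behind M3L (declare each module's connectors once and let the tool enumerate possible interconnections, rather than listing them by hand), here is a small Python sketch; the module names, port names and connector kinds are invented for illustration and are not M3L syntax.

```python
# Rough illustration (not M3L) of declaring module connectors once and
# letting a tool enumerate the possible interconnections automatically.
from itertools import product

# Hypothetical connector declarations: module name -> list of (port, kind).
modules = {
    "cube_module": [("north", "male"), ("south", "female")],
    "gripper":     [("base", "female")],
    "hinge":       [("a", "male"), ("b", "male")],
}

def possible_connections(modules):
    """Yield every male-to-female port pairing between distinct modules."""
    ports = [(name, port, kind) for name, plist in modules.items()
             for port, kind in plist]
    for (m1, p1, k1), (m2, p2, k2) in product(ports, repeat=2):
        if m1 != m2 and k1 == "male" and k2 == "female":
            yield (m1, p1), (m2, p2)

for (m1, p1), (m2, p2) in possible_connections(modules):
    print(f"{m1}.{p1} -> {m2}.{p2}")
```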

  11. A physical and engineering study on the irradiation techniques in neutron capture therapy aiming for wider application

    International Nuclear Information System (INIS)

    Sakurai, Y.; Ono, K.; Suzuki, M.; Katoh, I.; Miyatake, S.-I.; Yanagie, H.

    2003-01-01

    Solo-irradiation with thermal neutrons has been applied to brain cancer and malignant melanoma in boron neutron capture therapy (BNCT) at the medical irradiation facility of the Kyoto University Reactor (KUR) since the first clinical trial in 1974. In 1997, after remodeling of the facility, mixed irradiation with thermal and epi-thermal neutrons was introduced, and the depth-dose distribution for brain cancer was improved to some degree. In 2001, solo-irradiation with epi-thermal neutrons also started; notably, application to oral cancers began at the same time. The number of BNCT clinical trials using epi-thermal neutron irradiation at KUR amounted to twelve as of March 2003, and seven of these, more than half of the total, were for oral cancers. From this fact, we think that wider application to other cancers is required for the future prosperity of BNCT. The cancers currently treated with BNCT at KUR are brain cancer, melanoma and oral cancers, as mentioned above. The cancers expected to be treated in the near future include liver cancer, pancreatic cancer, lung cancer, tongue cancer and breast cancer, all of which are almost incurable by other therapies, including other forms of radiation therapy. In the wider application of BNCT to these cancers, dose-distribution control suited to each cancer and/or site is important, and the introduction of multi-directional and/or multi-divisional irradiation is also needed. Here, a physical and engineering study of BNCT irradiation techniques aiming at wider application, using two-dimensional transport calculations and three-dimensional Monte Carlo simulation, is reported.

  12. A toolkit for computerized operating procedure of complex industrial systems with IVI-COM technology

    International Nuclear Information System (INIS)

    Zhou Yangping; Dong Yujie; Huang Xiaojing; Ye Jingliang; Yoshikawa, Hidekazu

    2013-01-01

    A human interface toolkit is proposed to help users develop computerized operating procedures for complex industrial systems such as Nuclear Power Plants (NPPs). Coupled with a friendly graphical interface, this integrated tool includes a database, a procedure editor and a procedure executor. A three-layer hierarchy, consisting of mission, process and node, is adopted to express the complexity of an operating procedure. There are 10 kinds of node: entrance, exit, hint, manual input, detector, actuator, data treatment, branch, judgment and plug-in. The computerized operating procedure senses and actuates the actual industrial system through an interface based on IVI-COM (Interchangeable Virtual Instrumentation-Component Object Model) technology. A prototype of this human interface toolkit has been applied to develop a simple computerized operating procedure for a simulated NPP. (author)
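
    A hedged Python sketch of the three-layer hierarchy and node types described above is given below; the class and field names are assumptions made for illustration and do not reflect the toolkit's actual IVI-COM based implementation.

```python
# Hedged sketch of the mission/process/node hierarchy from the record.
# Names and fields are illustrative assumptions, not the toolkit's schema.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class NodeType(Enum):
    ENTRANCE = "entrance"
    EXIT = "exit"
    HINT = "hint"
    MANUAL_INPUT = "manual input"
    DETECTOR = "detector"
    ACTUATOR = "actuator"
    DATA_TREATMENT = "data treatment"
    BRANCH = "branch"
    JUDGMENT = "judgment"
    PLUG_IN = "plug-in"

@dataclass
class Node:
    kind: NodeType
    description: str

@dataclass
class Process:
    name: str
    nodes: List[Node] = field(default_factory=list)

@dataclass
class Mission:
    name: str
    processes: List[Process] = field(default_factory=list)

startup = Mission("plant startup", [
    Process("confirm initial conditions", [
        Node(NodeType.ENTRANCE, "begin procedure"),
        Node(NodeType.DETECTOR, "read primary loop pressure"),
        Node(NodeType.JUDGMENT, "is pressure within limits?"),
        Node(NodeType.EXIT, "end procedure"),
    ]),
])
print(startup)
```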

  13. Falling Less in Kansas: Development of a Fall Risk Reduction Toolkit

    Directory of Open Access Journals (Sweden)

    Teresa S. Radebaugh

    2011-01-01

    Full Text Available Falls are a serious health risk for older adults. But for those living in rural and frontier areas of the USA, the risks are higher because of limited access to health care providers and resources. This study employed a community-based participatory research approach to develop a fall prevention toolkit to be used by residents of rural and frontier areas without the assistance of health care providers. Qualitative data were gathered from both key informant interviews and focus groups with a broad range of participants. Data analysis revealed that to be effective and accepted, the toolkit should be not only evidence based but also practical, low-cost, self-explanatory, and usable without the assistance of a health care provider. Materials must be engaging, visually interesting, empowering, sensitive to reading level, and appropriate for low-vision users. These findings should be useful to other researchers developing education and awareness materials for older adults in rural areas.

  14. Supporting safe driving with arthritis: developing a driving toolkit for clinical practice and consumer use.

    Science.gov (United States)

    Vrkljan, Brenda H; Cranney, Ann; Worswick, Julia; O'Donnell, Siobhan; Li, Linda C; Gélinas, Isabelle; Byszewski, Anna; Man-Son-Hing, Malcolm; Marshall, Shawn

    2010-01-01

    We conducted a series of focus groups to explore the information needs of clinicians and consumers related to arthritis and driving. An open coding analysis identified common themes across both consumer and clinician-based focus groups that underscored the importance of addressing driving-related concerns and the challenges associated with assessing safety. The results revealed that although driving is critical for maintaining independence and community mobility, drivers with arthritis experience several problems that can affect safe operation of a motor vehicle. Findings from this study are part of a broader research initiative that will inform the development of the Arthritis and Driving toolkit. This toolkit outlines strategies to support safe mobility for people with arthritis and will be an important resource in the coming years given the aging population.

  15. The Arabic culture of Jordan and its impacts on a wider Jordanian adoption of business continuity management.

    Science.gov (United States)

    Sawalha, Ihab H; Meaton, Julia

    2012-01-01

    Culture is important to individuals and societies, as well as organisations. Failing to address cultural aspects will hinder the wider adoption and development of business continuity management (BCM) and will subsequently increase the vulnerabilities of organisations to crises, disasters and business interruptions. Three main issues are discussed in this paper. The first is the background to culture and the characteristics of the Jordanian culture. Secondly, the influence of the Arab culture on the wider adoption and development of BCM in Jordan is considered. Thirdly, the paper looks at potential factors that underpin the role of culture in the BCM process in Jordan. These issues are significant, as they represent the characteristics and influence of the Arab culture. This paper contributes to the understanding of the significance of culture in the adoption and development of BCM for organisations operating in Jordan and in the Arab world more generally. It also highlights current cultural changes and trends taking place in the Arab world in a time of huge political instability in the Middle East and Arab countries.

  16. HYDROGEOLOGICAL AND HYDROGEOCHEMICAL CHARACTERISTICS OF A WIDER AREA OF THE REGIONAL WELL FIELD EASTERN SLAVONIA – SIKIREVCI

    Directory of Open Access Journals (Sweden)

    Jasna Kopić

    2016-10-01

    Full Text Available This paper establishes hydrogeological and hydrogeochemical characteristics of a wider area of the regional well field Eastern Slavonia - Sikirevci. The research was conducted based on data gathered from the area of the Federation of Bosnia and Herzegovina and the Republic of Croatia. The aquifer Velika Kopanica is situated at the territory of the Republic of Croatia in the triangular region formed between Kopanica, Gundinci and Kruševica. The River Sava partially flows through it and the aquifer extends beneath the river to the territory of the Federation of Bosnia and Herzegovina from Donji Svilaj in the West to Domaljevac in the East where its yield is the highest. The thickness of the aquifer decreases towards the water body Odžak. It was determined that the groundwater which is extracted from wells of the wider area of the regional well field contains iron, manganese, natural ammonia and arsenic in values exceeding the maximum allowable concentration for drinking water. The increased values of these parameters are a result of mineral composition and reductive conditions in the aquifer environment. By means of a multivariate statistic cluster analysis, an overview of groups of elements is provided based on geochemical affinity and/or origin.

  17. A wider pelvis does not increase locomotor cost in humans, with implications for the evolution of childbirth.

    Directory of Open Access Journals (Sweden)

    Anna G Warrener

    Full Text Available The shape of the human female pelvis is thought to reflect an evolutionary trade-off between two competing demands: a pelvis wide enough to permit the birth of large-brained infants, and narrow enough for efficient bipedal locomotion. This trade-off, known as the obstetrical dilemma, is invoked to explain the relative difficulty of human childbirth and differences in locomotor performance between men and women. The basis for the obstetrical dilemma is a standard static biomechanical model that predicts wider pelves in females increase the metabolic cost of locomotion by decreasing the effective mechanical advantage of the hip abductor muscles for pelvic stabilization during the single-leg support phase of walking and running, requiring these muscles to produce more force. Here we experimentally test this model against a more accurate dynamic model of hip abductor mechanics in men and women. The results show that pelvic width does not predict hip abductor mechanics or locomotor cost in either women or men, and that women and men are equally efficient at both walking and running. Since a wider birth canal does not increase a woman's locomotor cost, and because selection for successful birthing must be strong, other factors affecting maternal pelvic and fetal size should be investigated in order to help explain the prevalence of birth complications caused by a neonate too large to fit through the birth canal.
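
    The "standard static biomechanical model" being tested can be summarized by a single moment balance about the stance-side hip during single-leg support (a textbook simplification quoted for orientation, not the authors' dynamic model):

```latex
F_{\mathrm{abd}}\, r_{\mathrm{abd}} \approx W\, r_{\mathrm{bw}}
\quad\Longrightarrow\quad
F_{\mathrm{abd}} \approx W\,\frac{r_{\mathrm{bw}}}{r_{\mathrm{abd}}},
\qquad
\mathrm{EMA} = \frac{r_{\mathrm{abd}}}{r_{\mathrm{bw}}}
```

    where W is the supported body weight, r_bw its moment arm about the hip joint centre, and r_abd the abductor muscle moment arm. The static argument is that a wider pelvis lengthens r_bw, lowering the effective mechanical advantage (EMA) and raising the required abductor force; the study's dynamic measurements find that pelvic width does not in fact predict these quantities or locomotor cost.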

  18. Geo-Enabled, Mobile Services

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard

    2006-01-01

    We are witnessing the emergence of a global infrastructure that enables the widespread deployment of geo-enabled, mobile services in practice. At the same time, the research community has also paid increasing attention to data management aspects of mobile services. This paper offers me...

  19. Wider than the Sky

    Science.gov (United States)

    Barbieri, Richard

    2015-01-01

    More has been learned about the human brain in the past few decades than in the whole prior history of humanity. In this article Richard Barbieri considers learning and the brain from a few different perspectives. He begins by examining the practice of neuroscience itself and what was understood about the brain before neuroscience. This leads to a…

  20. Towards a wider dialogue

    CERN Multimedia

    2012-01-01

    This week, I had the rewarding experience of taking part in a Wilton Park meeting examining three very different world-views: science, philosophy and theology. Wilton Park describes itself as a forum for analysing and advancing the agenda on global policy challenges, and over the years it has developed an enviable reputation for delivering authoritative reports drawn from bringing international experts together under the same roof for two days to discuss issues of topical relevance.   Participation is by invitation and there are no observers: everyone is there because they have something to bring to the discussion. Wilton Park reports always have their finger on the zeitgeist, appropriately, perhaps, for an institution born of Winston Churchill’s vision for reconciliation and dialogue in post-war Europe. When I learned that Wilton Park was running a series of meetings examining the role of religion in modern society, and that it was looking at the possibility of holding an event in...

  1. TMVA (Toolkit for Multivariate Analysis) new architectures design and implementation.

    CERN Document Server

    Zapata Mesa, Omar Andres

    2016-01-01

    The Toolkit for Multivariate Analysis (TMVA) is a package in ROOT providing machine learning algorithms for classification and regression of events in detectors. In TMVA, we are developing new high-level algorithms for multivariate analysis such as cross-validation, hyperparameter optimization, and variable importance estimation. Almost all of these algorithms are computationally expensive and designed to process huge amounts of data, so it is very important to exploit parallel computing technologies to reduce processing times.
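
    The high-level algorithms named above (cross-validation, hyperparameter optimization) can be illustrated generically. The sketch below uses scikit-learn as a stand-in, purely to show what these steps do; it is not TMVA's C++/ROOT interface.

```python
# Generic illustration of cross-validated hyperparameter optimization for a
# classification task, using scikit-learn as a stand-in for TMVA's
# higher-level algorithms (this is not the TMVA API).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

param_grid = {"n_estimators": [100, 200], "max_depth": [2, 3]}
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid, cv=5, n_jobs=-1)  # folds evaluated in parallel
search.fit(X, y)

print("best parameters:", search.best_params_)
print(f"best cross-validated accuracy: {search.best_score_:.3f}")
```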

  2. SwingStates: Adding state machines to Java and the Swing toolkit

    OpenAIRE

    Appert , Caroline; Beaudouin-Lafon , Michel

    2008-01-01

    This article describes SwingStates, a Java toolkit designed to facilitate the development of graphical user interfaces and bring advanced interaction techniques to the Java platform. SwingStates is based on the use of finite-state machines specified directly in Java to describe the behavior of interactive systems. State machines can be used to redefine the behavior of existing Swing widgets or, in combination with a new canvas widget that features a rich graphical mode...
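
    As a language-neutral illustration of the state-machine idea behind SwingStates (the toolkit itself targets Java and Swing), here is a minimal Python sketch of a drag-interaction state machine; the states, events and actions are assumptions for illustration only.

```python
# Minimal sketch of a finite-state machine describing a drag interaction,
# in the spirit of SwingStates (which specifies such machines in Java for
# Swing widgets). States, events and actions are illustrative assumptions.
class DragMachine:
    def __init__(self):
        self.state = "idle"
        # (state, event) -> (next state, action)
        self.transitions = {
            ("idle", "press"):       ("dragging", lambda: print("start drag")),
            ("dragging", "move"):    ("dragging", lambda: print("move object")),
            ("dragging", "release"): ("idle",     lambda: print("drop object")),
        }

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state, action = self.transitions[key]
            action()

machine = DragMachine()
for event in ["press", "move", "move", "release"]:
    machine.handle(event)
```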

  3. Development of a Tailored Methodology and Forensic Toolkit for Industrial Control Systems Incident Response

    Science.gov (United States)

    2014-06-01

    Thesis by Nicholas B. Carr, June 2014. The indexed record text consists only of title-page, report-documentation-page and reference-list fragments (including citations to the VDE Kongress, 2004, and NIST Special Publication 800-82 on industrial control systems); no abstract is available.

  4. Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms

    Science.gov (United States)

    1998-01-01

    Only fragments of the report text are indexed: they note that some haptic devices are exoskeletal in nature, either flexible (such as a glove or suit worn by the user) or rigid (such as jointed linkages), that the effectiveness of using force control needs to be investigated, and that the MAGIC Toolkit can be used to develop sensory tasks for rehabilitative medicine; the remainder is reference-list debris.

  5. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high-fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals, and how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/open source project on GitHub.
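
    Trick schedules user models at fixed rates and integrates their state over time. The tiny loop below is only a conceptual stand-in for that pattern (Trick itself is a C/C++ framework with code generation and its own integrators), showing a time-stepped physics model in schematic form.

```python
# Conceptual stand-in for a time-stepped physics model of the kind Trick
# schedules and integrates; this is not Trick's API, only the pattern.
def run(dt=0.01, t_end=2.0):
    t, pos, vel = 0.0, 0.0, 10.0   # simple one-dimensional ballistic state
    g = -9.81
    while t < t_end:
        vel += g * dt              # derivative evaluation + explicit Euler step
        pos += vel * dt
        t += dt
    return t, pos, vel

t, pos, vel = run()
print(f"t = {t:.2f} s, pos = {pos:.2f} m, vel = {vel:.2f} m/s")
```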

  6. Toward genome-enabled mycology.

    Science.gov (United States)

    Hibbett, David S; Stajich, Jason E; Spatafora, Joseph W

    2013-01-01

    Genome-enabled mycology is a rapidly expanding field that is characterized by the pervasive use of genome-scale data and associated computational tools in all aspects of fungal biology. Genome-enabled mycology is integrative and often requires teams of researchers with diverse skills in organismal mycology, bioinformatics and molecular biology. This issue of Mycologia presents the first complete fungal genomes in the history of the journal, reflecting the ongoing transformation of mycology into a genome-enabled science. Here, we consider the prospects for genome-enabled mycology and the technical and social challenges that will need to be overcome to grow the database of complete fungal genomes and enable all fungal biologists to make use of the new data.

  7. An adaptive toolkit for image quality evaluation in system performance test of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, Guozhi; Petrov, Dimitar; Marshall, Nicholas; Bosmans, Hilde

    2017-03-01

    Digital breast tomosynthesis (DBT) is a relatively new diagnostic imaging modality for women. Currently, various models of DBT systems are available on the market and the number of installations is rapidly increasing. EUREF, the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services, has proposed a preliminary Guideline - protocol for the quality control of the physical and technical aspects of digital breast tomosynthesis systems, with an ultimate aim of providing limiting values guaranteeing proper performance for different applications of DBT. In this work, we introduce an adaptive toolkit developed in accordance with this guideline to facilitate the process of image quality evaluation in DBT performance test. This toolkit implements robust algorithms to quantify various technical parameters of DBT images and provides a convenient user interface in practice. Each test is built into a separate module with configurations set corresponding to the European guideline, which can be easily adapted to different settings and extended with additional tests. This toolkit largely improves the efficiency for image quality evaluation of DBT. It is also going to evolve with the development of protocols in quality control of DBT systems.

  8. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or ftp interfaces. Empirical models are mostly fortran based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
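
    The layered access pattern described above (probe local directories, then a remote database, then an FTP server) can be sketched generically; the fetch_* helpers below are placeholders, not DaViTpy's real module layout or function names.

```python
# Generic sketch of the layered data-access pattern described in the record:
# try local files first, then a remote database, then an FTP server.
# The fetch_* helpers are placeholders, not DaViTpy functions.
import os

def fetch_local(path):
    return open(path, "rb").read() if os.path.exists(path) else None

def fetch_database(key):
    return None  # placeholder: query the remote NoSQL store here

def fetch_ftp(key):
    return None  # placeholder: download from the FTP server here

def get_data(key, local_path):
    for source in (lambda: fetch_local(local_path),
                   lambda: fetch_database(key),
                   lambda: fetch_ftp(key)):
        data = source()
        if data is not None:
            return data
    raise FileNotFoundError(f"no source could provide {key}")
```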

  9. Risk assessment of chemicals in foundries: The International Chemical Toolkit pilot-project

    International Nuclear Information System (INIS)

    Ribeiro, Marcela G.; Filho, Walter R.P.

    2006-01-01

In Brazil, problems regarding protection from hazardous substances in small enterprises are similar to those observed in many other countries. Looking for a simple tool to assess and control such exposures, FUNDACENTRO started a pilot project in 2005 to implement the International Chemical Control Toolkit. During a series of visits to foundries, it was observed that although many changes have occurred in foundry technology, occupational exposures to silica dust and metal fumes continue to occur, owing to a lack of perception of occupational exposure in the work environment. After introducing the Chemical Toolkit concept to the foundry work group, it was possible to show that activities undertaken to improve the management of chemicals according to the Toolkit concept will support companies in fulfilling government legislation related to chemical management, occupational health and safety, and environmental impact. In the following meetings, the foundry work group and the FUNDACENTRO research team will identify 'inadequate work situations'. Based on the Chemical Toolkit, improvement measures will be proposed. Afterwards, a survey will verify the efficiency of those measures in the control of hazards and, consequently, in the management of chemicals. This step is now in progress.

  10. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia.

    Science.gov (United States)

    Haines, Seena L; Summa, Maria A; Peeters, Michael J; Dy-Boarman, Eliza A; Boyle, Jaclyn A; Clifford, Kalin M; Willson, Megan N

    2017-09-01

The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. Eighteen institutions provided materials; five provided materials describing didactic coursework; over fifteen provided materials for academia-focused Advanced Pharmacy Practice Experiences (APPEs), while one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. Pharmacy faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents with the academic pillars of teaching, scholarship, and service is critical for the future success of the academy. Published by Elsevier Inc.

  11. Improving primary care for persons with spinal cord injury: Development of a toolkit to guide care.

    Science.gov (United States)

    Milligan, James; Lee, Joseph; Hillier, Loretta M; Slonim, Karen; Craven, Catharine

    2018-05-07

To identify a set of essential components of primary care for patients with spinal cord injury (SCI) for inclusion in a point-of-practice toolkit for primary care practitioners (PCPs), and to identify the essential elements of SCI care that are required in primary care and those that should be the focus of specialist care. Modified Delphi consensus process; survey methodology. Primary care. Three family physicians, six specialist physicians, and five interdisciplinary health professionals completed surveys. Importance of care elements for inclusion in the toolkit (9-point scale: 1 = lowest level of importance, 9 = greatest level of importance) and identification of the most responsible physician (family physician, specialist) for completing key categories of care. Open-ended comments were solicited. There was consensus between the respondent groups on the level of importance of various care elements. Mean importance scores were highest for autonomic dysreflexia, pain, and skin care and lowest for preventive care, social issues, and vital signs. Although there was agreement across all respondents that family physicians should assume responsibility for assessing mental health, there was variability in who should be responsible for other care categories. Comments were related to the need for shared care approaches and capacity building, and to lack of knowledge and specialized equipment as barriers to optimal care. This study identified important components of SCI care to be included in a point-of-practice toolkit to facilitate primary care for persons with SCI.

  12. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    Science.gov (United States)

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer-aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore), and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. The Open Drug Discovery Toolkit is released under a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
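As a conceptual illustration of the machine-learning scoring functions mentioned above (RF-Score and NNScore), the sketch below trains a random forest regressor on protein-ligand descriptors using scikit-learn. It is not ODDT's API, and the descriptors and affinities are random placeholders standing in for real featurized complexes:

```python
# Conceptual sketch of an RF-Score-style scoring function: a random forest
# regressor mapping protein-ligand interaction descriptors to binding affinity.
# Not ODDT's API; all data below are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X_train = rng.random((200, 36))   # e.g. 36 element-pair contact counts per training complex
y_train = rng.random(200) * 10    # e.g. experimental pKd/pKi affinities

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)       # learn the descriptor -> affinity mapping

X_new = rng.random((5, 36))       # descriptors for docked poses to be rescored
print(model.predict(X_new))       # predicted affinities used to rank the poses
```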

  13. Midwives in medical student and resident education and the development of the medical education caucus toolkit.

    Science.gov (United States)

    Radoff, Kari; Nacht, Amy; Natch, Amy; McConaughey, Edie; Salstrom, Jan; Schelling, Karen; Seger, Suzanne

    2015-01-01

    Midwives have been involved formally and informally in the training of medical students and residents for many years. Recent reductions in resident work hours, emphasis on collaborative practice, and a focus on midwives as key members of the maternity care model have increased the involvement of midwives in medical education. Midwives work in academic settings as educators to teach the midwifery model of care, collaboration, teamwork, and professionalism to medical students and residents. In 2009, members of the American College of Nurse-Midwives formed the Medical Education Caucus (MECA) to discuss the needs of midwives teaching medical students and residents; the group has held a workshop annually over the last 4 years. In 2014, MECA workshop facilitators developed a toolkit to support and formalize the role of midwives involved in medical student and resident education. The MECA toolkit provides a roadmap for midwives beginning involvement and continuing or expanding the role of midwives in medical education. This article describes the history of midwives in medical education, the development and growth of MECA, and the resulting toolkit created to support and formalize the role of midwives as educators in medical student and resident education, as well as common challenges for the midwife in academic medicine. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health. © 2015 by the American College of Nurse-Midwives.

  14. X-CSIT: a toolkit for simulating 2D pixel detectors

    Science.gov (United States)

    Joy, A.; Wing, M.; Hauf, S.; Kuster, M.; Rüter, T.

    2015-04-01

    A new, modular toolkit for creating simulations of 2D X-ray pixel detectors, X-CSIT (X-ray Camera SImulation Toolkit), is being developed. The toolkit uses three sequential simulations of detector processes which model photon interactions, electron charge cloud spreading with a high charge density plasma model and common electronic components used in detector readout. In addition, because of the wide variety in pixel detector design, X-CSIT has been designed as a modular platform so that existing functions can be modified or additional functionality added if the specific design of a detector demands it. X-CSIT will be used to create simulations of the detectors at the European XFEL, including three bespoke 2D detectors: the Adaptive Gain Integrating Pixel Detector (AGIPD), Large Pixel Detector (LPD) and DePFET Sensor with Signal Compression (DSSC). These simulations will be used by the detector group at the European XFEL for detector characterisation and calibration. For this purpose, X-CSIT has been integrated into the European XFEL's software framework, Karabo. This will further make it available to users to aid with the planning of experiments and analysis of data. In addition, X-CSIT will be released as a standalone, open source version for other users, collaborations and groups intending to create simulations of their own detectors.
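The three sequential simulation stages described above lend themselves to a simple pipeline abstraction. The Python sketch below illustrates that structure only; the stage names and placeholder physics are invented and do not reflect X-CSIT's implementation:

```python
# Minimal sketch (stage names and placeholder physics invented) of the three-stage,
# modular pipeline structure described above:
# photon interactions -> charge cloud spreading -> electronics readout.
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]

def photon_interaction(event: Dict) -> Dict:
    event["deposits"] = event.get("photons", [])               # placeholder interaction model
    return event

def charge_spreading(event: Dict) -> Dict:
    event["charge_clouds"] = event.get("deposits", [])         # placeholder plasma model
    return event

def electronics_readout(event: Dict) -> Dict:
    event["adc_counts"] = len(event.get("charge_clouds", []))  # placeholder readout electronics
    return event

def run_pipeline(event: Dict, stages: List[Stage]) -> Dict:
    for stage in stages:  # stages can be swapped or extended to match a specific detector design
        event = stage(event)
    return event

print(run_pipeline({"photons": [1, 2, 3]},
                   [photon_interaction, charge_spreading, electronics_readout]))
```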

  15. FATES: a flexible analysis toolkit for the exploration of single-particle mass spectrometer data

    Science.gov (United States)

    Sultana, Camille M.; Cornwell, Gavin C.; Rodriguez, Paul; Prather, Kimberly A.

    2017-04-01

    Single-particle mass spectrometer (SPMS) analysis of aerosols has become increasingly popular since its invention in the 1990s. Today many iterations of commercial and lab-built SPMSs are in use worldwide. However, supporting analysis toolkits for these powerful instruments are outdated, have limited functionality, or are versions that are not available to the scientific community at large. In an effort to advance this field and allow better communication and collaboration between scientists, we have developed FATES (Flexible Analysis Toolkit for the Exploration of SPMS data), a MATLAB toolkit easily extensible to an array of SPMS designs and data formats. FATES was developed to minimize the computational demands of working with large data sets while still allowing easy maintenance, modification, and utilization by novice programmers. FATES permits scientists to explore, without constraint, complex SPMS data with simple scripts in a language popular for scientific numerical analysis. In addition FATES contains an array of data visualization graphic user interfaces (GUIs) which can aid both novice and expert users in calibration of raw data; exploration of the dependence of mass spectral characteristics on size, time, and peak intensity; and investigations of clustered data sets.

  16. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    Science.gov (United States)

    Jones, Rachael M; Nicas, Mark

    2006-03-01

COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit), proposed by the International Labour Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were consistently small; rather than comparisons with occupational exposure limits, we argue that these minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.
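Written as a formula (with notation introduced here for clarity, not taken from the paper), the minimal margin defined above for a given hazard band is:

```latex
% Minimal margin of safety for one hazard band
% C_{tox,min}: lowest airborne concentration producing the toxicological endpoint in animals
% C_{band,max}: highest airborne concentration permitted by the corresponding exposure band
M_{\mathrm{min}} = \frac{C_{\mathrm{tox,min}}}{C_{\mathrm{band,max}}}
```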

  17. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this

  18. Template-based education toolkit for mobile platforms

    Science.gov (United States)

    Golagani, Santosh Chandana; Esfahanian, Moosa; Akopian, David

    2012-02-01

Mobile phones are now the most widely used portable devices; they evolve quickly, adding new features and improving user experiences. The latest generation of hand-held devices, smartphones, is equipped with superior memory, cameras, and rich multimedia features, empowering people to use their phones not only as communication tools but also for entertainment. With many young students showing interest in mobile application development, novel learning methods are needed that can adapt to fast technology changes and introduce students to application development. Mobile phones have become commonplace, and the engineering community incorporates them in various solutions. To overcome the limitations of conventional undergraduate electrical engineering (EE) education, this paper explores the concept of template-based education in mobile phone programming. The concept is based on developing small exercise templates that students can manipulate and revise for a quick hands-on introduction to application development and integration. The Android platform is used as a popular open-source environment for application development. The exercises relate to image processing topics typically studied by many students. The goal is to enhance conventional courses by incorporating short hands-on learning modules.

  19. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    Science.gov (United States)

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. Copyright © 2011 Wiley Periodicals, Inc.
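A brief usage sketch of the workflow described above follows. The file names are placeholders for a user's own topology and trajectory files; the calls shown (Universe construction, CHARMM-style atom selection, trajectory iteration) follow MDAnalysis's documented interface:

```python
# Brief usage sketch of the MDAnalysis workflow described above; the topology and
# trajectory filenames are placeholders for a user's own CHARMM/Gromacs/Amber files.
import MDAnalysis as mda
import numpy as np

u = mda.Universe("topology.psf", "trajectory.dcd")   # load topology + trajectory
ca = u.select_atoms("protein and name CA")           # CHARMM-style selection syntax

radii = []
for ts in u.trajectory:                              # iterate over trajectory frames
    radii.append(ca.radius_of_gyration())            # per-frame radius of gyration

print(f"{len(radii)} frames, mean Rg = {np.mean(radii):.2f} Å")
```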

  20. New insights in geodynamics of wider Zagreb area: results of GPS measurements series 2009 on Zagreb Geodynamic Network

    Science.gov (United States)

    Pribičević, Boško; Medak, Damir; ĐApo, Almin

    2010-05-01

    The Geodynamic GPS-Network of the City of Zagreb represents the longest and the most intensive research effort in the field of geodynamics in Croatia. Since the establishment of the Network in 1997, several series of precise GPS measurements have been conducted on specially stabilized points of Geodynamical Network of City of Zagreb with purpose of investigation of tectonic movements and related seismic activity of the wider area of the City of Zagreb. The Network has been densified in 2005 in the most active region of northeastern Mount Medvednica. Since then, several GPS campaigns have been conducted including the last in summer 2009. The paper presents latest results of geodynamic movements of the network points.

  1. Computer Security Systems Enable Access.

    Science.gov (United States)

    Riggen, Gary

    1989-01-01

    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  2. How GNSS Enables Precision Farming

    Science.gov (United States)

    2014-12-01

    Precision farming: Feeding a Growing Population Enables Those Who Feed the World. Immediate and Ongoing Needs - population growth (more to feed) - urbanization (decrease in arable land) Double food production by 2050 to meet world demand. To meet thi...

  3. Implementation of the Good School Toolkit in Uganda: a quantitative process evaluation of a successful violence prevention program.

    Science.gov (United States)

    Knight, Louise; Allen, Elizabeth; Mirembe, Angel; Nakuti, Janet; Namy, Sophie; Child, Jennifer C; Sturgess, Joanna; Kyegombe, Nambusi; Walakira, Eddy J; Elbourne, Diana; Naker, Dipak; Devries, Karen M

    2018-05-09

The Good School Toolkit, a complex behavioural intervention designed by Raising Voices, a Ugandan NGO, reduced past-week physical violence from school staff to primary students by an average of 42% in a recent randomised controlled trial. This process evaluation quantitatively examines what was implemented across the twenty-one intervention schools, variations in school prevalence of violence after the intervention, factors that influence exposure to the intervention, and factors associated with students' experience of physical violence from staff at study endline. Implementation measures were captured prospectively in the twenty-one intervention schools over four school terms from 2012 to 2014, and Toolkit exposure was captured in the student (n = 1921) and staff (n = 286) endline cross-sectional surveys in 2014. Implementation measures and the prevalence of violence are summarised across schools and are assessed for correlation using Spearman's rank correlation coefficient. Regression models are used to explore individual factors associated with Toolkit exposure and with physical violence at endline. School prevalence of past-week physical violence from staff against students ranged from 7% to 65% across schools at endline. Schools with higher mean levels of teacher Toolkit exposure had larger decreases in violence during the study. Students in schools categorised as implementing a 'low' number of program school-led activities reported less exposure to the Toolkit. Higher student Toolkit exposure was associated with decreased odds of experiencing physical violence from staff (OR: 0.76, 95% CI: 0.67-0.86, p < 0.05). Effectiveness of the Toolkit may be increased by further targeting and supporting teachers' engagement with girls and students with mental health difficulties. The trial is registered at clinicaltrials.gov, NCT01678846, August 24th 2012.
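The school-level correlation analysis described above can be illustrated with a small, entirely hypothetical example (the numbers below are made up and are not the study's data):

```python
# Illustrative calculation (made-up numbers) of the kind of school-level
# correlation described above, using Spearman's rank correlation.
from scipy.stats import spearmanr

# One value per intervention school (hypothetical): mean teacher Toolkit exposure
# and the change in past-week physical violence prevalence over the study.
mean_teacher_exposure = [0.9, 0.7, 0.8, 0.4, 0.6, 0.3, 0.5]
change_in_violence    = [-30, -25, -28, -10, -18, -5, -12]   # percentage points

rho, p_value = spearmanr(mean_teacher_exposure, change_in_violence)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```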

  4. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit

    Directory of Open Access Journals (Sweden)

    Jon Smart

    2018-02-01

Full Text Available Introduction: Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. Methods: As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Results: Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Conclusion: Residents from across the world collaborated and convened to reach a consensus on high-yield—and potentially high-impact—lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  5. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit.

    Science.gov (United States)

    Chung, Arlene S; Smart, Jon; Zdradzinski, Michael; Roth, Sarah; Gende, Alecia; Conroy, Kylie; Battaglioli, Nicole

    2018-03-01

Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Residents from across the world collaborated and convened to reach a consensus on high-yield, and potentially high-impact, lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  6. The Identification of Potential Resilient Estuary-based Enterprises to Encourage Economic Empowerment in South Africa: a Toolkit Approach

    Directory of Open Access Journals (Sweden)

    Rebecca Bowd

    2012-09-01

Full Text Available It has been argued that ecosystem services can be used as the foundation to provide economic opportunities to empower the disadvantaged. The Ecosystem Services Framework (ESF) approach for poverty alleviation, which balances resource conservation and human resource use, has received much attention in the literature. However, few projects have successfully achieved both conservation and economic objectives. This is partly due to there being a hiatus between theory and practice, due to the absence of tools that help make the transition from conceptual frameworks and theory to practical integration of ecosystem services into decision making. To address this hiatus, an existing conceptual framework for analyzing the robustness of social-ecological systems was translated into a practical toolkit to help understand the complexity of social-ecological systems (SES). The toolkit can be used by a diversity of stakeholders as a decision-making aid for assessing ecosystem services supply and demand and associated enterprise opportunities. The toolkit is participatory and combines a generic "top-down" scientific approach with a case-specific "bottom-up" approach. It promotes a shared understanding of the utilization of ecosystem services, which is the foundation of identifying resilient enterprises. The toolkit comprises four steps: (i) ecosystem services supply and demand assessment; (ii) roles identification; (iii) enterprise opportunity identification; and (iv) enterprise risk assessment, and was tested at two estuary study sites. Implementation of the toolkit requires the populating of preprogrammed Excel worksheets through the holding of workshops that are attended by stakeholders associated with the ecosystems. It was concluded that for an enterprise to be resilient, it must be resilient at an external SES level, which the toolkit addresses, and at an internal business functioning level, e.g., social dynamics among personnel, skills, and literacy

  7. A system for rapid prototyping of hearts with congenital malformations based on the medical imaging interaction toolkit (MITK)

    Science.gov (United States)

    Wolf, Ivo; Böttger, Thomas; Rietdorf, Urte; Maleike, Daniel; Greil, Gerald; Sieverding, Ludger; Miller, Stephan; Mottl-Link, Sibylle; Meinzer, Hans-Peter

    2006-03-01

Precise knowledge of the individual cardiac anatomy is essential for diagnosis and treatment of congenital heart disease. Complex malformations of the heart can best be comprehended not from images but from anatomic specimens. Physical models can be created from data using rapid prototyping techniques, e.g., laser sintering or 3D printing. We have developed a system for obtaining data that show the relevant cardiac anatomy from high-resolution CT/MR images and are suitable for rapid prototyping. The challenge is to preserve all relevant details unaltered in the produced models. The main anatomical structures of interest are the four heart cavities (atria, ventricles), the valves and the septum separating the cavities, and the great vessels. These can be shown either by reproducing the morphology itself or by producing a model of the blood pool, thus creating a negative of the morphology. Algorithmically, the key issue is segmentation. In practice, tools that allow the cardiologist or cardiac surgeon to interactively check and correct the segmentation are even more important, owing to the complex, irregular anatomy and imaging artefacts. The paper presents the algorithmic and interactive processing steps implemented in the system, which is based on the open-source Medical Imaging Interaction Toolkit (MITK, www.mitk.org). It is shown how the principles used in MITK enable the system to be assembled from modules (functionalities) developed independently of each other. The system makes it possible to produce models of the heart (and other anatomic structures) of individual patients, as well as to reproduce unique specimens from pathology collections for teaching purposes.

  8. A Web-based Multi-user Interactive Visualization System For Large-Scale Computing Using Google Web Toolkit Technology

    Science.gov (United States)

    Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.

    2009-12-01

We have created a web-based, interactive system for multi-user collaborative visualization of large data sets (on the order of terabytes) that allows users in geographically disparate locations to simultaneously and collectively visualize large data sets over the Internet. By leveraging asynchronous JavaScript and XML (AJAX) web development paradigms via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide remote, web-based users with a web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota, which provides high-resolution visualizations on the order of 15 million pixels. In the current version of our software, we have implemented a new, highly extensible back-end framework built around HTTP "server push" technology to provide a rich collaborative environment and a smooth end-user experience. Furthermore, the web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. New features in the current version include the ability for (1) users to launch multiple visualizations, (2) a user to invite one or more other users to view their visualization in real time (multiple observers), (3) users to delegate control aspects of the visualization to others (multiple controllers), and (4) users to engage in collaborative chat and instant messaging with other users within the user interface of the web application. We will explain choices made regarding implementation, overall system architecture and method of operation, and the benefits of an extensible, modular design. We will also discuss future goals and features, and our plans for increasing the scalability of the system, including a discussion of the benefits potentially afforded by a migration of server-side components to the Google App Engine (http://code.google.com/appengine/).

  9. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

Full Text Available Abstract Background: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations that do not require likelihood computations. Results: Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a particle-based sampler, and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, data simulation, and computation of summary statistics, to estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
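For readers unfamiliar with ABC, the simplest of the algorithms listed above, rejection sampling, can be sketched in a few lines. The toy model, prior, and tolerance below are illustrative only and are not ABCtoolbox code:

```python
# Generic ABC rejection sampler (not ABCtoolbox code) illustrating the approach:
# sample parameters from the prior, simulate data, and keep parameters whose
# summary statistic falls within a tolerance of the observed one.
import numpy as np

rng = np.random.default_rng(42)
observed_summary = 2.0                             # e.g. mean of the observed data

def simulate_summary(theta, n=100):
    return rng.normal(theta, 1.0, size=n).mean()   # toy model: Normal(theta, 1)

def abc_rejection(n_draws=100_000, tolerance=0.05):
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-10, 10)               # draw from the prior
        if abs(simulate_summary(theta) - observed_summary) < tolerance:
            accepted.append(theta)                 # approximate posterior sample
    return np.array(accepted)

posterior = abc_rejection()
print(f"accepted {posterior.size} draws, posterior mean ≈ {posterior.mean():.2f}")
```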

  10. OGC® Sensor Web Enablement Standards

    Directory of Open Access Journals (Sweden)

    George Percivall

    2006-09-01

Full Text Available This article provides a high-level overview of and architecture for the Open Geospatial Consortium (OGC) standards activities that focus on sensors, sensor networks, and a concept called the “Sensor Web”. This OGC work area is known as Sensor Web Enablement (SWE). This article has been condensed from "OGC® Sensor Web Enablement: Overview And High Level Architecture," an OGC White Paper by Mike Botts, PhD, George Percivall, Carl Reed, PhD, and John Davidson, which can be downloaded from http://www.opengeospatial.org/pt/15540. Readers interested in greater technical and architecture detail can download and read the OGC SWE Architecture Discussion Paper titled “The OGC Sensor Web Enablement Architecture” (OGC document 06-021r1, http://www.opengeospatial.org/pt/14140).

  11. Employee attitudes towards aggression in persons with dementia: Readiness for wider adoption of person-centered frameworks.

    Science.gov (United States)

    Burshnic, V L; Douglas, N F; Barker, R M

    2018-04-01

Person-centered care, as compared to standard approaches, is a widely accepted, evidence-based approach for managing aggressive behaviour in persons with dementia. The attitudes, beliefs and values of long-term care and mental health nursing employees are important prerequisites to implementing person-centered practices. Research shows that nursing employees typically support person-centered approaches; however, less is known about the attitudes of non-nursing employee groups. Nurse managers and administrators tended to agree with person-centered approaches for managing aggression in dementia, suggesting some prerequisites are in place to support wider adoption of person-centered frameworks. Employees with more resident contact tended to support person-centered approaches the least, suggesting discipline-specific trainings may not be adequate for preparing frontline staff to use person-centered techniques. Attitudes towards aggressive behaviour may be especially varied and contradictory within certain employee groups, providing implications for facility-wide initiatives. Person-centered values and practices should be monitored and reinforced across the organization. Person-centered trainings should be interdisciplinary in nature and focused on care areas, such as mealtime or bathing. Long-term care facilities should consider allowing nurse management and registered nurses to share the burden of direct resident care with frontline employees on a more regular basis. Introduction: Implementing person-centered care requires shared attitudes, beliefs and values among all care employees. Existing research has failed to examine the attitudes of non-nursing employees. Aim: This study examined attitudes towards aggression among nursing and non-nursing employees to address gaps in existing research and assess readiness for wider adoption of person-centered frameworks. Method: The Management of Aggression in People with Dementia Attitude Questionnaire was used to survey

  12. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, Ritesh, E-mail: ritesh@ipr.res.in; Swamy, Rajamannar, E-mail: rajamannar@ipr.res.in; Khirwadkar, Samir, E-mail: sameer@ipr.res.in

    2016-11-15

Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open-source, cross-platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for the critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: High heat flux testing and characterization of the divertor and first wall components is a challenging engineering problem for a tokamak. These components are subject to steady-state and transient heat loads of high magnitude. Therefore, accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFC), the integration of computation and experimental control is an essential requirement. The Experimental Physics and Industrial Control System (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open-source alternative for numerical computations and scripting. We have integrated these two open-source technologies to develop graphical software for a typical high heat flux experiment. The implementation uses EPICS-based tools, namely the IOC (input/output controller) server and Control System Studio (CSS), and Python-based tools, namely NumPy, SciPy, Matplotlib and nose. EPICS and Python are integrated using the PyEpics library. This toolkit is currently in operation at the high heat flux test facility at the Institute for Plasma Research (IPR) and is also useful for experimental labs working in similar research areas. The paper reports the software architectural design, the implementation tools and the rationale for their selection, and their test and validation.
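A minimal sketch of the EPICS-Python coupling described above is given below, using the PyEpics PV interface for channel access reads and writes. The process-variable names and the setpoint formula are hypothetical, not those of the IPR facility or its CHF model:

```python
# Minimal sketch of EPICS-Python coupling via the PyEpics library; the
# process-variable names and setpoint formula are hypothetical placeholders.
from epics import PV

heat_flux_setpoint = PV("HHF:HEATFLUX:SP")      # write a computed setpoint to the IOC
coolant_flow       = PV("HHF:COOLANT:FLOW")     # read back a measured cooling parameter

def target_flow_for(heat_flux_mw_m2: float) -> float:
    # placeholder for a CHF-model-based prediction of the required cooling flow
    return 2.0 + 0.5 * heat_flux_mw_m2

flux = 10.0                                     # MW/m^2, example operating point
heat_flux_setpoint.put(flux)                    # push the setpoint to the control system
print("measured flow:", coolant_flow.get())     # read the live value over channel access
print("predicted required flow:", target_flow_for(flux))
```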

  13. Use of EPICS and Python technology for the development of a computational toolkit for high heat flux testing of plasma facing components

    International Nuclear Information System (INIS)

    Sugandhi, Ritesh; Swamy, Rajamannar; Khirwadkar, Samir

    2016-01-01

Highlights: • An integrated approach to software development for computational processing and experimental control. • Use of open-source, cross-platform, robust and advanced tools for computational code development. • Prediction of optimized process parameters for the critical heat flux model. • Virtual experimentation for high heat flux testing of plasma facing components. - Abstract: High heat flux testing and characterization of the divertor and first wall components is a challenging engineering problem for a tokamak. These components are subject to steady-state and transient heat loads of high magnitude. Therefore, accurate prediction and control of the cooling parameters is crucial to prevent burnout. The prediction of the cooling parameters is based on the numerical solution of the critical heat flux (CHF) model. In a test facility for high heat flux testing of plasma facing components (PFC), the integration of computation and experimental control is an essential requirement. The Experimental Physics and Industrial Control System (EPICS) provides powerful tools for steering controls, data simulation, hardware interfacing and wider usability. Python provides an open-source alternative for numerical computations and scripting. We have integrated these two open-source technologies to develop graphical software for a typical high heat flux experiment. The implementation uses EPICS-based tools, namely the IOC (input/output controller) server and Control System Studio (CSS), and Python-based tools, namely NumPy, SciPy, Matplotlib and nose. EPICS and Python are integrated using the PyEpics library. This toolkit is currently in operation at the high heat flux test facility at the Institute for Plasma Research (IPR) and is also useful for experimental labs working in similar research areas. The paper reports the software architectural design, the implementation tools and the rationale for their selection, and their test and validation.

  14. Evolving the US Climate Resilience Toolkit to Support a Climate-Smart Nation

    Science.gov (United States)

    Tilmes, C.; Niepold, F., III; Fox, J. F.; Herring, D.; Dahlman, L. E.; Hall, N.; Gardiner, N.

    2015-12-01

    Communities, businesses, resource managers, and decision-makers at all levels of government need information to understand and ameliorate climate-related risks. Likewise, climate information can expose latent opportunities. Moving from climate science to social and economic decisions raises complex questions about how to communicate the causes and impacts of climate variability and change; how to characterize and quantify vulnerabilities, risks, and opportunities faced by communities and businesses; and how to make and implement "win-win" adaptation plans at local, regional, and national scales. A broad coalition of federal agencies launched the U.S. Climate Resilience Toolkit (toolkit.climate.gov) in November 2014 to help our nation build resilience to climate-related extreme events. The site's primary audience is planners and decision makers in business, resource management, and government (at all levels) who seek science-based climate information and tools to help them in their near- and long-term planning. The Executive Office of the President assembled a task force of dozens of subject experts from across the 13 agencies of the U.S. Global Change Research Program to guide the site's development. The site's ongoing evolution is driven by feedback from the target audience. For example, based on feedback, climate projections will soon play a more prominent role in the site's "Climate Explorer" tool and case studies. The site's five-step adaptation planning process is being improved to better facilitate people getting started and to provide clear benchmarks for evaluating progress along the way. In this session, we will share lessons learned from a series of user engagements around the nation and evidence that the Toolkit couples climate information with actionable decision-making processes in ways that are helping Americans build resilience to climate-related stressors.

  15. Effect of an educational toolkit on quality of care: a pragmatic cluster randomized trial.

    Science.gov (United States)

    Shah, Baiju R; Bhattacharyya, Onil; Yu, Catherine H Y; Mamdani, Muhammad M; Parsons, Janet A; Straus, Sharon E; Zwarenstein, Merrick

    2014-02-01

    Printed educational materials for clinician education are one of the most commonly used approaches for quality improvement. The objective of this pragmatic cluster randomized trial was to evaluate the effectiveness of an educational toolkit focusing on cardiovascular disease screening and risk reduction in people with diabetes. All 933,789 people aged ≥40 years with diagnosed diabetes in Ontario, Canada were studied using population-level administrative databases, with additional clinical outcome data collected from a random sample of 1,592 high risk patients. Family practices were randomly assigned to receive the educational toolkit in June 2009 (intervention group) or May 2010 (control group). The primary outcome in the administrative data study, death or non-fatal myocardial infarction, occurred in 11,736 (2.5%) patients in the intervention group and 11,536 (2.5%) in the control group (p = 0.77). The primary outcome in the clinical data study, use of a statin, occurred in 700 (88.1%) patients in the intervention group and 725 (90.1%) in the control group (p = 0.26). Pre-specified secondary outcomes, including other clinical events, processes of care, and measures of risk factor control, were also not improved by the intervention. A limitation is the high baseline rate of statin prescribing in this population. The educational toolkit did not improve quality of care or cardiovascular outcomes in a population with diabetes. Despite being relatively easy and inexpensive to implement, printed educational materials were not effective. The study highlights the need for a rigorous and scientifically based approach to the development, dissemination, and evaluation of quality improvement interventions. http://www.ClinicalTrials.gov NCT01411865 and NCT01026688.

  16. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Directory of Open Access Journals (Sweden)

    K Anderson

Full Text Available This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  17. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    Science.gov (United States)

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  18. Decision support toolkit for integrated analysis and design of reclaimed water infrastructure.

    Science.gov (United States)

    Lee, Eun Jung; Criddle, Craig S; Geza, Mengistu; Cath, Tzahi Y; Freyberg, David L

    2018-05-01

    Planning of water reuse systems is a complex endeavor. We have developed a software toolkit, IRIPT (Integrated Urban Reclaimed Water Infrastructure Planning Toolkit) that facilitates planning and design of reclaimed water infrastructure for both centralized and hybrid configurations that incorporate satellite treatment plants (STPs). The toolkit includes a Pipeline Designer (PRODOT) that optimizes routing and sizing of pipelines for wastewater capture and reclaimed water distribution, a Selector (SelWTP) that assembles and optimizes wastewater treatment trains, and a Calculator (CalcBenefit) that estimates fees, revenues, and subsidies of alternative designs. For hybrid configurations, a Locator (LocSTP) optimizes siting of STPs and associated wastewater diversions by identifying manhole locations where the flowrates are sufficient to ensure that wastewater extracted and treated at an adjacent STP can generate the revenue needed to pay for treatment and delivery to customers. Practical local constraints are also applied to screen and identify STP locations. Once suitable sites are selected, System Integrator (ToolIntegrator) identifies a set of centralized and hybrid configurations that: (1) maximize reclaimed water supply, (2) maximize reclaimed water supply while also ensuring a financial benefit for the system, and (3) maximize the net financial benefit for the system. The resulting configurations are then evaluated by an Analyst (SANNA) that uses monetary and non-monetary criteria, with weights assigned to appropriate metrics by a decision-maker, to identify a preferred configuration. To illustrate the structure, assumptions, and use of IRIPT, we apply it to a case study for the city of Golden, CO. The criteria weightings provided by a local decision-maker lead to a preference for a centralized configuration in this case. The Golden case study demonstrates that IRIPT can efficiently analyze centralized and hybrid water reuse configurations and rank them

  19. PS1-29: Resources to Facilitate Multi-site Collaboration: the PRIMER Research Toolkit

    Science.gov (United States)

    Greene, Sarah; Thompson, Ella; Baldwin, Laura-Mae; Neale, Anne Victoria; Dolor, Rowena

    2010-01-01

    repository: www.ResearchToolkit.org, which is comprised of over 120 distinct resources. Conclusions: We are disseminating the ResearchToolkit website via academic and media channels, and identifying options for making it a sustainable resource. Given the dynamic nature of the research enterprise, maintenance and accuracy of a web-based resource is challenging. Still, the positive response to the toolkit suggests that there is high interest in sustaining it. We will demonstrate the Toolkit as part of this conference.

  20. How to create an interface between UrQMD and Geant4 toolkit

    CERN Document Server

    Abdel-Waged, Khaled; Uzhinskii, V.V.

    2012-01-01

    An interface between the UrQMD-1.3cr model (version 1.3 for cosmic air showers) and the Geant4 transport toolkit has been developed. Compared to the current Geant4 (hybrid) hadronic models, this provides the ability to simulate at the microscopic level hadron, nucleus, and anti-nucleus interactions with matter from 0 to 1 TeV with a single transport code. This document provides installation requirements and instructions, as well as class and member function descriptions of the software.

  1. Application of the SHARP Toolkit to Sodium-Cooled Fast Reactor Challenge Problems

    Energy Technology Data Exchange (ETDEWEB)

    Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Yu, Y. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Kim, T. K. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division

    2017-09-30

    The Simulation-based High-efficiency Advanced Reactor Prototyping (SHARP) toolkit is under development by the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign of the U.S. Department of Energy, Office of Nuclear Energy. To better understand and exploit the benefits of advanced modeling simulations, the NEAMS Campaign initiated the “Sodium-Cooled Fast Reactor (SFR) Challenge Problems” task, which include the assessment of hot channel factors (HCFs) and the demonstration of zooming capability using the SHARP toolkit. If both challenge problems are resolved through advanced modeling and simulation using the SHARP toolkit, the economic competitiveness of a SFR can be significantly improved. The efforts in the first year of this project focused on the development of computational models, meshes, and coupling procedures for multi-physics calculations using the neutronics (PROTEUS) and thermal-hydraulic (Nek5000) components of the SHARP toolkit, as well as demonstration of the HCF calculation capability for the 100 MWe Advanced Fast Reactor (AFR-100) design. Testing the feasibility of the SHARP zooming capability is planned in FY 2018. The HCFs developed for the earlier SFRs (FFTF, CRBR, and EBR-II) were reviewed, and a subset of these were identified as potential candidates for reduction or elimination through high-fidelity simulations. A one-way offline coupling method was used to evaluate the HCFs where the neutronics solver PROTEUS computes the power profile based on an assumed temperature, and the computational fluid dynamics solver Nek5000 evaluates the peak temperatures using the neutronics power profile. If the initial temperature profile used in the neutronics calculation is reasonably accurate, the one-way offline method is valid because the neutronics power profile has weak dependence on small temperature variation. In order to get more precise results, the proper temperature profile for initial neutronics calculations was obtained from the
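The one-way offline coupling described above can be sketched conceptually as two sequential solves: a neutronics calculation that produces a power profile from an assumed temperature field, followed by a thermal-hydraulics calculation that evaluates peak temperatures from that power profile. The Python sketch below is purely illustrative; the functions stand in for PROTEUS and Nek5000 runs and are not the SHARP interfaces:

```python
# Conceptual sketch (function names hypothetical, not the SHARP interfaces) of the
# one-way offline coupling described above: neutronics -> thermal-hydraulics.
import numpy as np

def neutronics_power(temperature: np.ndarray) -> np.ndarray:
    # stands in for a PROTEUS run: the power profile depends only weakly on temperature
    shape = 1.0 + 0.1 * np.sin(np.linspace(0.0, np.pi, temperature.size))
    return shape - 1e-4 * (temperature - 600.0)

def thermal_hydraulics_peak_temps(power: np.ndarray) -> np.ndarray:
    # stands in for a Nek5000 run: local temperature rise scales with local power
    return 600.0 + 150.0 * power / power.max()

assumed_temperature = np.full(20, 600.0)                    # initial axial temperature guess [K]
power_profile = neutronics_power(assumed_temperature)       # step 1: neutronics on assumed T
peak_temps = thermal_hydraulics_peak_temps(power_profile)   # step 2: CFD using that power
print("peak temperature estimate [K]:", peak_temps.max())
```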

  2. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pshenichnov, Igor, E-mail: pshenich@fias.uni-frankfurt.d [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Institute for Nuclear Research, Russian Academy of Science, 117312 Moscow (Russian Federation); Botvina, Alexander [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Institute for Nuclear Research, Russian Academy of Science, 117312 Moscow (Russian Federation); Mishustin, Igor [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany); Kurchatov Institute, Russian Research Center, 123182 Moscow (Russian Federation); Greiner, Walter [Frankfurt Institute for Advanced Studies, J.-W. Goethe University, 60438 Frankfurt am Main (Germany)

    2010-03-15

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  3. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    International Nuclear Information System (INIS)

    Pshenichnov, Igor; Botvina, Alexander; Mishustin, Igor; Greiner, Walter

    2010-01-01

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  4. The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.

    Science.gov (United States)

    Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen

    2010-12-21

    There is a huge demand on bioinformaticians to provide their biologists with user-friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. Here we present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases
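
    As a rough, hypothetical illustration of the model-driven idea (this is not MOLGENIS's actual XML schema or generator templates), a small declarative model can be expanded into the boilerplate code that would otherwise be written by hand, for example SQL DDL:

        # Minimal sketch of model-driven code generation using a toy model format;
        # MOLGENIS's real models are XML and its generators emit SQL, Java, R and HTML.

        model = {
            "entity": "Sample",
            "fields": [("id", "INTEGER PRIMARY KEY"),
                       ("donor", "VARCHAR(64)"),
                       ("collection_date", "DATE")],
        }

        def generate_sql(m):
            # Expand the declarative model into a CREATE TABLE statement.
            cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in m["fields"])
            return f"CREATE TABLE {m['entity']} (\n  {cols}\n);"

        print(generate_sql(model))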

  5. Software Toolkits: Practical Aspects of the Internet of Things—A Survey

    OpenAIRE

    Wang, Feng; Hu, Liang; Zhou, Jin; Wu, Yang; Hu, Jiejun; Zhao, Kuo

    2015-01-01

    The Internet of Things (IoT) is neither science fiction nor industry hype; rather it is based on solid technological advances and visions of network ubiquity that are zealously being realized. The paper serves to provide guidance regarding the practical aspects of the IoT. Such guidance is largely missing in the current literature in which the focus has been more on research problems and less on issues describing how to set up an IoT system and what software toolkits are required. This paper ...

  6. canvasDesigner: A versatile interactive high-resolution scientific multi-panel visualization toolkit.

    Science.gov (United States)

    Zhang, Baohong; Zhao, Shanrong; Neuhaus, Isaac

    2018-05-03

    We present a bioinformatics and systems biology visualization toolkit harmonizing real time interactive exploring and analyzing of big data, full-fledged customizing of look-n-feel, and producing multi-panel publication-ready figures in PDF format simultaneously. Source code and detailed user guides are available at http://canvasxpress.org, https://baohongz.github.io/canvasDesigner, and https://baohongz.github.io/canvasDesigner/demo_video.html. isaac.neuhaus@bms.com, baohong.zhang@pfizer.com, shanrong.zhao@pfizer.com. Supplementary materials are available at https://goo.gl/1uQygs.

  7. The MicroAnalysis Toolkit: X-ray Fluorescence Image Processing Software

    International Nuclear Information System (INIS)

    Webb, S. M.

    2011-01-01

    The MicroAnalysis Toolkit is an analysis suite designed for the processing of x-ray fluorescence microprobe data. The program contains a wide variety of analysis tools, including image maps, correlation plots, simple image math, image filtering, multiple energy image fitting, semi-quantitative elemental analysis, x-ray fluorescence spectrum analysis, principal component analysis, and tomographic reconstructions. To be as widely useful as possible, data formats from many synchrotron sources can be read by the program with more formats available by request. An overview of the most common features will be presented.

  8. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    Science.gov (United States)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  9. OpenDBDDAS Toolkit: Secure MapReduce and Hadoop-like Systems

    KAUST Repository

    Fabiano, Enrico

    2015-06-01

    The OpenDBDDAS Toolkit is a software framework to provide support for more easily creating and expanding dynamic big data-driven application systems (DBDDAS) that are common in environmental systems, many engineering applications, disaster management, traffic management, and manufacturing. In this paper, we describe key features needed to implement a secure MapReduce and Hadoop-like system for high performance clusters that guarantees a certain level of privacy of data from other concurrent users of the system. We also provide examples of a secure MapReduce prototype and compare it to another high performance MapReduce, MR-MPI.
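
    For readers unfamiliar with the MapReduce model that the toolkit secures, the following generic word-count sketch (plain Python, not the OpenDBDDAS or MR-MPI API) shows the map and reduce phases that such systems distribute across a cluster:

        # Generic MapReduce illustration: map emits (key, 1) pairs, reduce sums them per key.
        from collections import defaultdict

        def map_phase(documents):
            for doc in documents:
                for word in doc.split():
                    yield word.lower(), 1

        def reduce_phase(pairs):
            counts = defaultdict(int)
            for key, value in pairs:
                counts[key] += value
            return dict(counts)

        docs = ["dynamic big data", "big data driven systems"]
        print(reduce_phase(map_phase(docs)))   # e.g. {'dynamic': 1, 'big': 2, 'data': 2, ...}

    In a secure Hadoop-like deployment, the same map and reduce functions would be shipped to worker nodes, with access controls ensuring that concurrent users cannot read each other's intermediate data.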

  10. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    Science.gov (United States)

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding

  11. Microgrid Design Toolkit (MDT) User Guide Software v1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Microgrid Design Toolkit (MDT) supports decision analysis for new ("greenfield") microgrid designs as well as microgrids with existing infrastructure. The current version of MDT includes two main capabilities. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new, grid-connected microgrid in the early stages of the design process. MSC is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on designing a microgrid for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM).

  12. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets, for which suitable and accessible processing tools for large arrayed sample sets have been lacking. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...

  13. Fluorescent Bisphosphonate and Carboxyphosphonate Probes: A Versatile Imaging Toolkit for Applications in Bone Biology and Biomedicine.

    Science.gov (United States)

    Sun, Shuting; Błażewska, Katarzyna M; Kadina, Anastasia P; Kashemirov, Boris A; Duan, Xuchen; Triffitt, James T; Dunford, James E; Russell, R Graham G; Ebetino, Frank H; Roelofs, Anke J; Coxon, Fraser P; Lundy, Mark W; McKenna, Charles E

    2016-02-17

    A bone imaging toolkit of 21 fluorescent probes with variable spectroscopic properties, bone mineral binding affinities, and antiprenylation activities has been created, including a novel linking strategy. The linking chemistry allows attachment of a diverse selection of dyes fluorescent in the visible to near-infrared range to any of the three clinically important heterocyclic bisphosphonate bone drugs (risedronate, zoledronate, and minodronate or their analogues). The resultant suite of conjugates offers multiple options to "mix and match" parent drug structure, fluorescence emission wavelength, relative bone affinity, and presence or absence of antiprenylation activity, for bone-related imaging applications.

  14. Mission creep or responding to wider security needs? The evolving role of mine action organisations in Armed Violence Reduction

    Directory of Open Access Journals (Sweden)

    Sharmala Naidoo

    2013-04-01

    Since the late 1980s, mine action organisations have focused their efforts on reducing the social, economic and environmental impacts of anti-personnel mines and other explosive remnants of war (ERW) through a broad range of activities, including survey, clearance, mine risk education (MRE), victim assistance, stockpile destruction and advocacy. In recent years, an increasing number of mine action organisations are using their mine action technical expertise and their capacities to operate in difficult environments to reduce armed violence and promote public safety. Several organisations now have armed violence reduction (AVR)-related policies, programmes and staff in place. Some may argue that this shift towards AVR is a diversion from the core mandate of mine action organisations. But does this represent a loss of focus and thereby ‘mission creep’ on the part of these organisations? This practice note examines the factors underlying the evolving role of mine action organisations, discusses how these new programmes are contributing to the wider domain of AVR and explores whether these new programmes have resulted in a loss of organisational focus.

  15. A comparison of eight country plans for the Invasive Indo-Pacific Lionfish in the Wider Caribbean

    Directory of Open Access Journals (Sweden)

    Roxanne E. Graham

    2017-10-01

    The effects of climate change and marine invasive species have posed a major threat to significant ecological, aesthetic, economic and amenity value to the countries and territories of the Wider Caribbean Region. Today, the Caribbean Sea is plagued with the invasive lionfish (Pterois volitans and P. miles). As the range and abundance of the lionfish throughout the Caribbean has grown, recognition of the grave threat it poses to the native marine ecosystems has prompted the development of lionfish management plans across the region. The efforts of eight countries in the region to manage lionfish are evaluated using the US Environmental Protection Agency Aquatic Invasive Species framework and the inclusion of climate change and/or changing conditions. The countries and overseas territories evaluated were Anguilla, Bahamas, Cayman Islands, Grenada, St. Eustatius, St. Lucia, St. Vincent and the US Virgin Islands. Although specific strategies differed amongst the islands depending upon needs, culture, and individual circumstances, most of the plans included aspects of education and outreach, control and monitoring protocols, and research and information management. Areas that were found to be notably weak to nonexistent included leadership, prevention, early detection and rapid response, and restoration. This comparative analysis provides opportunities for knowledge sharing and intra- and inter-country cooperation, facilitating the transfer and development of interventions that contribute to the conservation of significant island biodiversity.

  16. Tailoring leisure to suit a wider audience through creative event planning with a multi-sensory approach.

    Science.gov (United States)

    Stonier, Claire L

    2008-01-01

    Caregiving for long-term conditions is increasingly focused on holistic "person centred" care [9,34], with leisure and recreation providing an important and essential part of maintaining quality of life. This article documents examples of large leisure events and creative projects. These were adapted for, and considered to be suitable and supportive of, the needs of adults with complex and profound disability as a result of neurological damage or disease. The ways in which events have been tailored by the Recreation and Leisure Service, incorporating sensory elements with the view to increased accessibility and enjoyment for participants, are highlighted in this article. The ultimate challenge faced was programming events to suit more than 170 people aged over 18 who each have particular preferences, varied interests and abilities including the most profound physical and cognitive impairments. These developments and changes in format have encouraged essential input from participants themselves and their families and carers, whilst involving the wider community; volunteers, external charitable groups and professional organisations.

  17. Casting wider nets for anxiety and depression: disability-driven cross-diagnostic subtypes in a large cohort.

    Science.gov (United States)

    Wanders, R B K; van Loo, H M; Vermunt, J K; Meijer, R R; Hartman, C A; Schoevers, R A; Wardenaar, K J; de Jonge, P

    2016-12-01

    In search of empirical classifications of depression and anxiety, most subtyping studies focus solely on symptoms and do so within a single disorder. This study aimed to identify and validate cross-diagnostic subtypes by simultaneously considering symptoms of depression and anxiety, and disability measures. A large cohort of adults (Lifelines, n = 73 403) had a full assessment of 16 symptoms of mood and anxiety disorders, and measurement of physical, social and occupational disability. The best-fitting subtyping model was identified by comparing different hybrid mixture models with and without disability covariates on fit criteria in an independent test sample. The best model's classes were compared across a range of external variables. The best-fitting Mixed Measurement Item Response Theory model with disability covariates identified five classes. Accounting for disability improved differentiation between people reporting isolated non-specific symptoms ['Somatic' (13.0%), and 'Worried' (14.0%)] and psychopathological symptoms ['Subclinical' (8.8%), and 'Clinical' (3.3%)]. Classes showed distinct associations with clinically relevant external variables [e.g. somatization: odds ratio (OR) 8.1-12.3, and chronic stress: OR 3.7-4.4]. The Subclinical class reported symptomatology at subthreshold levels while experiencing disability. No pure depression or anxiety, but only mixed classes were found. An empirical classification model, incorporating both symptoms and disability identified clearly distinct cross-diagnostic subtypes, indicating that diagnostic nets should be cast wider than current phenomenology-based categorical systems.

  18. Population aging from 1950 to 2010 in seventeen transitional countries in the wider region of South Eastern Europe

    Directory of Open Access Journals (Sweden)

    Mihajlo Jakovljevic

    2015-12-01

    Aim: Population aging has profoundly reshaped demographic landscapes in all South Eastern European (SEE) countries. The aim of this study was to provide a thorough comparative inter-country assessment of the speed of population aging in the entire SEE region for the period 1950-2010. Methods: Descriptive observational analysis of long-term trends in core primary and composite indicators of population aging across seventeen countries of the wider SEE region, with panel data sets at a national level. Results: During the past six decades, the entire SEE region has experienced a rapid increase in the median age (from 25.2 years in 1950 to 37.9 years in 2010), with a simultaneous fall in fertility rates of about two children per woman (from 3.55 children per childbearing woman in 1950 to 1.49 in 2010), coupled with a significant rise in the population of elderly citizens. The speed of population aging has vastly accelerated (with a 2.5-fold increase over the past three decades). The percentage of individuals over 65 years has doubled from 7% in 1950 to 14% in 2010. Conclusion: Complex national strategies are needed to cope with the shrinking labour force coupled with the growing proportion of the older population. In all likelihood, population aging will further accelerate in the near future. This profound long-term demographic transition will threaten the financial sustainability of current health systems in all SEE countries.

  19. A patient and public involvement (PPI) toolkit for meaningful and flexible involvement in clinical trials - a work in progress.

    Science.gov (United States)

    Bagley, Heather J; Short, Hannah; Harman, Nicola L; Hickey, Helen R; Gamble, Carrol L; Woolfall, Kerry; Young, Bridget; Williamson, Paula R

    2016-01-01

    Funders of research are increasingly requiring researchers to involve patients and the public in their research. Patient and public involvement (PPI) in research can potentially help researchers make sure that the design of their research is relevant, that it is participant friendly and ethically sound. Using and sharing PPI resources can benefit those involved in undertaking PPI, but existing PPI resources are not used consistently and this can lead to duplication of effort. This paper describes how we are developing a toolkit to support clinical trials teams in a clinical trials unit. The toolkit will provide a key 'off the shelf' resource to support trial teams with limited resources, in undertaking PPI. Key activities in further developing and maintaining the toolkit are to: ● listen to the views and experience of both research teams and patient and public contributors who use the tools; ● modify the tools based on our experience of using them; ● identify the need for future tools; ● update the toolkit based on any newly identified resources that come to light; ● raise awareness of the toolkit and ● work in collaboration with others to either develop or test out PPI resources in order to reduce duplication of work in PPI. Background Patient and public involvement (PPI) in research is increasingly a funder requirement due to the potential benefits in the design of relevant, participant friendly, ethically sound research. The use and sharing of resources can benefit PPI, but available resources are not consistently used leading to duplication of effort. This paper describes a developing toolkit to support clinical trials teams to undertake effective and meaningful PPI. Methods The first phase in developing the toolkit was to describe which PPI activities should be considered in the pathway of a clinical trial and at what stage these activities should take place. This pathway was informed through review of the type and timing of PPI activities within

  20. The PRIDE (Partnership to Improve Diabetes Education) Toolkit: Development and Evaluation of Novel Literacy and Culturally Sensitive Diabetes Education Materials.

    Science.gov (United States)

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L

    2016-02-01

    Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a "superior" score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. © 2015 The Author(s).
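
    One of the readability measures cited, the Automated Readability Index (ARI), is straightforward to compute; the sketch below uses the standard ARI formula with a deliberately naive tokenizer, so the grade levels it produces should be treated as approximate.

        # Approximate Automated Readability Index (ARI) with naive word/sentence splitting.
        import re

        def automated_readability_index(text):
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = text.split()
            characters = sum(len(w.strip(".,;:!?")) for w in words)
            # ARI = 4.71 * (characters/words) + 0.5 * (words/sentences) - 21.43
            return 4.71 * (characters / len(words)) + 0.5 * (len(words) / sentences) - 21.43

        sample = "Check your blood sugar before meals. Write the number in your log book."
        print(round(automated_readability_index(sample), 1))   # roughly grade 2-3 for this simple text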

  1. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit

  2. Organizational Enablers for Project Governance

    DEFF Research Database (Denmark)

    Müller, Ralf; Shao, Jingting; Pemsel, Sofia

    While corporate culture plays a significant role in the success of any corporation, governance and “governmentality” not only determine how business should be conducted, but also define the policies and procedures organizations follow to achieve business functions and goals. In their book, Organizational Enablers for Project Governance, Ralf Müller, Jingting Shao, and Sofia Pemsel examine the interaction of governance and governmentality in various types of companies and demonstrate how these factors drive business success and influence project work, efficiency, and profitability. The data ... and their relationships to organizational success. Based on these results, the authors discovered that organizational enablers (including key factors such as leadership, governance, and influence of project managers) have a critical impact on how organizations operate, adapt to market fluctuations and forces, and make...

  3. Basin Testing of Wave Energy Converters in Trondheim: Investigation of Mooring Loads and Implications for Wider Research

    Directory of Open Access Journals (Sweden)

    Vladimir Krivtsov

    2014-04-01

    This paper describes the physical model testing of an array of wave energy devices undertaken in the NTNU (Norwegian University of Science and Technology) Trondheim basin between 8 and 20 October 2008, funded under the EU Hydralabs III initiative, and provides an analysis of the extreme mooring loads. Tests were completed at 1/20 scale on a single oscillating water column device and on close-packed arrays of three and five devices following calibration of instrumentation and the wave and current test environment. One wave energy converter (WEC) was fully instrumented with mooring line load cells, an optical motion tracker and accelerometers and tested in regular waves, short- and long-crested irregular waves and current. The wave and current test regimes were measured by six wave probes and a current meter. Arrays of three and five similar WECs, with identical mooring systems, were tested under similar environmental loading with partial monitoring of mooring forces and motions. The majority of loads on the mooring lines appeared to be broadly consistent with both the logistic and normal distributions, whilst the right tail appeared to conform to the extreme value distribution. Comparison of the loads at different configurations of WEC arrays suggests that the results are broadly consistent with the hypothesis that the mooring loads should differ. In particular, the results from the tests in short-crested sea conditions give an indication that peak loads in a multi-WEC array may be considerably higher than in the 1-WEC configuration. The test campaign has contributed essential data to the development of Simulink™ and Orcaflex™ models of devices, which include mooring system interactions, and data have also been obtained for inter-tank comparisons, studies of scale effects and validation of mooring system numerical models. It is hoped that this paper will help to draw the attention of a wider scientific community to the dataset freely available from the
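
    The distribution comparison mentioned above (logistic and normal fits for the bulk of the loads, an extreme-value fit for the right tail) can be reproduced on any set of peak mooring loads with SciPy; the snippet below uses synthetic data purely for illustration and is not taken from the campaign's analysis scripts.

        # Fit candidate distributions to (synthetic) peak mooring loads and compare log-likelihoods.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        peak_loads = rng.gumbel(loc=50.0, scale=8.0, size=500)   # synthetic peak loads (kN)

        for name, dist in [("normal", stats.norm),
                           ("logistic", stats.logistic),
                           ("GEV", stats.genextreme)]:
            params = dist.fit(peak_loads)                        # maximum-likelihood fit
            loglik = np.sum(dist.logpdf(peak_loads, *params))
            print(f"{name:8s} log-likelihood = {loglik:.1f}")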

  4. Condom use as part of the wider HIV prevention strategy: experiences from communities in the North West Province, South Africa.

    Science.gov (United States)

    Versteeg, Marije; Murray, Montagu

    2008-07-01

    Correct and consistent condom usage remains a pivotal strategy in reaching the target set by the South African government to reduce new HIV infections by 50% in the next 5 years. Studies have found that there has been an increase in condom usage by some categories of the population, but usage has not yet reached the desired levels in order to meet the target. This article reports on the findings of a study on condom usage in eight communities in the North West Province, which was part of a wider HIV and AIDS programme evaluation commissioned by the North West Provincial Department of Health. The main aim was to assess accessibility to condoms, and knowledge, attitudes and practices around condom use by four sampled communities in the North West Province. Eight focus group discussions were held and 50 households were interviewed. The study found positive results regarding accessibility and awareness of condoms. However, this often did not lead to the desired behavioural change of using condoms in risky sexual interactions. The majority of respondents still resisted condom usage, used condoms inconsistently, or were not in a position to negotiate protected sexual intercourse. The main reasons reported for this were: reduced pleasure, perceived and real physical side-effects, myths, lack of information, status, financial reasons, distrust in the efficacy of condoms, family planning, cultural reasons, gender-related reasons and trust. Many of the barriers to consistent condom use cannot be overcome by strategies that target the individual. Interventions need to address underlying developmental factors such as the non-biological factors that increase the susceptibility of women to HIV infection. As this falls outside of the scope of the mandate of the Department of Health, various partnerships with other key role players need to be established and/or strengthened, such as with local government, non-governmental organisations and faith-based organisations.

  5. Managing workplace stress in community pharmacy organisations: lessons from a review of the wider stress management and prevention literature.

    Science.gov (United States)

    Jacobs, Sally; Johnson, Sheena; Hassell, Karen

    2018-02-01

    Workplace stress in community pharmacy is increasing internationally due, in part, to pharmacists' expanding roles and escalating workloads. Whilst the business case for preventing and managing workplace stress by employers is strong, there is little evidence for the effectiveness of organisational stress management interventions in community pharmacy settings. To identify and synthesise existing evidence for the effectiveness of organisational solutions to workplace stress from the wider organisational literature which may be adaptable to community pharmacies. A secondary synthesis of existing reviews. Publications were identified through keyword searches of electronic databases and the internet; inclusion and exclusion criteria were applied; data about setting, intervention, method of evaluation, effectiveness and conclusions (including factors for success) were extracted and synthesised. Eighteen reviews of the stress management and prevention literature were identified. A comprehensive list of organisational interventions to prevent or manage workplace stress, ordered by prevalence of evidence of effectiveness, was produced, together with an ordered list of the benefits both to the individual and employing organisation. An evidence-based model of best practice was derived specifying eight factors for success: top management support, context-specific interventions, combined organisational and individual interventions, a participative approach, clearly delineated tasks and responsibilities, buy-in from middle management, change agents as facilitators and change in organisational culture. This literature review provides community pharmacy organisations with evidence from which to develop effective and successful stress management strategies to support pharmacists and pharmacy staff. Well-designed trials of stress management interventions in community pharmacy organisations are still required. © 2017 Royal Pharmaceutical Society.

  6. eVITAL: A Preliminary Taxonomy and Electronic Toolkit of Health-Related Habits and Lifestyle

    Directory of Open Access Journals (Sweden)

    Luis Salvador-Carulla

    2012-01-01

    Objectives. To create a preliminary taxonomy and related toolkit of health-related habits (HrH) following a person-centered approach with a focus on primary care. Methods. From 2003–2009, a working group (n=6 physicians) defined the knowledge base, created a framing document, and selected evaluation tools using an iterative process. Multidisciplinary focus groups (n=29 health professionals) revised the document and evaluation protocol and participated in a feasibility study and review of the model based on a demonstration study with 11 adult volunteers in Antequera, Spain. Results. The preliminary taxonomy contains 6 domains of HrH and 1 domain of additional health descriptors, 3 subdomains, 43 dimensions, and 141 subdimensions. The evaluation tool was completed by the 11 volunteers. The eVITAL toolkit contains history and examination items for 4 levels of engagement: self-assessment, basic primary care, extended primary care, and specialty care. There was positive feedback from the volunteers and experts, but concern about the length of the evaluation. Conclusions. We present the first taxonomy of HrH, which may aid the development of the new models of care such as the personal contextual factors of the International Classification of Functioning (ICF) and the positive and negative components of the multilevel person-centered integrative diagnosis model.

  7. REST: a toolkit for resting-state functional magnetic resonance imaging data processing.

    Directory of Open Access Journals (Sweden)

    Xiao-Wei Song

    Resting-state fMRI (RS-fMRI) has been drawing more and more attention in recent years. However, a publicly available, systematically integrated and easy-to-use tool for RS-fMRI data processing is still lacking. We developed a toolkit for the analysis of RS-fMRI data, namely the RESting-state fMRI data analysis Toolkit (REST). REST was developed in MATLAB with a graphical user interface (GUI). After data preprocessing with SPM or AFNI, a few analytic methods can be performed in REST, including functional connectivity analysis based on linear correlation, regional homogeneity, amplitude of low frequency fluctuation (ALFF), and fractional ALFF. A few additional functions were implemented in REST, including a DICOM sorter, linear trend removal, bandpass filtering, time course extraction, regression of covariates, image calculator, statistical analysis, and slice viewer (for result visualization, multiple comparison correction, etc.). REST is an open-source package and is freely available at http://www.restfmri.net.
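
    As a language-agnostic illustration of the seed-based functional connectivity analysis listed above (REST itself is a MATLAB GUI toolkit; the sketch below is not its code), a connectivity map reduces to Pearson correlations between a seed time course and every other voxel's time course.

        # Seed-based functional connectivity on a toy (time x voxel) data matrix.
        import numpy as np

        rng = np.random.default_rng(0)
        n_timepoints, n_voxels = 200, 50
        data = rng.standard_normal((n_timepoints, n_voxels))
        data[:, 1] += 0.8 * data[:, 0]          # make voxel 1 correlated with the seed (voxel 0)

        seed = data[:, 0]
        # Pearson correlation of the seed with each voxel time course
        fc_map = np.array([np.corrcoef(seed, data[:, v])[0, 1] for v in range(n_voxels)])
        # Indices of the most correlated voxels (the seed itself will appear with r = 1)
        print("strongest connections:", np.argsort(fc_map)[-3:])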

  8. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. This paper presents the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different-sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. Together, these results establish OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.
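
    The core arrayed-spot analysis step can be pictured with a small NumPy sketch; this is illustrative only and does not use OMAAT's or OpenMSI's actual interfaces. Given a single-ion intensity image and approximate grid centres for the arrayed spots, each sample is summarized by the mean intensity in a small window around its centre.

        # Illustrative arrayed-spot quantification on a synthetic single-ion image.
        import numpy as np

        rng = np.random.default_rng(42)
        ion_image = rng.random((120, 160))                 # synthetic MSI intensity image

        # Hypothetical 4 x 6 spot grid with a 20-pixel pitch, upper-left spot at (20, 20)
        centres = [(20 + 20 * r, 20 + 20 * c) for r in range(4) for c in range(6)]

        def spot_mean(image, centre, half_width=3):
            # Mean intensity in a (2*half_width+1)^2 window around the spot centre.
            r, c = centre
            window = image[r - half_width:r + half_width + 1,
                           c - half_width:c + half_width + 1]
            return float(window.mean())

        intensities = [spot_mean(ion_image, ctr) for ctr in centres]
        print(f"{len(intensities)} spots, mean intensity {np.mean(intensities):.3f}")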

  9. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  10. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    Science.gov (United States)

    Wagh, Aditi; Wilensky, Uri

    2018-04-01

    Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: Environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To make this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit with exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities including the time students took to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.

  11. A Toolkit For Storage Qos Provisioning For Data-Intensive Applications

    Directory of Open Access Journals (Sweden)

    Renata Słota

    2012-01-01

    This paper describes a programming toolkit developed in the PL-Grid project, named QStorMan, which supports storage QoS provisioning for data-intensive applications in distributed environments. QStorMan exploits knowledge-oriented methods for matching storage resources to non-functional requirements, which are defined for a data-intensive application. In order to support various usage scenarios, QStorMan provides two interfaces: programming libraries and a web portal. The interfaces allow the requirements to be defined either directly in an application source code or by using an intuitive graphical interface. The first way provides finer granularity, e.g., each portion of data processed by an application can define a different set of requirements. The second method is aimed at supporting legacy applications, whose source code cannot be modified. The toolkit has been evaluated using synthetic benchmarks and the production infrastructure of PL-Grid, in particular its storage infrastructure, which utilizes the Lustre file system.

  12. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    International Nuclear Information System (INIS)

    Zhou, Y.; Zhu, X.; Wang, Y.; Mitra, S.

    2012-01-01

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnances (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed, using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D-image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using the modeling, this article demonstrates the novelty of the tagged-neutron approach for extracting useful signals with high signal-to-background discrimination of an object-of-interest from that of its environment. Simulations indicated that an UXO filled with the RDX explosive, hexogen (C3H6O6N6), can be identified to a depth of 20 cm when buried in soil. (author)
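
    The depth information in the tagged-neutron technique comes from simple time-of-flight kinematics. The hedged sketch below assumes an idealized normal-incidence geometry with no detector offsets, and converts a measured neutron-plus-gamma flight time into an interaction depth for a 14 MeV neutron.

        # Idealized tagged-neutron time-of-flight depth estimate.
        import math

        C_CM_PER_NS = 29.9792458       # speed of light
        NEUTRON_MASS_MEV = 939.565     # neutron rest mass

        def neutron_speed(kinetic_energy_mev):
            # Relativistic speed for a given kinetic energy.
            gamma = 1.0 + kinetic_energy_mev / NEUTRON_MASS_MEV
            beta = math.sqrt(1.0 - 1.0 / gamma**2)
            return beta * C_CM_PER_NS

        def interaction_depth(total_time_ns, kinetic_energy_mev=14.0):
            # total_time_ns: time between the tagged alpha and the detected gamma,
            # assuming the neutron travels depth d and the gamma returns over the same d.
            v_n = neutron_speed(kinetic_energy_mev)
            return total_time_ns / (1.0 / v_n + 1.0 / C_CM_PER_NS)

        print(f"14 MeV neutron speed ~ {neutron_speed(14.0):.2f} cm/ns")
        print(f"depth for a 5 ns flight time ~ {interaction_depth(5.0):.1f} cm")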

  13. Evaluating the parent-adolescent communication toolkit: Usability and preliminary content effectiveness of an online intervention.

    Science.gov (United States)

    Toombs, Elaine; Unruh, Anita; McGrath, Patrick

    2018-01-01

    This study aimed to assess the Parent-Adolescent Communication Toolkit, an online intervention designed to help improve parent communication with their adolescents. Participant preferences for two module delivery systems (sequential and unrestricted module access) were identified. Usability assessment of the PACT intervention was completed using pretest and posttest comparisons. Usability data, including participant completion and satisfaction ratings, were examined. Parents (N=18) of adolescents were randomized to a sequential or unrestricted chapter access group. Parent participants completed pretest measures, the PACT intervention and posttest measures. Participants provided feedback for the intervention to improve modules and provided usability ratings. Adolescent pre- and posttest ratings were evaluated. Usability ratings were high and parent feedback was positive. The sequential module access group rated the intervention content higher and completed more content than the unrestricted chapter access group, indicating support for the sequential access design. Parent mean posttest communication scores were significantly higher than pretest scores. The Parent-Adolescent Communication Toolkit has potential to improve parent-adolescent communication but further effectiveness assessment is required.

  14. Clinical Trial of a Home Safety Toolkit for Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Kathy J. Horvath

    2013-01-01

    This randomized clinical trial tested a new self-directed educational intervention to improve caregiver competence to create a safer home environment for persons with dementia living in the community. The sample included 108 patient/caregiver dyads: the intervention group (n=60) received the Home Safety Toolkit (HST), including a new booklet based on health literacy principles and sample safety items to enhance self-efficacy to make home safety modifications. The control group (n=48) received customary care. Participants completed measures at baseline and at twelve-week follow-up. Multivariate Analysis of Covariance (MANCOVA) was used to test for significant group differences. All caregiver outcome variables improved in the intervention group more than in the control. Home safety was significant at P≤0.001, caregiver strain at P≤0.001, and caregiver self-efficacy at P=0.002. Similarly, the care receiver outcome of risky behaviors and accidents was lower in the intervention group (P≤0.001). The self-directed use of this Home Safety Toolkit activated the primary family caregiver to make the home safer for the person with dementia of Alzheimer's type (DAT) or a related disorder. Improving the competence of informal caregivers is especially important for patients with DAT in light of all stakeholders' reliance on their unpaid care.

  15. GRHydro: a new open-source general-relativistic magnetohydrodynamics code for the Einstein toolkit

    International Nuclear Information System (INIS)

    Mösta, Philipp; Haas, Roland; Ott, Christian D; Reisswig, Christian; Mundim, Bruno C; Faber, Joshua A; Noble, Scott C; Bode, Tanja; Löffler, Frank; Schnetter, Erik

    2014-01-01

    We present the new general-relativistic magnetohydrodynamics (GRMHD) capabilities of the Einstein toolkit, an open-source community-driven numerical relativity and computational relativistic astrophysics code. The GRMHD extension of the toolkit builds upon previous releases and implements the evolution of relativistic magnetized fluids in the ideal MHD limit in fully dynamical spacetimes using the same shock-capturing techniques previously applied to hydrodynamical evolution. In order to maintain the divergence-free character of the magnetic field, the code implements both constrained transport and hyperbolic divergence cleaning schemes. We present test results for a number of MHD tests in Minkowski and curved spacetimes. Minkowski tests include aligned and oblique planar shocks, cylindrical explosions, magnetic rotors, Alfvén waves and advected loops, as well as a set of tests designed to study the response of the divergence cleaning scheme to numerically generated monopoles. We study the code’s performance in curved spacetimes with spherical accretion onto a black hole on a fixed background spacetime and in fully dynamical spacetimes by evolutions of a magnetized polytropic neutron star and of the collapse of a magnetized stellar core. Our results agree well with exact solutions where these are available and we demonstrate convergence. All code and input files used to generate the results are available on http://einsteintoolkit.org. This makes our work fully reproducible and provides new users with an introduction to applications of the code. (paper)
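
    A feel for the divergence-free constraint that the constrained-transport and divergence-cleaning schemes maintain can be had from a small finite-difference diagnostic; the Python below is illustrative only and is unrelated to the Einstein Toolkit's own infrastructure. A field constructed as a curl is analytically divergence-free, so its discrete divergence should sit near round-off level.

        # Discrete divergence diagnostic for a 2D magnetic field built as a curl (so div B = 0 analytically).
        import numpy as np

        n = 128
        x = np.linspace(-1.0, 1.0, n)
        dx = x[1] - x[0]
        X, Y = np.meshgrid(x, x, indexing="ij")

        # Vector potential A_z; B = curl A gives Bx = dA/dy, By = -dA/dx
        A = np.exp(-(X**2 + Y**2) / 0.1)
        Bx = np.gradient(A, dx, axis=1)
        By = -np.gradient(A, dx, axis=0)

        div_B = np.gradient(Bx, dx, axis=0) + np.gradient(By, dx, axis=1)
        print(f"max |div B| = {np.abs(div_B).max():.2e}  (near round-off for this construction)")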

  16. Adding Impacts and Mitigation Measures to OpenEI's RAPID Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, Erin

    2017-05-01

    The Open Energy Information platform hosts the Regulatory and Permitting Information Desktop (RAPID) Toolkit to provide renewable energy permitting information on federal and state regulatory processes. One of the RAPID Toolkit's functions is to help streamline the geothermal permitting processes outlined in the National Environmental Policy Act (NEPA). This is particularly important in the geothermal energy sector since each development phase requires separate land analysis to acquire exploration, well field drilling, and power plant construction permits. Using the Environmental Assessment documents included in RAPID's NEPA Database, the RAPID team identified 37 resource categories that a geothermal project may impact. Examples include impacts to geology and minerals, nearby endangered species, or water quality standards. To provide federal regulators, project developers, consultants, and the public with typical impacts and mitigation measures for geothermal projects, the RAPID team has provided overview webpages for each of these 37 resource categories with a sidebar query to reference related NEPA documents in the NEPA Database. This project is an expansion of a previous project that analyzed the time to complete NEPA environmental review for various geothermal activities. The NEPA review not only focused on geothermal projects within Bureau of Land Management and U.S. Forest Service managed lands, but also on projects funded by the Department of Energy. Timeline barriers found were: extensive public comments and involvement; content overlap in NEPA documents; and discovery of impacted resources such as endangered species or cultural sites.

  17. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A

    2011-07-01

    In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where the sequencing/timing of events has been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how the power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed-particle hydrodynamics code: NEUTRINO.
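
    The probabilistic side of the approach, with RAVEN driving many mechanistic runs under perturbed event timings, can be caricatured with a plain Monte Carlo sketch. Everything below is hypothetical and stands in for the coupled codes; it is not the RAVEN or RELAP-7 interface.

        # Toy Monte Carlo over event timings for a station-blackout recovery question:
        # what fraction of sampled scenarios restores AC power before core damage?
        import random

        random.seed(7)

        def time_to_core_damage(power_uprate_fraction):
            # Placeholder for a thermal-hydraulic result: higher power means less time margin.
            return 5.0 / (1.0 + power_uprate_fraction)     # hours, illustrative only

        def sample_recovery_time():
            # Placeholder distribution for offsite/diesel power recovery time (hours).
            return random.lognormvariate(1.0, 0.5)

        def failure_probability(power_uprate_fraction, n_samples=100_000):
            limit = time_to_core_damage(power_uprate_fraction)
            failures = sum(sample_recovery_time() > limit for _ in range(n_samples))
            return failures / n_samples

        for uprate in (0.0, 0.1, 0.2):
            print(f"uprate {uprate:.0%}: P(recovery too late) ~ {failure_probability(uprate):.3f}")

    In the actual workflow each sampled scenario would launch a full RELAP-7 run rather than an analytic placeholder, and the statistical post-processing of the resulting margins would be done by RAVEN.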

  18. 'Ethos' Enabling Organisational Knowledge Creation

    Science.gov (United States)

    Matsudaira, Yoshito

    This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds a first-, second-, and third-person viewpoint to the theory of knowledge creation. The embodied knowledge observed in the actions of organisational members who enable knowledge creation is the continued practice of 'ethos' (in Greek) founded in the Nissan Production Way as an ethical basis. Ethos is an intangible knowledge asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, the team and the organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research shows the indispensability of ethos, a new concept of knowledge assets that enables knowledge creation, for future knowledge-based management in the knowledge society.

  19. Employer Toolkit.

    Science.gov (United States)

    Thuli, Kelli J.; Hong, Esther

    This document consists of two guides intended for either employers or service providers involved in school to work partnerships for students with disabilities. "Tools for Service Providers" is intended to be used for training local-level providers who are developing school to work linkages with employers. Following an introduction, this…

  20. Association Between a Wider Availability of Health Information and Health Care Utilization in Vietnam: Cross-Sectional Study.

    Science.gov (United States)

    Nguyen, Hoang Thuy Linh; Nakamura, Keiko; Seino, Kaoruko; Vo, Van Thang

    2017-12-18

    The rapid and widespread development of mass media sources including the Internet is occurring worldwide. Users are being confronted with a flood of health information through a wide availability of sources. Studies on how the availability of health information has triggered users' interest in utilizing health care services remain limited within the Vietnamese population. This study examined the associations between the wider availability of sources for health information and health care utilization in Vietnam after adjusting for potential confounding variables. The data for this study were drawn from a cross-sectional study conducted over a 6-month period in Hue, a city in central Vietnam. The participants were 993 randomly selected adults aged between 18 and 60 years. Information was collected through face-to-face interviews on the types of information sources that were consulted, including traditional media (television), the Internet, and health education courses, as well as the impact of such information on health care use (emergency department visits, hospitalizations, doctor visits). Multivariable logistic regression analyses were performed at a 95% confidence level. The prevalence rates of watching television, using the Internet, and attending health education courses to obtain health information were 50.9% (505/993), 32.9% (327/993), and 8.7% (86/993), respectively. After further adjustments for self-reported health status, the presence of health insurance, and monthly income, respondents who watched television and used the Internet to obtain health information were 1.7 times more likely to visit a doctor (television: adjusted odds ratio [AOR] 1.69, 95% CI 1.30-2.19; Internet: AOR 1.64, 95% CI 1.23-2.19), and these information-seeking behaviors were also significantly associated with inpatient hospitalization (P=.003). The use of widely available mass media sources (eg, television and the Internet) to obtain health information was associated with higher health care utilization. How this interest in health
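    For readers less familiar with how adjusted odds ratios of this kind are produced, the sketch below fits a generic multivariable logistic regression in Python with statsmodels and exponentiates the coefficients. The data frame and the column names (doctor_visit, tv, internet, insurance, income, self_rated_health) are hypothetical placeholders, not the study's actual variables or data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical survey data: binary outcome (doctor visit) and predictors
      rng = np.random.default_rng(0)
      n = 993
      df = pd.DataFrame({
          "doctor_visit":      rng.integers(0, 2, n),
          "tv":                rng.integers(0, 2, n),   # watches TV for health info
          "internet":          rng.integers(0, 2, n),   # uses Internet for health info
          "insurance":         rng.integers(0, 2, n),
          "income":            rng.normal(5.0, 1.5, n), # monthly income (arbitrary units)
          "self_rated_health": rng.integers(1, 6, n),
      })

      model = smf.logit(
          "doctor_visit ~ tv + internet + insurance + income + self_rated_health",
          data=df,
      ).fit(disp=0)

      # Adjusted odds ratios with 95% confidence intervals
      aor = pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1)
      aor.columns = ["AOR", "2.5%", "97.5%"]
      print(aor)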

  1. Marine spatial planning (MSP): A first step to ecosystem-based management (EBM) in the Wider Caribbean

    Directory of Open Access Journals (Sweden)

    John C Ogden

    2010-10-01

    Full Text Available The rapid decline of coastal ecosystems of the Wider Caribbean is entering its fifth decade. Some of the best science documenting this decline and its causes has been done by the laboratories of the Association of Marine Laboratories of the Caribbean (AMLC). Alarmed at the trends, Caribbean conservation pioneers established marine protected areas (MPAs) which spread throughout the region. Unfortunately, many have little or no protection and are now known to be too small to be effective in sustaining coastal ecosystems. Marine spatial planning (MSP) holds much promise to encompass the large geographic scales of the ecological processes and human impacts that influence coastal ecosystems and adjacent lands. The AMLC, through the scientific expertise and the national political connections of its member institutions, is well-positioned to help implement a pilot project. MSP is a first step in ecosystem-based management and has had considerable success elsewhere. It holds our best chance of sustaining human use and conserving the coral reefs and associated ecosystems. Rev. Biol. Trop. 58 (Suppl. 3): 71-79. Epub 2010 October 01. The rapid decline of the coastal ecosystems of the Caribbean Sea is entering its fifth decade. Some of the best scientific contributions documenting this decline and its causes have been made by the laboratories of the Association of Marine Laboratories of the Caribbean (AMLC). Alarmed by the trends, Caribbean conservation pioneers established marine protected areas (MPAs) that spread throughout the region. Unfortunately, many of these areas have little or no protection and are now known to be too small to be effective in maintaining coastal ecosystems. Marine spatial planning (MSP) is promising for encompassing the large geographic scales of the ecological processes and human impacts that influence coastal ecosystems and the

  2. Smart Grid enabled heat pumps

    DEFF Research Database (Denmark)

    Carmo, Carolina; Detlefsen, Nina; Nielsen, Mads Pagh

    2014-01-01

    The transition towards a 100 % fossil-free energy system, while achieving extreme penetration levels of intermittent wind and solar power in electricity generation, requires demand-side technologies that are smart (intermittency-friendly) and efficient. The integration of Smart Grid enabling...... with an empirical study in order to achieve a number of recommendations with respect to technology concepts and control strategies that would allow residential vapor-compression heat pumps to support large-scale integration of intermittent renewables. The analysis is based on data gathered over a period of up to 3...

  3. Effects of toe-in and toe-in with wider step width on level walking knee biomechanics in varus, valgus, and neutral knee alignments.

    Science.gov (United States)

    Bennett, Hunter J; Shen, Guangping; Cates, Harold E; Zhang, Songning

    2017-12-01

    Increased peak external knee adduction moments exist for individuals with knee osteoarthritis and varus knee alignments, compared to healthy and neutrally aligned counterparts. Walking with increased toe-in or increased step width has been individually utilized to successfully reduce the 1st and 2nd peak knee adduction moments, respectively, but the two have not previously been combined or tested among all alignment groups. The purpose of this study was to compare toe-in only and toe-in with wider step width gait modifications in individuals with neutral, valgus, and varus alignments. Thirty-eight healthy participants with confirmed varus, neutral, or valgus frontal-plane knee alignment through anteroposterior radiographs performed level walking in normal, toe-in, and toe-in with wider step width gaits. A 3×3 (group×intervention) mixed model repeated measures ANOVA compared alignment groups and gait interventions (p<0.05). The 1st peak knee adduction moment was reduced in toe-in and in toe-in with wider step width compared to normal gait. The 2nd peak adduction moment was increased in toe-in compared to normal and toe-in with wider step width. The adduction impulse was also reduced in toe-in and toe-in with wider step width compared to normal gait. Peak knee flexion and external rotation moments were increased in toe-in and toe-in with wider step width compared to normal gait. Although the toe-in with wider step width gait seems to be a viable option to reduce peak adduction moments for varus alignments, sagittal and transverse knee loadings should be monitored when implementing this gait modification strategy. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Enabling a Mobile Workforce: How to Implement Effective Teleworking at U.S. Department of Energy National Laboratories - A Guidebook and Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Myers, Lissa [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hall, Cheri [National Energy Technology Lab. (NETL), Albany, OR (United States); Rambo, Christian [Dept. of Energy (DOE), Washington DC (United States). Sustainability Performance Office; Sikes, Karen [CSRA Inc., Knoxville, TN (United States); Rukavina, Frank [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ischay, Christopher [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoddard Conrad, Emily [Dept. of Energy (DOE), Washington DC (United States). Sustainability Performance Office; Bender, Sadie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moran, Mike [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Williams, Jeffrey [Brookhaven National Lab. (BNL), Upton, NY (United States); Nichols, Teresa A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ahl, Amanda G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-06-01

    Teleworking, also known as telecommuting, has grown in popularity in today’s workforce, evolving from an employment perk to a business imperative. Facilitated by improved mobile connectivity and ease of remote access, employees and organizations are increasingly embracing teleworking.

  5. Preparing for the Flu (Including 2009 H1N1 Flu): A Communication Toolkit for Schools (Grades K-12)

    Science.gov (United States)

    Centers for Disease Control and Prevention, 2010

    2010-01-01

    The purpose of "Preparing for the Flu: A Communication Toolkit for Schools" is to provide basic information and communication resources to help school administrators implement recommendations from CDC's (Centers for Disease Control and Prevention) Guidance for State and Local Public Health Officials and School Administrators for School (K-12)…

  6. Methodology for the development of a taxonomy and toolkit to evaluate health-related habits and lifestyle (eVITAL)

    Directory of Open Access Journals (Sweden)

    Walsh Carolyn O

    2010-03-01

    Full Text Available Abstract Background Chronic diseases cause an ever-increasing percentage of morbidity and mortality, but many have modifiable risk factors. Many behaviors that predispose or protect an individual to chronic disease are interrelated, and therefore are best approached using an integrated model of health and the longevity paradigm, using years lived without disability as the endpoint. Findings This study used a 4-phase mixed qualitative design to create a taxonomy and related online toolkit for the evaluation of health-related habits. Core members of a working group conducted a literature review and created a framing document that defined relevant constructs. This document was revised, first by a working group and then by a series of multidisciplinary expert groups. The working group and expert panels also designed a systematic evaluation of health behaviors and risks, which was computerized and evaluated for feasibility. A demonstration study of the toolkit was performed in 11 healthy volunteers. Discussion In this protocol, we used forms of the community intelligence approach, including frame analysis, feasibility, and demonstration, to develop a clinical taxonomy and an online toolkit with standardized procedures for screening and evaluation of multiple domains of health, with a focus on longevity and the goal of integrating the toolkit into routine clinical practice. Trial Registration IMSERSO registry 200700012672

  7. TENCompetence Learning Design Toolkit, Runtime component, ccsi_v3_2_10c_v1_4

    NARCIS (Netherlands)

    Sharples, Paul; Popat, Kris; Llobet, Lau; Santos, Patricia; Hernández-Leo, Davinia; Miao, Yongwu; Griffiths, David; Beauvoir, Phillip

    2010-01-01

    Sharples, P., Popat, K., Llobet, L., Santos, P., Hernandez-Leo, D., Miao, Y., Griffiths, D. & Beauvoir, P. (2009) TENCompetence Learning Design Toolkit, Runtime component, ccsi_v3_2_10c_v1_4 This release is composed of three files corresponding to CopperCore Service Integration (CCSI) v3.2-10cv1.4,

  8. Cyber security awareness toolkit for national security: An approach to South Africa’s cybersecurity policy implementation

    CSIR Research Space (South Africa)

    Phahlamohlaka, LJ

    2011-05-01

    Full Text Available The aim of this paper is to propose an approach that South Africa could follow in implementing its proposed Cyber security policy. The paper proposes a Cyber Security Awareness Toolkit that is underpinned by key National Security imperatives as well...

  9. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    Science.gov (United States)

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community with the purpose being to provide targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment were having cost effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. Copyright © 2016. Published by Elsevier Ltd.

  10. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    Science.gov (United States)

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied to brain, head & neck and prostate cancer patients who received primary and boost phases, in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
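    As background to the multi-phase BED calculation, the sketch below applies the standard linear-quadratic textbook formula voxel by voxel and sums phases; it is a generic illustration, not necessarily the toolkit's own "true" or "approximate" implementations. The dose grids, fraction numbers, and alpha/beta value are invented placeholders.

      import numpy as np

      def phase_bed(total_dose, n_fractions, alpha_beta):
          """Voxel-wise BED for one phase under the linear-quadratic model:
          BED = D * (1 + d / (alpha/beta)), with d = D / n_fractions."""
          D = np.asarray(total_dose, dtype=float)
          d = D / n_fractions  # dose per fraction
          return D * (1.0 + d / alpha_beta)

      # Hypothetical 2-phase plan on a tiny 1x2x2 dose grid (Gy)
      primary = np.array([[[60.0, 58.0], [55.0, 30.0]]])
      boost   = np.array([[[18.0, 17.5], [16.0,  5.0]]])

      total_bed = phase_bed(primary, n_fractions=30, alpha_beta=3.0) \
                + phase_bed(boost,   n_fractions=9,  alpha_beta=3.0)
      print(total_bed)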

  11. Improved National Response to Climate Change: Aligning USGCRP reports and the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Lipschultz, F.; Dahlman, L. E.; Herring, D.; Fox, J. F.

    2017-12-01

    As part of an effort to coordinate production and distribution of scientific climate information across the U.S. Government, and to spur adaptation actions across the nation, the U.S. Global Change Research Program (USGCRP) has worked to better integrate the U.S. Climate Resilience Toolkit (CRT) and its Climate Explorer (CE) tool into USGCRP activities and products. Much of the initial CRT content was based on the Third National Climate Assessment (NCA3). The opportunity to integrate current development of NCA4—scheduled for release in late 2018—with CRT and CE can enhance all three projects and result in a useable and "living" NCA that is part of USGCRP's approach to sustained climate assessment. To coordinate this work, a USGCRP-led science team worked with CRT staff and CE developers to update the set of climate projections displayed in the CE tool. In concert with the USGCRP scenarios effort, the combined team selected the Localized Constructed Analogs (LOCA) dataset for the updated version of CE, based on its capabilities for capturing climate extremes and local climate variations. The team identified 28 variables from the LOCA dataset for display in the CE; many of these variables will also be used in USGCRP reports. In CRT engagements, communities with vulnerable assets have expressed a high value for the ability to integrate climate data available through the CE with data related to non-climate stressors in their locations. Moving forward, the teams intend to serve climate information needs at additional spatial scales by making NCA4 content available via CE's capability for dynamic interaction with climate-relevant datasets. This will permit users to customize the extent of data they access for decision-making, starting with the static NCA4 report. Additionally, NCA4 case studies and other content can be linked to more in-depth content within the CRT site. This capability will enable more frequent content updates than can be managed with quadrennial

  12. An internet-based bioinformatics toolkit for plant biosecurity diagnosis and surveillance of viruses and viroids.

    Science.gov (United States)

    Barrero, Roberto A; Napier, Kathryn R; Cunnington, James; Liefting, Lia; Keenan, Sandi; Frampton, Rebekah A; Szabo, Tamas; Bulman, Simon; Hunter, Adam; Ward, Lisa; Whattam, Mark; Bellgard, Matthew I

    2017-01-11

    Detecting and preventing the entry of exotic viruses and viroids at the border is critical for protecting plant industry trade worldwide. Existing post entry quarantine screening protocols rely on time-consuming biological indicators and/or molecular assays that require knowledge of infecting viral pathogens. Plants have developed the ability to recognise and respond to viral infections through Dicer-like enzymes that cleave viral sequences into specific small RNA products. Many studies reported the use of a broad range of small RNAs encompassing the product sizes of several Dicer enzymes involved in distinct biological pathways. Here we optimise the assembly of viral sequences by using specific small RNA subsets. We sequenced the small RNA fractions of 21 plants held at quarantine glasshouse facilities in Australia and New Zealand. Benchmarking of several de novo assembly tools showed that SPAdes with a k-mer of 19 produced the best assembly outcomes. We also found that de novo assembly using 21-25 nt small RNAs can result in chimeric assemblies of viral sequences and plant host sequences. Such non-specific assemblies can be resolved by using 21-22 nt or 24 nt small RNA subsets. Among the 21 selected samples, we identified contigs with sequence similarity to 18 viruses and 3 viroids in 13 samples. Most of the viruses were assembled using only 21-22 nt long virus-derived siRNAs (viRNAs), except for one Citrus endogenous pararetrovirus that was more efficiently assembled using 24 nt long viRNAs. All three viroids found in this study were fully assembled using either 21-22 nt or 24 nt viRNAs. Optimised analysis workflows were customised within the Yabi web-based analytical environment. We present a fully automated viral surveillance and diagnosis web-based bioinformatics toolkit that provides a flexible, user-friendly, robust and scalable interface for the discovery and diagnosis of viral pathogens. We have implemented an automated viral surveillance and
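    A size-selection step of the kind described (keeping only the 21-22 nt or 24 nt read subsets before de novo assembly) can be sketched in Python with Biopython. The file names are placeholders and the subsequent SPAdes call is indicated only as a comment; this is not the toolkit's own pipeline code.

      from Bio import SeqIO  # Biopython

      def filter_by_length(in_fastq, out_fasta, lengths=(21, 22)):
          """Keep only small RNA reads whose length is in `lengths` (e.g. viRNA subsets)."""
          kept = (rec for rec in SeqIO.parse(in_fastq, "fastq") if len(rec.seq) in lengths)
          return SeqIO.write(kept, out_fasta, "fasta")

      # Placeholder file names; real reads would come from the quarantine samples
      n21_22 = filter_by_length("sample_smallRNA.fastq", "sample_21_22nt.fasta", (21, 22))
      n24    = filter_by_length("sample_smallRNA.fastq", "sample_24nt.fasta", (24,))
      print(f"wrote {n21_22} reads (21-22 nt) and {n24} reads (24 nt)")

      # Each size-selected FASTA file would then be assembled separately, e.g.:
      #   spades.py -s sample_21_22nt.fasta -k 19 -o assembly_21_22nt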

  13. An interactive toolkit to extract phenological time series data from digital repeat photography

    Science.gov (United States)

    Seyednasrollah, B.; Milliman, T. E.; Hufkens, K.; Kosmala, M.; Richardson, A. D.

    2017-12-01

    Near-surface remote sensing and in situ photography are powerful tools to study how climate change and climate variability influence vegetation phenology and the associated seasonal rhythms of green-up and senescence. The rapidly-growing PhenoCam network has been using in situ digital repeat photography to study phenology in almost 500 locations around the world, with an emphasis on North America. However, extracting time series data from multiple years of half-hourly imagery - while each set of images may contain several regions of interest (ROIs), corresponding to different species or vegetation types - is not always straightforward. Large volumes of data require substantial processing time, and changes (either intentional or accidental) in camera field of view require adjustment of ROI masks. Here, we introduce and present "DrawROI" as an interactive web-based application for imagery from PhenoCam. DrawROI can also be used offline, as a fully independent toolkit that significantly facilitates extraction of phenological data from any stack of digital repeat photography images. DrawROI provides a responsive environment for phenological scientists to interactively a) delineate ROIs, b) handle field of view (FOV) shifts, and c) extract and export time series data characterizing image color (i.e. red, green and blue channel digital numbers for the defined ROI). The application utilizes artificial intelligence and advanced machine learning techniques and gives the user the opportunity to redraw new ROIs every time an FOV shift occurs. DrawROI also offers a quality control flag to indicate noisy data and images with low quality due to the presence of fog or snow. The web-based application significantly accelerates the process of creating new ROIs and modifying pre-existing ROIs in the PhenoCam database. The offline toolkit is presented as an open source R-package that can be used with similar datasets with time-lapse photography to obtain more data for
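    The core quantity such tools extract can be illustrated with a short Python sketch: average the red, green, and blue digital numbers inside a binary ROI mask for each image and derive the green chromatic coordinate, GCC = G / (R + G + B). The directory, file pattern, image size, and mask are placeholders; this is not DrawROI's own (R) code.

      import glob
      import numpy as np
      from PIL import Image

      def roi_means(image_path, mask):
          """Mean R, G, B digital numbers inside a boolean ROI mask."""
          rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
          r, g, b = (rgb[..., i][mask].mean() for i in range(3))
          return r, g, b

      # Placeholder ROI mask; assumes 640x480 images and a hand-drawn canopy region
      mask = np.zeros((480, 640), dtype=bool)
      mask[100:300, 200:500] = True

      for path in sorted(glob.glob("phenocam_images/*.jpg")):
          r, g, b = roi_means(path, mask)
          gcc = g / (r + g + b)  # green chromatic coordinate
          print(f"{path}: GCC = {gcc:.4f}")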

  14. Cyber-Enabled Scientific Discovery

    International Nuclear Information System (INIS)

    Chan, Tony; Jameson, Leland

    2007-01-01

    It is often said that numerical simulation is third in the group of three ways to explore modern science: theory, experiment and simulation. Carefully executed modern numerical simulations can, however, be considered at least as relevant as experiment and theory. In comparison to physical experimentation, with numerical simulation one has the numerically simulated values of every field variable at every grid point in space and time. In comparison to theory, with numerical simulation one can explore sets of very complex non-linear equations such as the Einstein equations that are very difficult to investigate theoretically. Cyber-enabled scientific discovery is not just about numerical simulation but about every possible issue related to scientific discovery by utilizing cyberinfrastructure such as the analysis and storage of large data sets, the creation of tools that can be used by broad classes of researchers and, above all, the education and training of a cyber-literate workforce

  15. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  16. Simulation Enabled Safeguards Assessment Methodology

    International Nuclear Information System (INIS)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed

  17. Context-Enabled Business Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Troy Hiltbrand

    2012-04-01

    To truly understand context and apply it in business intelligence, it is vital to understand what context is and how it can be applied in addressing organizational needs. Context describes the facets of the environment that impact the way that end users interact with the system. Context includes aspects of location, chronology, access method, demographics, social influence/relationships, end-user attitude/emotional state, behavior/past behavior, and presence. To be successful in making Business Intelligence context enabled, it is important to be able to capture the context of the user. With advances in technology, there are a number of ways in which this user-based information can be gathered and exposed to enhance the overall end user experience.

  18. Informatics enables public health surveillance

    Directory of Open Access Journals (Sweden)

    Scott J. N McNabb

    2017-01-01

    Full Text Available Over the past decade, the world has radically changed. New advances in information and communication technologies (ICT connect the world in ways never imagined. Public health informatics (PHI leveraged for public health surveillance (PHS, can enable, enhance, and empower essential PHS functions (i.e., detection, reporting, confirmation, analyses, feedback, response. However, the tail doesn't wag the dog; as such, ICT cannot (should not drive public health surveillance strengthening. Rather, ICT can serve PHS to more effectively empower core functions. In this review, we explore promising ICT trends for prevention, detection, and response, laboratory reporting, push notification, analytics, predictive surveillance, and using new data sources, while recognizing that it is the people, politics, and policies that most challenge progress for implementation of solutions.

  19. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processing or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable support for explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user-contributed weather data, where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE Common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation to and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.
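    As a minimal illustration of why quantified observation uncertainty enables principled exploitation (independent of UncertML's actual schema), the sketch below fuses two Gaussian observations of the same quantity by precision weighting, the standard Bayesian result for independent Gaussian errors. The observation values and standard deviations are invented.

      import numpy as np

      def fuse_gaussian(means, variances):
          """Precision-weighted fusion of independent Gaussian observations."""
          means = np.asarray(means, float)
          precisions = 1.0 / np.asarray(variances, float)
          var_post = 1.0 / precisions.sum()
          mean_post = var_post * (precisions * means).sum()
          return mean_post, var_post

      # Hypothetical weather data: a calibrated station vs. a user-contributed sensor
      mean, var = fuse_gaussian(means=[12.1, 13.0], variances=[0.2**2, 1.5**2])
      print(f"fused estimate: {mean:.2f} +/- {np.sqrt(var):.2f} (1 sigma)")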

  20. GEANT 4: an Object-Oriented toolkit for simulation in HEP

    CERN Multimedia

    Kent, P; Sirotenko, V; Komogorov, M; Pavliouk, A; Greeniaus, G L; Kayal, P I; Routenburg, P; Tanaka, S; Duellmann, D; Innocente, V; Paoli, S; Ranjard, F; Riccardi, F; Ruggier, M; Shiers, J; Egli, S; Kimura, A; Urban, P; Prior, S; Walkden, A; Forti, A; Magni, S; Strahl, K; Kokoulin, R; Braune, K; Volcker, C; Ullrich, T; Takahata, M; Nieminen, P; Ballocchi, G; Mora De Freitas, P; Verderi, M; Rybine, A; Langeveld, W; Nagamatsu, M; Hamatsu, R; Katayama, N; Chuma, J; Felawka, L; Gumplinger, P; Axen, D

    2002-01-01

    The GEANT4 software has been developed by a world-wide collaboration of about 100 scientists from over 40 institutions and laboratories participating in more than 10 experiments in Europe, Russia, Japan, Canada, and the United States. The GEANT4 detector simulation toolkit has been designed for the next generation of High Energy Physics (HEP) experiments, with primary requirements from the LHC, the CP violation, and the heavy ions experiments. In addition, GEANT4 also meets the requirements from the space and medical communities, thanks to very low energy extensions developed in a joint project with the European Space Agency (ESA). GEANT4 has exploited advanced software engineering techniques (for example PSS-05) and Object-Oriented technology to improve the validation process of the physics results, and at the same time to make possible the distributed software design and development in the world-wide collaboration. Fifteen specialised working groups have been responsible for fields as diver...

  1. Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Merzari, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Y. Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Thomas, J. W. [Argonne National Lab. (ANL), Argonne, IL (United States); Obabko, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Jain, Rajeev [Argonne National Lab. (ANL), Argonne, IL (United States); Mahadevan, Vijay [Argonne National Lab. (ANL), Argonne, IL (United States); Tautges, Timothy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Solberg, Jerome [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferencz, Robert Mark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Whitesides, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-21

    This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.

  2. An assessment toolkit to increase the resilience of NWE catchments to periods of drought

    Science.gov (United States)

    La Jeunesse, Isabelle; Larrue, Corinne

    2013-04-01

    In many North Western Europe (NWE) areas the balance between water demand and availability is under pressure, leading to water scarcity. In addition, NWE areas are adversely affected by changes in the hydrological cycle and precipitation patterns, and thus by drought periods. Over the past thirty years droughts have dramatically increased, and NWE is not immune. The drought of summer 2003 caused 10 billion euro of damage to agriculture, and in April 2012 the South West of the UK moved to environmental drought status. Water scarcity and drought problems in the EU are increasing: 11% of the European population and 17% of its territory have been affected to date. Climate change is likely to exacerbate these adverse impacts; 50% of the NWE area is projected to be affected by 2050. Although the problems caused by drought in NWE are currently not overwhelmingly visible, early action should be taken to reduce costs and prevent damage. Adapting to drought in NWE is the transnational challenge of the DROP (governance in DROught adaPtation) project. The Commission's recent "Blueprint on European Waters" states that existing policies are good but that the problem lies in implementation. So the future challenge for NWE regions is to improve implementation, meaning both governance and measures. The problem of drought is relatively new in comparison with flooding for these regions, and it demands another approach involving the interaction of different stakeholders. NWE countries have proven strategies for flood prevention; no such strategies exist for drought adaptation. To address this, DROP combines science, practitioners and decision makers, opening the science-policy window. The aim of the DROP project is thus to increase the resilience of NWE catchments to periods of drought. To tackle these issues DROP will develop a governance toolkit to be used by NWE regional water authorities and will test a few pilot measures on drought adaptation. The objectives of the project are 1) to promote the use of a

  3. The Communities Advancing Resilience Toolkit (CART): an intervention to build community resilience to disasters.

    Science.gov (United States)

    Pfefferbaum, Rose L; Pfefferbaum, Betty; Van Horn, Richard L; Klomp, Richard W; Norris, Fran H; Reissman, Dori B

    2013-01-01

    Community resilience has emerged as a construct to support and foster healthy individual, family, and community adaptation to mass casualty incidents. The Communities Advancing Resilience Toolkit (CART) is a publicly available theory-based and evidence-informed community intervention designed to enhance community resilience by bringing stakeholders together to address community issues in a process that includes assessment, feedback, planning, and action. Tools include a field-tested community resilience survey and other assessment and analytical instruments. The CART process encourages public engagement in problem solving and the development and use of local assets to address community needs. CART recognizes 4 interrelated domains that contribute to community resilience: connection and caring, resources, transformative potential, and disaster management. The primary value of CART is its contribution to community participation, communication, self-awareness, cooperation, and critical reflection and its ability to stimulate analysis, collaboration, skill building, resource sharing, and purposeful action.

  4. Advancements in Wind Integration Study Data Modeling: The Wind Integration National Dataset (WIND) Toolkit; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C.; Hodge, B. M.; Orwig, K.; Jones, W.; Searight, K.; Getman, D.; Harrold, S.; McCaa, J.; Cline, J.; Clark, C.

    2013-10-01

    Regional wind integration studies in the United States require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind data sets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as be time synchronized with available load profiles. The Wind Integration National Dataset (WIND) Toolkit described in this paper fulfills these requirements. A wind resource dataset, wind power production time series, and simulated forecasts from a numerical weather prediction model run on a nationwide 2-km grid at 5-min resolution will be made publicly available for more than 110,000 onshore and offshore wind power production sites.

  5. Integrating the protein and metabolic engineering toolkits for next-generation chemical biosynthesis.

    Science.gov (United States)

    Pirie, Christopher M; De Mey, Marjan; Jones Prather, Kristala L; Ajikumar, Parayil Kumaran

    2013-04-19

    Through microbial engineering, biosynthesis has the potential to produce thousands of chemicals used in everyday life. Metabolic engineering and synthetic biology are fields driven by the manipulation of genes, genetic regulatory systems, and enzymatic pathways for developing highly productive microbial strains. Fundamentally, it is the biochemical characteristics of the enzymes themselves that dictate flux through a biosynthetic pathway toward the product of interest. As metabolic engineers target sophisticated secondary metabolites, there has been little recognition of the reduced catalytic activity and increased substrate/product promiscuity of the corresponding enzymes compared to those of central metabolism. Thus, fine-tuning these enzymatic characteristics through protein engineering is paramount for developing high-productivity microbial strains for secondary metabolites. Here, we describe the importance of protein engineering for advancing metabolic engineering of secondary metabolism pathways. This pathway integrated enzyme optimization can enhance the collective toolkit of microbial engineering to shape the future of chemical manufacturing.

  6. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
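    A toy illustration of the pattern described above, scheduled jobs running at fixed rates while the state of a model is numerically integrated between them, is sketched below in Python. This is not Trick's C/C++ API; the job rate, the Euler integrator, and the falling-body model are invented for illustration only.

      # Minimal sketch of a time-stepped scheduler with numerical integration.
      def integrate_euler(state, deriv, dt):
          return [s + d * dt for s, d in zip(state, deriv(state))]

      def falling_body(state):          # state = [height, velocity]
          _, v = state
          return [v, -9.81]             # dh/dt = v, dv/dt = -g

      def log_job(t, state):
          print(f"t={t:5.2f}s  h={state[0]:7.2f} m  v={state[1]:7.2f} m/s")

      dt, t_end, log_period = 0.01, 1.0, 0.25
      state, t = [100.0, 0.0], 0.0
      while t < t_end + 1e-9:
          if abs(t / log_period - round(t / log_period)) < 1e-9:
              log_job(t, state)         # scheduled job at 4 Hz
          state = integrate_euler(state, falling_body, dt)
          t += dt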

  7. Mocapy++ - A toolkit for inference and learning in dynamic Bayesian networks

    Directory of Open Access Journals (Sweden)

    Hamelryck Thomas

    2010-03-01

    Full Text Available Abstract Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations). Results The program package is freely available under the GNU General Public Licence (GPL) from SourceForge http://sourceforge.net/projects/mocapy. The package contains the source for building the Mocapy++ library, several usage examples and the user manual. Conclusions Mocapy++ is especially suitable for constructing probabilistic models of biomolecular structure, due to its support for directional statistics. In particular, it supports the Kent distribution on the sphere and the bivariate von Mises distribution on the torus. These distributions have proven useful to formulate probabilistic models of protein and RNA structure in atomic detail.
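    The directional-statistics idea highlighted above can be illustrated, outside Mocapy++ itself, with SciPy's univariate von Mises distribution: sample angles and recover the circular mean and mean resultant length. Mocapy++'s Kent and bivariate von Mises distributions are not in SciPy, so this is only an analogy; the parameter values are arbitrary.

      import numpy as np
      from scipy.stats import vonmises

      kappa_true, mu_true = 4.0, 1.0   # concentration and mean direction (radians)
      theta = vonmises.rvs(kappa_true, loc=mu_true, size=5000, random_state=12345)

      # Circular mean and mean resultant length
      resultant = np.mean(np.exp(1j * theta))
      circ_mean = np.angle(resultant)
      R = np.abs(resultant)

      print(f"circular mean ~ {circ_mean:.3f} rad (true {mu_true})")
      print(f"mean resultant length R ~ {R:.3f} (larger R means higher concentration)")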

  8. Modular toolkit for Data Processing (MDP): a Python data processing framework

    Directory of Open Access Journals (Sweden)

    Tiziano Zito

    2009-01-01

    Full Text Available Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The newly implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.
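    The "combine units into a sequence" idea can be sketched as follows. The node classes shown (PCANode, SFANode) and the Flow pattern appear in MDP's documented API, but the exact call signatures should be checked against the installed version, and the data here are random placeholders.

      import numpy as np
      import mdp  # Modular toolkit for Data Processing

      # Placeholder data: 1000 samples of a 20-dimensional signal
      x = np.random.random((1000, 20))

      # Chain two processing units into a flow: PCA for dimensionality reduction,
      # then Slow Feature Analysis on the reduced signal.
      flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                       mdp.nodes.SFANode(output_dim=2)])

      flow.train(x)      # train each node in sequence
      y = flow(x)        # execute the trained flow
      print(y.shape)     # expected: (1000, 2)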

  9. CHASM and SNVBox: toolkit for detecting biologically important single nucleotide mutations in cancer.

    Science.gov (United States)

    Wong, Wing Chung; Kim, Dewey; Carter, Hannah; Diekhans, Mark; Ryan, Michael C; Karchin, Rachel

    2011-08-01

    Thousands of cancer exomes are currently being sequenced, yielding millions of non-synonymous single nucleotide variants (SNVs) of possible relevance to disease etiology. Here, we provide a software toolkit to prioritize SNVs based on their predicted contribution to tumorigenesis. It includes a database of precomputed, predictive features covering all positions in the annotated human exome and can be used either stand-alone or as part of a larger variant discovery pipeline. The MySQL database, source code and binaries are freely available for academic/government use at http://wiki.chasmsoftware.org. Source is in Python and C++. Requires a 32- or 64-bit Linux system (tested on Fedora Core 8, 10, 11 and Ubuntu 10), 2.5 ≤ Python < 3.0, 60 GB of available hard disk space (50 MB for software and data files, 40 GB for the MySQL database dump when uncompressed), and 2 GB of RAM.

  10. Assessing Chinese coach drivers' fitness to drive: The development of a toolkit based on cognition measurements.

    Science.gov (United States)

    Wang, Huarong; Mo, Xian; Wang, Ying; Liu, Ruixue; Qiu, Peiyu; Dai, Jiajun

    2016-10-01

    Road traffic accidents resulting in group deaths and injuries are often related to coach drivers' inappropriate operations and behaviors. Thus, the evaluation of coach drivers' fitness to drive is an important measure for improving the safety of public transportation. Previous related research focused on drivers' age and health condition. Comprehensive studies about commercial drivers' cognitive capacities are limited. This study developed a toolkit consisting of nine cognition measurements across driver perception/sensation, attention, and reaction. A total of 1413 licensed coach drivers in Jiangsu Province, China were investigated and tested. Results indicated that drivers with an accident history within the past three years performed overwhelmingly worse (p<0.05) than those without. Cut-off values were derived from a 5% tail-elimination analysis, in which the eliminated 5% tail was calculated from an integrated index. Methods for categorizing qualified, good, and excellent coach drivers and criteria for evaluating and training Chinese coach drivers' fitness to drive were also proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. DYNECHARM++: a toolkit to simulate coherent interactions of high-energy charged particles in complex structures

    Science.gov (United States)

    Bagli, Enrico; Guidi, Vincenzo

    2013-08-01

    A toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures, called DYNECHARM++, has been developed. The code is written in the C++ language, taking advantage of object-oriented programming. The code is capable of evaluating the electrical characteristics of complex atomic structures and of simulating and tracking the particle trajectory within them. A calculation method based on the expansion of the electrical characteristics in Fourier series has been adopted. Two different approaches to simulating the interaction have been adopted, relying on the full integration of particle trajectories under the continuum potential approximation and on the definition of cross-sections of coherent processes. Finally, the code has been shown to reproduce experimental results and to simulate the interaction of charged particles with complex structures.
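    The Fourier-series idea can be illustrated generically: a planar continuum potential is periodic in the interplanar distance d, so it can be evaluated from a handful of Fourier coefficients. The coefficients, period, and positions below are arbitrary placeholders, not quantities computed by DYNECHARM++ for any real crystal.

      import numpy as np

      def planar_potential(x, d, a0, a, b):
          """Evaluate a periodic planar potential from its Fourier coefficients.
          x  : positions across the channel (same units as d)
          d  : interplanar distance (period)
          a0 : constant term; a, b: cosine/sine coefficients for harmonics 1..N
          """
          x = np.asarray(x, float)
          v = np.full_like(x, a0)
          for n, (an, bn) in enumerate(zip(a, b), start=1):
              v += an * np.cos(2 * np.pi * n * x / d) + bn * np.sin(2 * np.pi * n * x / d)
          return v

      # Placeholder coefficients (eV) for a hypothetical d = 1.92 angstrom period
      x = np.linspace(0.0, 1.92, 9)
      print(planar_potential(x, d=1.92, a0=10.0, a=[-8.0, -1.5, -0.3], b=[0.0, 0.0, 0.0]))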

  12. Quick Way to Port Existing C/C++ Chemoinformatics Toolkits to the Web Using Emscripten.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi

    2017-10-23

    Emscripten is a special open source compiler that compiles C and C++ code into JavaScript. By utilizing this compiler, some typical C/C++ chemoinformatics toolkits and libraries are quickly ported to the web. The compiled JavaScript files have sizes similar to those of the native programs, and, from a series of constructed benchmarks, the performance of the compiled JavaScript code is also close to that of the native code and is better than that of handwritten JavaScript code. Therefore, we believe that Emscripten is a feasible and practical tool for reusing existing C/C++ code on the web, and many other chemoinformatics or molecular calculation software tools can also be easily ported by Emscripten.

  13. From toolkit to framework: The past and future evolution of PhEDEx

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Hernandez, A. [CINVESTAV, IPN; Egeland, R. [Argosy U., Eagan; Huang, C. H. [Fermilab; Ratnikova, N. [Moscow, ITEP; Magini, N. [CERN; Wildish, T. [Princeton U.

    2012-01-01

    PhEDEx is the data-movement solution for CMS at the LHC. Created in 2004, it is now one of the longest-lived components of the CMS dataflow/workflow world. As such, it has undergone significant evolution over time, and continues to evolve today, despite being a fully mature system. Originally a toolkit of agents and utilities dedicated to specific tasks, it is becoming a more open framework that can be used in several ways, both within and beyond its original problem domain. In this talk we describe how a combination of refactoring and adoption of new technologies that have become available over the years has made PhEDEx more flexible, maintainable, and scalable.

  14. PDB@: an offline toolkit for exploration and analysis of PDB files.

    Science.gov (United States)

    Mani, Udayakumar; Ravisankar, Sadhana; Ramakrishnan, Sai Mukund

    2013-12-01

    Protein Data Bank (PDB) is a freely accessible archive of the 3-D structural data of biological molecules. Structure-based studies offer a unique vantage point for inferring the properties of a protein molecule from structural data, but this is too big a task to be done manually. Moreover, there is no single tool, software or server that comprehensively analyses all structure-based properties. The objective of the present work is to develop an offline computational toolkit, PDB@, containing in-built algorithms that help categorize the structural properties of a protein molecule. The user has the facility to view and edit the PDB file as needed. Some features of the present work are unique in themselves and others are an improvement over existing tools. Also, the representation of protein properties in both graphical and textual formats helps in predicting all the necessary details of a protein molecule on a single platform.
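    The kind of structure-derived property such a toolkit automates can be sketched with Biopython (an independent library, not PDB@ itself): parse a PDB file, count the standard residues per chain, and compute the geometric centre of their atoms. The file name is a placeholder.

      import numpy as np
      from Bio.PDB import PDBParser  # Biopython

      parser = PDBParser(QUIET=True)
      structure = parser.get_structure("example", "example.pdb")  # placeholder file

      for model in structure:
          for chain in model:
              residues = [res for res in chain if res.id[0] == " "]  # standard residues only
              coords = np.array([atom.coord for res in residues for atom in res])
              centre = coords.mean(axis=0) if len(coords) else None
              print(f"chain {chain.id}: {len(residues)} residues, geometric centre {centre}")
          break  # the first model is enough for typical X-ray structures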

  15. A flooding induced station blackout analysis for a pressurized water reactor using the RISMC toolkit

    International Nuclear Information System (INIS)

    Mandelli, Diego; Prescott, Steven; Smith, Curtis; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua; Kinoshita, Robert

    2015-01-01

    In this paper we evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., component/system activation) and to perform statistical analyses. In our case, the simulation of the flooding is performed by using an advanced smooth particle hydrodynamics code called NEUTRINO. The obtained results allow the user to investigate and quantify the impact of timing and sequencing of events on system safety. The impact of power uprate is determined in terms of both core damage probability and safety margins

  16. Wider-community Segregation and the Effect of Neighbourhood Ethnic Diversity on Social Capital: An Investigation into Intra-Neighbourhood Trust in Great Britain and London.

    Science.gov (United States)

    Laurence, James

    2017-10-01

    Extensive research has demonstrated that neighbourhood ethnic diversity is negatively associated with intra-neighbourhood social capital. This study explores the role of segregation and integration in this relationship. To do so it applies three-level hierarchical linear models to two sets of data from across Great Britain and within London, and examines how segregation across the wider-community in which a neighbourhood is nested impacts trust amongst neighbours. This study replicates the increasingly ubiquitous finding that neighbourhood diversity is negatively associated with neighbour-trust. However, we demonstrate that this relationship is highly dependent on the level of segregation across the wider-community in which a neighbourhood is nested. Increasing neighbourhood diversity only negatively impacts neighbour-trust when nested in more segregated wider-communities. Individuals living in diverse neighbourhoods nested within integrated wider-communities experience no trust-penalty. These findings show that segregation plays a critical role in the neighbourhood diversity/trust relationship, and that its absence from the literature biases our understanding of how ethnic diversity affects social cohesion.

  17. A tetO Toolkit To Alter Expression of Genes in Saccharomyces cerevisiae.

    Science.gov (United States)

    Cuperus, Josh T; Lo, Russell S; Shumaker, Lucia; Proctor, Julia; Fields, Stanley

    2015-07-17

    Strategies to optimize a metabolic pathway often involve building a large collection of strains, each containing different versions of sequences that regulate the expression of pathway genes. Here, we develop reagents and methods to carry out this process at high efficiency in the yeast Saccharomyces cerevisiae. We identify variants of the Escherichia coli tet operator (tetO) sequence that bind a TetR-VP16 activator with differential affinity and therefore result in different TetR-VP16 activator-driven expression. By recombining these variants upstream of the genes of a pathway, we generate unique combinations of expression levels. Here, we built a tetO toolkit, which includes the I-OnuI homing endonuclease to create double-strand breaks, which increases homologous recombination by a factor of 10^5; a plasmid carrying six variant tetO sequences flanked by I-OnuI sites, uncoupling transformation and recombination steps; an S. cerevisiae-optimized TetR-VP16 activator; and a vector to integrate constructs into the yeast genome. We introduce into the S. cerevisiae genome the three crt genes from Erwinia herbicola required for yeast to synthesize lycopene and carry out the recombination process to produce a population of cells with permutations of tetO variants regulating the three genes. We identify 0.7% of this population as making detectable lycopene, of which the vast majority have undergone recombination at all three crt genes. We estimate a rate of ∼20% recombination per targeted site, much higher than that obtained in other studies. Application of this toolkit to medically or industrially important end products could reduce the time and labor required to optimize the expression of a set of metabolic genes.

  18. The Climate Resilience Toolkit: Central gateway for risk assessment and resilience planning at all governance scales

    Science.gov (United States)

    Herring, D.; Lipschultz, F.

    2016-12-01

    As people and organizations grapple with a changing climate amid a range of other factors simultaneously shifting, there is a need for credible, legitimate & salient scientific information in useful formats. In addition, an assessment framework is needed to guide the process of planning and implementing projects that allow communities and businesses to adapt to specific changing conditions, while also building overall resilience to future change. We will discuss how the U.S. Climate Resilience Toolkit (CRT) can improve people's ability to understand and manage their climate-related risks and opportunities, and help them make their communities and businesses more resilient. In close coordination with the U.S. Climate Data Initiative, the CRT is continually evolving to offer actionable authoritative information, relevant tools, and subject matter expertise from across the U.S. federal government in one easy-to-use location. The Toolkit's "Climate Explorer" is designed to help people understand potential climate conditions over the course of this century. It offers easy access to downloadable maps, graphs, and data tables of observed and projected temperature, precipitation and other decision-relevant climate variables dating back to 1950 and out to 2100. Since climate is only one of many changing factors affecting decisions about the future, it also ties climate information to a wide range of relevant variables to help users explore vulnerabilities and impacts. New topic areas have been added, such as "Fisheries," "Regions," and "Built Environment" sections that feature case studies and personal experiences in making adaptation decisions. A curated "Reports" section is integrated with semantic web capabilities to help users locate the most relevant information sources. As part of the USGCRP's sustained assessment process, the CRT is aligning with other federal activities, such as the upcoming 4th National Climate Assessment.

  19. Health Equity Assessment Toolkit (HEAT): software for exploring and comparing health inequalities in countries

    Directory of Open Access Journals (Sweden)

    Ahmad Reza Hosseinpoor

    2016-10-01

    Full Text Available Abstract Background It is widely recognised that the pursuit of sustainable development cannot be accomplished without addressing inequality, or observed differences between subgroups of a population. Monitoring health inequalities allows for the identification of health topics where major group differences exist, dimensions of inequality that must be prioritised to effect improvements in multiple health domains, and also population subgroups that are multiply disadvantaged. While availability of data to monitor health inequalities is gradually improving, there is a commensurate need to increase, within countries, the technical capacity for analysis of these data and interpretation of results for decision-making. Prior efforts to build capacity have yielded demand for a toolkit with the computational ability to display disaggregated data and summary measures of inequality in an interactive and customisable fashion that would facilitate interpretation and reporting of health inequality in a given country. Methods To answer this demand, the Health Equity Assessment Toolkit (HEAT) was developed between 2014 and 2016. The software, which contains the World Health Organization's Health Equity Monitor database, allows the assessment of inequalities within a country using over 30 reproductive, maternal, newborn and child health indicators and five dimensions of inequality (economic status, education, place of residence, subnational region and child's sex), where applicable. Results/Conclusion HEAT was beta-tested in 2015 as part of ongoing capacity building workshops on health inequality monitoring. This is the first and only application of its kind; further developments are proposed to introduce an upload data feature, translate it into different languages and increase interactivity of the software. This article will present the main features and functionalities of HEAT and discuss its relevance and use for health inequality monitoring.
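    Two of the simplest summary measures software of this kind reports, the absolute difference and the ratio between the most and least advantaged subgroups, can be computed as in the sketch below; the coverage figures by wealth quintile are invented for illustration only.

      # Hypothetical coverage (%) of a maternal health indicator by wealth quintile
      coverage = {"Q1 (poorest)": 42.0, "Q2": 55.0, "Q3": 61.0, "Q4": 70.0, "Q5 (richest)": 88.0}

      lowest, highest = min(coverage.values()), max(coverage.values())
      difference = highest - lowest          # simple absolute inequality measure
      ratio = highest / lowest               # simple relative inequality measure

      print(f"difference (richest - poorest): {difference:.1f} percentage points")
      print(f"ratio (richest / poorest):      {ratio:.2f}")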

  20. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio

    International Nuclear Information System (INIS)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J; Perl, J; Piersimoni, P; Ramos-Mendez, J; Faddegon, B

    2016-01-01

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex