WorldWideScience

Sample records for saphire tool set

  1. The atmosphere simulation chamber SAPHIR: a tool for the investigation of photochemistry.

    Science.gov (United States)

    Brauers, T.; Bohn, B.; Johnen, F.-J.; Rohrer, F.; Rodriguez Bares, S.; Tillmann, R.; Wahner, A.

    2003-04-01

    On the campus of the Forschungszentrum Jülich we constructed SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber), which was completed in fall 2001. The chamber consists of a 280 m³ double-wall Teflon bag of cylindrical shape that is held by a steel frame. Typically 75% of the outside actinic flux (290-420 nm) is available inside the chamber. A louvre system allows switching between full sunlight and dark within 40 s, giving the opportunity to study relaxation processes of the photochemical system. The SAPHIR chamber is equipped with a comprehensive set of sensitive instruments, including measurements of OH, HO₂, CO, hydrocarbons, aldehydes, nitrogen oxides, and solar radiation. Moreover, the modular concept of SAPHIR allows fast and flexible integration of new instruments and techniques. In this paper we show the unique and new features of the SAPHIR chamber, namely the clean air supply and the high-purity water vapor supply, which make a wide range of trace gas concentrations accessible in experiments. We also present examples from the first year of SAPHIR experiments, showing the scope of application from high-quality instrument intercomparison and kinetic studies to the simulation of complex mixtures of trace gases at ambient concentrations.

  2. SAPHIRE 8 Volume 2 - Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood; W. J. Galyean; J. A. Schroeder; M. B. Sattison

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). Herein, information is provided on the principles used in the construction and operation of Version 8.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply for various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, Workspace algorithms, cut set "recovery," end state manipulation, and use of "compound events."
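
    The quantification described here is, at bottom, simple arithmetic over minimal cut sets. Below is a minimal sketch in Python of the two standard approximations used to estimate the top event probability from cut sets, the rare-event approximation and the minimal cut set upper bound; the basic event names and probabilities are invented for illustration, and this is not SAPHIRE's actual code.

    ```python
    # Quantify a top event from minimal cut sets (illustrative only).
    # Assumes independent basic events; names and probabilities are invented.
    from math import prod

    cut_sets = [{"PUMP_A_FAILS", "PUMP_B_FAILS"},
                {"VALVE_STUCK"},
                {"PUMP_A_FAILS", "DG_FAILS_TO_START"}]

    p = {"PUMP_A_FAILS": 1e-3, "PUMP_B_FAILS": 1e-3,
         "VALVE_STUCK": 5e-5, "DG_FAILS_TO_START": 2e-2}

    # Probability of each cut set: product of its basic-event probabilities.
    q = [prod(p[e] for e in cs) for cs in cut_sets]

    # Rare-event approximation: sum of cut-set probabilities.
    p_rare = sum(q)

    # Minimal cut set upper bound: 1 - prod(1 - q_i); stays below 1
    # even when cut-set probabilities are not small.
    p_mcub = 1.0 - prod(1.0 - qi for qi in q)

    print(f"rare-event approximation: {p_rare:.3e}")
    print(f"min cut set upper bound:  {p_mcub:.3e}")
    ```

    Importance measures follow the same arithmetic; for example, a Fussell-Vesely-style measure for a basic event is the fraction of the top event probability contributed by the cut sets containing that event.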

  3. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE), Version 5.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Hoffman, C.L.

    1995-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Graphical Evaluation Module (GEM) is a special application tool designed for evaluation of operational occurrences using the Accident Sequence Precursor (ASP) program methods. GEM provides the capability for an analyst to quickly and easily perform conditional core damage probability (CCDP) calculations. The analyst can then use the CCDP calculations to determine whether the occurrence of an initiating event or a condition adversely impacts safety. It uses models and data developed in SAPHIRE specifically for the ASP program. GEM requires more data than are normally provided in SAPHIRE and will not perform properly with other models or databases. This is the first release of GEM, and the developers of GEM welcome user comments and feedback that will generate ideas for improvements to future versions. GEM is designated as version 5.0 to track the GEM codes along with the other SAPHIRE codes, as GEM relies on the same shared database structure.
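
    As a rough illustration of the CCDP idea described above (a sketch of the general calculation, not GEM's actual algorithm): for an initiating-event precursor, the initiating event is treated as having occurred, so its frequency is replaced by 1 and the conditional probabilities of the core-damage sequences are recombined. The sequence probabilities below are invented.

    ```python
    # Conditional core damage probability (CCDP) for a precursor event.
    # Sketch only; sequence probabilities are invented for illustration.

    def ccdp(sequence_probs):
        """P(core damage | initiator occurred): with the initiator set to 1,
        combine sequences as 1 - prod(1 - p_i), assuming independence."""
        survive = 1.0
        for p in sequence_probs:
            survive *= (1.0 - p)
        return 1.0 - survive

    # Conditional probabilities of each core-damage sequence given the event.
    print(f"CCDP = {ccdp([4e-6, 1.5e-5, 2e-6]):.2e}")
    ```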

  4. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  5. SAPHIR, how it ended

    International Nuclear Information System (INIS)

    Brogli, R.; Hammer, J.; Wiezel, L.; Christen, R.; Heyck, H.; Lehmann, E.

    1995-01-01

    On May 16th, 1994, PSI decided to discontinue its efforts to retrofit the SAPHIR reactor for operation at 10 MW. This decision was made because the retrofit work in progress had proven more complex, and the effort and time required greater, than anticipated. In view of the start-up of the new spallation-neutron source SINQ in 1996, the useful operating time between the eventual restart of SAPHIR and the start-up of SINQ became less than two years, which was regarded by PSI as too short a period to warrant the large retrofit effort. Following the decision of PSI not to re-use SAPHIR as a neutron source, several options for the further utilization of the facility were open. However, none of them appeared promising in comparison with other possibilities; it was therefore decided that SAPHIR should be decommissioned. A concerted effort was initiated to consolidate the nuclear and conventional safety for the post-operational period. (author) 3 figs., 3 tabs.

  6. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE)

    International Nuclear Information System (INIS)

    C. L. Smith

    2006-01-01

    This reference manual describes the functions available in SAPHIRE and presents general instructions for using the software. Section 1 presents SAPHIRE's historical evolution and summarizes its capabilities. Section 2 presents instructions for installing and using the code. Section 3 explains the database structure used in SAPHIRE and discusses database concepts. Section 4 explains how PRA data (event frequencies, human error probabilities, etc.) can be generated and manipulated using "change sets". Section 5 deals with fault tree operations, including constructing, editing, solving, and displaying results. Section 6 presents operations associated with event trees, including rule application for event tree linking, partitioning, and editing sequences. Section 7 presents how accident sequences are generated, solved, quantified, and analyzed. Section 8 discusses the functions available for performing end state analysis. Section 9 explains how to modify data stored in a SAPHIRE database. Section 10 illustrates how to generate and customize reports. Section 11 covers SAPHIRE utility options to perform routine functions such as defining constant values, recovering databases, and loading data from external sources. Section 12 provides an overview of GEM's features and capabilities. Finally, Section 13 summarizes SAPHIRE's quality assurance process.

  7. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  8. Assimilation of SAPHIR radiance: impact on hyperspectral radiances in 4D-VAR

    Science.gov (United States)

    Indira Rani, S.; Doherty, Amy; Atkinson, Nigel; Bell, William; Newman, Stuart; Renshaw, Richard; George, John P.; Rajagopal, E. N.

    2016-04-01

    Assimilation of a new observation dataset in an NWP system may affect the quality of an existing observation dataset against the model background (short forecast), which in turn influences the use of the existing observations in the NWP system. The effect of the use of one dataset on the use of another can be quantified as positive, negative, or neutral. The impact of adding a new dataset is defined as positive if the number of assimilated observations of an existing observation type increases, and the bias and standard deviation decrease, compared to the control experiment (without the new dataset). Recently a new dataset, Megha-Tropiques SAPHIR radiances, which provides atmospheric humidity information, was added to the Unified Model 4D-VAR assimilation system. In this paper we discuss the impact of SAPHIR on the assimilation of hyperspectral radiances from AIRS, IASI, and CrIS. Though SAPHIR is a microwave instrument, its impact can be clearly seen in the use of hyperspectral radiances in the 4D-VAR data assimilation system, in addition to other microwave and infrared observations. SAPHIR assimilation decreased the standard deviation of the spectral channels in the 650-1600 cm-1 wavenumber range for all three hyperspectral sounders. A similar impact on the hyperspectral radiances can be seen from the assimilation of other microwave radiances, such as those from AMSR2 and the SSMIS imager.
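
    The positive/negative/neutral bookkeeping described above amounts to comparing observation-minus-background (O-B) statistics between the control and the SAPHIR-assimilating experiment. A minimal sketch, assuming per-channel O-B departures are available as arrays; the variable names and synthetic numbers are illustrative only (the assimilated-observation-count criterion from the abstract is omitted for brevity).

    ```python
    # Classify assimilation impact for one channel by comparing O-B
    # (observation minus background) statistics: control vs. experiment.
    import numpy as np

    def impact(omb_control, omb_experiment):
        bias_c, std_c = omb_control.mean(), omb_control.std()
        bias_e, std_e = omb_experiment.mean(), omb_experiment.std()
        if abs(bias_e) <= abs(bias_c) and std_e < std_c:
            return "positive"
        if abs(bias_e) > abs(bias_c) and std_e > std_c:
            return "negative"
        return "neutral"

    # Synthetic departures standing in for one hyperspectral channel (K).
    rng = np.random.default_rng(0)
    control = rng.normal(0.3, 1.2, 10_000)     # without SAPHIR
    experiment = rng.normal(0.2, 1.0, 10_000)  # with SAPHIR assimilated
    print(impact(control, experiment))
    ```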

  9. SAPHIRE 6.64, System Analysis Programs for Hands-on Integrated Reliability

    International Nuclear Information System (INIS)

    2001-01-01

    1 - Description of program or function: SAPHIRE is a collection of programs developed for the purpose of performing those functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA), primarily for nuclear power plants. The programs included in this suite are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D) system, and the Fault tree, Event tree and P&ID (FEP) editors. Previously these programs were released as separate packages. These programs include functions to allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included are features to allow the analyst to generate reports and displays that can be used to document the results of an analysis. Since this software is a very detailed technical tool, the user of this program should be familiar with PRA concepts and the methods used to perform these analyses. 2 - Methods: SAPHIRE is written in MODULA-2 and uses an integrated commercial graphics package to interactively construct and edit fault trees. The fault tree solving methods used are industry-recognized top-down algorithms. For quantification, the program uses standard methods to propagate the failure information through the generated cut sets. SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE which automates the process for evaluating operational events at commercial nuclear power plants. Using GEM an analyst can estimate the risk associated with operational events (that is, perform a Level 1, Level 2, and Level 3 analysis for operational events) in a very efficient and expeditious manner. This on-line reference guide will

  10. SAPHIRE 8 Volume 1 - Overview and Summary

    International Nuclear Information System (INIS)

    Smith, C.L.; Wood, S.T.

    2011-01-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system's response to initiating events and quantify associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE 8 can identify important contributors to core damage (Level 1 PRA) and containment failure during a severe accident which leads to releases (Level 2 PRA). It can be used for a PRA where the reactor is at full power, low power, or at shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for managing models such as flooding and fire. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and environment (Level 3 PRA). In SAPHIRE 8, the act of creating a model has been separated from the analysis of that model in order to improve the quality of both the model (e.g., by avoiding inadvertent changes) and the analysis. Consequently, in SAPHIRE 8, the analysis of models is performed by using what are called Workspaces. Currently, there are Workspaces for three types of analyses: (1) the NRC's Accident Sequence Precursor program, where the workspace is called 'Events and Condition Assessment (ECA);' (2) the NRC's Significance Determination Process (SDP); and

  11. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The developmental activity of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability in the 1990s and the onset of a risk-informed regulation era.

  12. SAPHIR, a simulator for engineering and training on N4-type nuclear power plants

    International Nuclear Information System (INIS)

    Vovan, C.

    1999-01-01

    SAPHIR, the new simulator developed by FRAMATOME, has been designed to be a convenient tool for engineering and training for different types of nuclear power plants. Its first application is for the French 'N4' four-loop 1500 MWe PWR. The basic features of SAPHIR are: (1) use of advanced codes for modelling the primary and secondary systems, including an axial steam generator model; (2) use of a simulation workshop containing different tools for modelling fluid, electrical, and instrument and control networks; (3) a man-machine interface designed for easy, user-friendly operation, which can simulate the different computerized control consoles of the 'N4' control room. This paper outlines features and capabilities of this tool, both for engineering and training purposes. (author)

  13. SAPHIRE 8 Volume 3 - Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. Vedros; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). This reference guide will introduce the SAPHIRE Version 8.0 software. A brief discussion of the purpose and history of the software is included along with general information such as installation instructions, starting and stopping the program, and some pointers on how to get around inside the program. Next, database concepts and structure are discussed. Following that discussion are nine sections, one for each of the menu options on the SAPHIRE main menu, wherein the purpose and general capabilities for each option are

  14. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  15. SAPHIRE 8 Volume 6 - Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 8 is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows™ operating system. SAPHIRE 8 is funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 8, what constitutes its parts, and limitations of those processes. In addition, this document describes the Independent Verification and Validation that was conducted for Version 8 as part of an overall QA process.

  16. Strangeness photoproduction with the SAPHIR-detector

    International Nuclear Information System (INIS)

    Merkel, H.

    1993-12-01

    At the ELSA facility in Bonn a photon beam with a high duty cycle is available up to energies of 3.3 GeV. In this energy range the large solid angle detector SAPHIR enables the investigation of strangeness photoproduction starting from threshold. SAPHIR has already achieved results for the reactions γ + p → K⁺ + Λ and γ + p → K⁺ + Σ⁰. This work investigates the possibilities of measuring the related reactions γ + n → K⁰ + Λ and γ + n → K⁰ + Σ⁰ on a deuteron target, and of measuring the reaction γ + p → K⁰ + Σ⁺ on a proton target. For the first time the Σ⁺ polarisation has been measured. With a cross section 10 times smaller than that of the kaon-hyperon reactions, the photoproduction of the Φ(1020) meson can also be investigated with the SAPHIR detector. First reconstructed events are shown. (orig.)

  17. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave ...

    Indian Academy of Sciences (India)

    used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. ... All-sky radiance simulation; Megha tropiques; microwave SAPHIR sensor; radiative transfer; data ... versions of these non-linear processes (Ohring and.

  18. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. A previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.

  19. SAPhIR: a fission-fragment detector

    International Nuclear Information System (INIS)

    Theisen, Ch.; Gautherin, C.; Houry, M.; Korten, W.; Le Coz, Y.; Lucas, R.; Barreau, G.; Doan, T. P.; Belier, G.; Meot, V.; Ethvignot, Th.; Cahan, B.; Le Coguie, A.; Coppolani, X.; Delaitre, B.; Le Bourlout, P.; Legou, Ph.; Maillard, O.; Durand, G.; Bouillac, A.

    1998-01-01

    SAPhIR is the acronym for Saclay Aquitaine Photovoltaic cells for Isomer Research. It consists of solar cells used for fission-fragment detection. It is a collaboration between three laboratories: CEA Saclay, CENBG Bordeaux, and CEA Bruyères-le-Châtel. The coupling of a highly efficient fission-fragment detector like SAPhIR with EUROBALL will provide new insights into the study of very deformed nuclear matter and into the spectroscopy of neutron-rich nuclei.

  20. Development of a model-independent evaluation of photon-deuteron reactions for the SAPHIR detector

    International Nuclear Information System (INIS)

    Wolf, A.

    1993-01-01

    The SAPHIR detector measures photon-induced reactions with many particles in the final state. Thus a detailed investigation of those processes at photon energies between 0.4 and 3.3 GeV is possible. The interpretation of the distribution of the sample of events which SAPHIR is able to reconstruct has to be done after a correction for influences induced by the detector acceptance. In this work a model-independent method of correcting and analysing the data is discussed. The implementation of the basic tools of this analysis is described and first tests with simulated and real events are performed. SAPHIR uses a time-of-flight system for the identification of particles. This work describes the structure of a program library which supports an easy way of decoding the digitizations of this system (including calibration of the hardware) and obtaining the flight time for a particle in an event. The necessary steps for calibrating the system are outlined, too. (orig.)

  1. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  2. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  3. Nucleonic calculations for possible irradiation experiments in SAPHIR

    International Nuclear Information System (INIS)

    Caro, M.; Pelloni, S.

    1990-01-01

    Accurate two-dimensional calculations show that a 'neutronic environment' exists in the SAPHIR reactor at the Paul Scherrer Institute (PSI) suitable for simulating the inner surface of a given trepan of the Gundremmingen reactor. Neutron fluences and DPA rates were calculated at two positions in SAPHIR using modern codes and nuclear data (from JEF-1). A particular region of the reactor can be found in which fluences and DPA rates agree to within a few percent with the Gundremmingen reference case. (author) 13 figs., 4 tabs., 18 refs.

  4. Verification and validation of the SAPHIRE Version 4.0 PRA software package

    International Nuclear Information System (INIS)

    Bolander, T.W.; Calley, M.B.; Capps, E.L.

    1994-02-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE). SAPHIRE is a set of four computer programs that the Nuclear Regulatory Commission (NRC) developed to perform probabilistic risk assessments (PRAs). These programs allow an analyst to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs included in this set are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models and Results Database (MAR-D), and the Fault Tree/Event Tree/Piping and Instrumentation Diagram (FEP) graphical editor. The V&V steps included a V&V plan to describe the process and criteria by which the V&V would be performed; a software requirements documentation review to determine the correctness, completeness, and traceability of the requirements; a user survey to determine the usefulness of the user documentation; identification and testing of vital and non-vital features; and documentation of the test results.

  5. Development of the software of the data taking system SOS for the SAPHIR experiment

    Energy Technology Data Exchange (ETDEWEB)

    Manns, J.

    1989-02-01

    The data acquisition system SOS has been developed for the SAPHIR experiment at the Bonn stretcher ring ELSA. It can handle up to 280 kilobytes of data per second or a maximum trigger rate of 200 Hz. The multiprocessor-based online system consists of twenty VIP microprocessors and two VAX computers. Each component of the SAPHIR experiment has at least one program in the online system to maintain special functions for this specific component. All of these programs can receive event data without interfering with the transfer of events to mass storage for offline analysis. A special program, SOL, has been developed to serve as a user interface to the data acquisition system and as a status display for most of the programs of the online system. Using modern features like windowing and mouse control on a VAXstation, the SAPHIR online program SOL establishes an easy way of controlling the data acquisition system. (orig.)

  6. The graphics system and the data saving for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Albold, D.

    1990-08-01

    Important extensions have been made to the data acquisition system SOS for the SAPHIR experiment at the Bonn ELSA facility. To support the various online programs controlling components of the detector, a graphics system for presenting data was developed. This enables any program in the system to use all graphic devices. Its main component is a program serving requests for presentation on a 19-inch color monitor. Windowing allows the presentation of several graphics on one screen. Equipped with a trackball and using menus, this is an easy-to-use and powerful tool for controlling the experiment. Other important extensions concern data storage. A huge amount of event data can be stored on 8 mm cassettes by the program Eventsaver. This program can be controlled by a component of the SAPHIR online program SOL running on a VAX computer and using windows and menus. Smaller amounts of data, containing parameters and programs which should be accessible within a short period of time, can be stored on a magnetic disk. A program supporting a file structure for access to this disk is described. (orig./HSI)

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Versions 6 and 7, what constitutes its parts, and limitations of those processes.

  8. Simulation of the beam guiding of the SAPHIR experiment by means of a differential-equation model

    Energy Technology Data Exchange (ETDEWEB)

    Greve, T.

    1991-08-01

    This paper presents the numerical simulation, by means of a differential-equation model, of the beam line from the Bonn Electron Stretcher Accelerator ELSA to the SAPHIR spectrometer. Furthermore, a method for calculating the initial values based on measurements of beam profiles is discussed. (orig.)

  9. The scintillation counter system at the SAPHIR detector

    International Nuclear Information System (INIS)

    Bour, D.

    1989-10-01

    The scintillation-counter system of the SAPHIR detector at the stretcher accelerator ELSA in Bonn consists of 64 counters. It supplies a fast hadronic trigger and is utilized for particle identification by time-of-flight measurements. Prototypes of the counters (340 × 21.25 × 6.0 cm³) have been tested. The contribution to the time-of-flight resolution was measured to be σ = 125 ps, the effective light velocity 17.5 cm/ns, and the attenuation length 7.8 m. Pion-kaon separation by time of flight is possible up to a momentum of 1 GeV/c. With the first photon beam at SAPHIR the counters were tested, and first triggers were obtained and evaluated. (orig.)
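
    The quoted pion-kaon separation limit follows from time-of-flight kinematics: Δt = (L/c)(1/β_K − 1/β_π) with 1/β = √(1 + m²/p²). A small sketch of that arithmetic, assuming a 3 m flight path (the abstract does not give the actual SAPHIR path length) and the σ = 125 ps resolution quoted above:

    ```python
    # Pi/K time-of-flight difference vs. momentum. The flight path L is an
    # assumed value; sigma_t = 125 ps is the resolution from the abstract.
    from math import sqrt

    M_PI, M_K = 0.1396, 0.4937   # masses in GeV/c^2
    C = 0.299792458              # speed of light in m/ns

    def dt_ns(p_gev, L_m):
        """Flight-time difference (ns) between a kaon and a pion of momentum p."""
        inv_beta = lambda m: sqrt(1.0 + (m / p_gev) ** 2)
        return (L_m / C) * (inv_beta(M_K) - inv_beta(M_PI))

    sigma_t = 0.125  # ns
    for p in (0.5, 1.0, 1.5):
        dt = dt_ns(p, L_m=3.0)
        print(f"p = {p:.1f} GeV/c: dt = {1e3*dt:6.0f} ps ({dt/sigma_t:4.1f} sigma)")
    ```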

  10. The trigger and data acquisition system of the SAPHIR detector

    International Nuclear Information System (INIS)

    Honscheid, K.

    1988-10-01

    At present SAPHIR, a new experimental facility for medium-energy physics, is under construction at the Bonn electron accelerator ELSA (energy ≤ 3.5 GeV, duty cycle ≅ 100%). SAPHIR combines a large solid angle coverage with a tagging system and is therefore suited to investigate reactions with multi-particle final states. The structure and function of the multi-stage trigger system, which is used to select such processes, are described in this paper. With this system the trigger decision can be based on the number of charged particles as well as on the number of neutral particles detected. Several VMEbus modules have been developed, using memory look-up tables to make fast trigger decisions possible. In order to determine the number of neutral particles from the cluster distribution in the electromagnetic calorimeter, some ideas from cellular automata had to be added. The system has a modular structure, so it can easily be extended. In the second part of this thesis the SAPHIR data acquisition system is discussed. It consists of a multiprocessor system with the VIP microcomputer as its central element. The VIP is a VMEbus module optimized for a multiprocessor environment. Its description, as well as that of the other VMEbus boards developed for the SAPHIR online system, can be found in this paper. As a basis for software development the operating system SOS is supplied. With SOS it is possible to write programs independent of the actual hardware configuration, and so the complicated multiprocessor environment is hidden. To the user the system looks like a simple multi-tasking system. SOS is not restricted to the VIPs but can also be installed on computers of the VAX family, so that efficient mixed configurations are possible. The SAPHIR online system, based on the VIP microcomputer and the SOS operating system, is presented in the last part of this paper. This includes the read-out system, the monitoring of the different components, etc. (orig./HSI)

  11. A new plant chamber facility PLUS coupled to the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2015-11-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOC) can be studied in detail. In PLUS all important environmental parameters (e.g., temperature, PAR, soil relative humidity) are well controlled. The gas exchange volume of 9.32 m³, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed only to FEP Teflon film and other Teflon surfaces to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 LED panels which have an emission strength of up to 800 μmol m⁻² s⁻¹. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and the transfer rate of volatile organic compounds (VOC) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  12. The central drift chamber of the SAPHIR detector - implementation into the experiment and study of its properties

    International Nuclear Information System (INIS)

    Haas, K.M.

    1992-01-01

    At the Bonn accelerator facility ELSA the large solid angle detector SAPHIR was built for the investigation of photon-induced reactions. A main component of SAPHIR is the central drift chamber (CDC), matching the magnet gap of 1 m³. The 1828 hexagonal drift cells each have a diameter of about 18 mm. The subject of this paper is the implementation of the CDC in the experiment. A description of the hardware is followed by a presentation of the software tools for filtering and monitoring the data, which have been developed and tested. An algorithm for extracting the space-time relationship is presented. The properties of the chamber with an improved gas mixture (helium/neon/isobutane, 21.25:63.75:15) have been investigated. A spatial resolution of about 200 μm was achieved. The efficiency of the chamber is 97% at a tagged-photon rate of 5×10⁴ per second crossing the chamber. (orig.)

  13. The capabilities and applications of the SAPHIRE 5.0 safety assessment software

    International Nuclear Information System (INIS)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J.

    1994-01-01

    The System Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. The programs in this suite include: the Models and Results Database (MAR-D) software, the Integrated Reliability and Risk Analysis System (IRRAS) software, the System Analysis and Risk Assessment (SARA) software, and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. Each of these programs performs a specific function in taking a PRA from the conceptual state all the way to publication. This paper provides an overview of the features and capabilities provided in version 5.0 of this software system. Some major new features include the ability to store unlimited cut sets, the ability to perform location transformations, the ability to perform seismic analysis, the ability to perform automated rule-based recovery analysis and end state cut set partitioning, the ability to perform end state analysis, a new alphanumeric fault tree editor, and a new alphanumeric event tree editor. Many enhancements and improvements to the user interface, as well as a significant reduction in the time required to perform an analysis, are included in version 5.0. These new features and capabilities provide a powerful set of PC-based PRA analysis tools.

  14. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume is the reference manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. The SARA database contains PRA data primarily for the dominant accident sequences of a family and descriptive information about the family, including event trees, fault trees, and system model diagrams. The number of facility databases that can be accessed is limited only by the amount of disk storage available. To simulate changes to family systems, SARA users change the failure rates of initiating and basic events and/or modify the structure of the cut sets that make up the event trees, fault trees, and systems. The user then evaluates the effects of these changes through the recalculation of the resultant accident sequence probabilities and importance measures. The results are displayed in tables and graphs that may be printed for reports. A preliminary version of the SARA program was completed in August 1985 and has undergone several updates in response to user suggestions and to maintain compatibility with the other SAPHIRE programs. Version 5.0 of SARA provides the same capability as earlier versions and adds the ability to process unlimited cut sets; display fire, flood, and seismic data; and perform more powerful cut set editing.

  15. Characterisation of the photolytic HONO-source in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    F. Rohrer

    2005-01-01

    HONO formation has been proposed as an important OH radical source in simulation chambers for more than two decades. Besides the heterogeneous HONO formation by the dark reaction of NO2 and adsorbed water, a photolytic source has been proposed to explain the elevated reactivity in simulation chamber experiments. However, the mechanism of the photolytic process is not well understood so far. As expected, production of HONO and NOx was also observed inside the new atmospheric simulation chamber SAPHIR under solar irradiation. This photolytic HONO and NOx formation was studied with a sensitive HONO instrument under reproducible, controlled conditions at atmospheric concentrations of other trace gases. It is shown that the photolytic HONO source in the SAPHIR chamber is not caused by NO2 reactions and that it is the only direct NOy source under illuminated conditions. In addition, the photolysis of nitrate, which was recently postulated for the observed photolytic HONO formation on snow, ground, and glass surfaces, can be excluded in the chamber. A photolytic HONO source at the surface of the chamber is proposed which is strongly dependent on humidity, on light intensity, and on temperature. An empirical function describes these dependencies and reproduces the observed HONO formation rates to within 10%. It is shown that the photolysis of HONO represents the dominant radical source in the SAPHIR chamber for typical tropospheric O3/H2O concentrations. For these conditions, the HONO concentrations inside SAPHIR are similar to recent observations in ambient air.
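
    The abstract does not give the empirical function itself; purely as an illustration of how such a parameterization could be obtained, a separable ansatz in photolysis frequency, humidity, and temperature can be fit to chamber formation rates. Everything below (functional form, parameter names, synthetic data) is an assumption for illustration, not the published SAPHIR parameterization.

    ```python
    # Fit an ASSUMED separable HONO-source parameterization
    #   S = a * j_NO2 * RH**b * exp(-c / T)
    # to synthetic chamber data. Illustrative only.
    import numpy as np
    from scipy.optimize import curve_fit

    def hono_source(X, a, b, c):
        j_no2, rh, temp = X   # photolysis freq. (1/s), rel. humidity (%), T (K)
        return a * j_no2 * rh**b * np.exp(-c / temp)

    # Synthetic "measurements" standing in for chamber HONO formation rates.
    rng = np.random.default_rng(1)
    j = rng.uniform(1e-3, 8e-3, 50)
    rh = rng.uniform(5.0, 60.0, 50)
    T = rng.uniform(280.0, 305.0, 50)
    S = hono_source((j, rh, T), 2.0e11, 0.6, 7000.0) * rng.normal(1, 0.05, 50)

    popt, _ = curve_fit(hono_source, (j, rh, T), S, p0=(1e11, 0.5, 6000.0))
    print("fitted (a, b, c):", popt)
    ```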

  16. Design and construction of the SAPHIR detector

    International Nuclear Information System (INIS)

    Schwille, W.J.; Bockhorst, M.; Burbach, G.; Burgwinkel, R.; Empt, J.; Guse, B.; Haas, K.M.; Hannappel, J.; Heinloth, K.; Hey, T.; Honscheid, K.; Jahnen, T.; Jakob, H.P.; Joepen, N.; Juengst, H.; Kirch, U.; Klein, F.J.; Kostrewa, D.; Lindemann, L.; Link, J.; Manns, J.; Menze, D.; Merkel, H.; Merkel, R.; Neuerburg, W.; Paul, E.; Ploetzke, R.; Schenk, U.; Schmidt, S.; Scholmann, J.; Schuetz, P.; Schultz-Coulon, H.C.; Schweitzer, M.; Tran, M.Q.; Vogl, W.; Wedemeyer, R.; Wehnes, F.; Wisskirchen, J.; Wolf, A.

    1994-01-01

    The design, construction, and performance of the large solid angle magnetic spectrometer SAPHIR is described. It was built for the investigation of photon-induced reactions on nucleons and light nuclei with multi-particle final states up to photon energies of 3.1 GeV. The detector is equipped with a tagged photon beam facility and is operated at the stretcher ring ELSA in Bonn. (orig.)

  17. Design and construction of the SAPHIR detector

    Energy Technology Data Exchange (ETDEWEB)

    Schwille, W.J.; Bockhorst, M.; Burbach, G.; Burgwinkel, R.; Empt, J.; Guse, B.; Haas, K.M.; Hannappel, J.; Heinloth, K.; Hey, T.; Honscheid, K.; Jahnen, T.; Jakob, H.P.; Joepen, N.; Juengst, H.; Kirch, U.; Klein, F.J. (Bonn Univ. (Germany). Physikalisches Inst.)

    1994-05-15

    The design, construction, and performance of the large solid angle magnetic spectrometer SAPHIR is described. It was built for the investigation of photon-induced reactions on nucleons and light nuclei with multi-particle final states up to photon energies of 3.1 GeV. The detector is equipped with a tagged photon beam facility and is operated at the stretcher ring ELSA in Bonn. (orig.)

  18. Development of the software of the data taking system SOS for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Manns, J.

    1989-02-01

    The data acquisition system SOS has been developed for the SAPHIR experiment at the Bonn stretcher ring ELSA. It can handle up to 280 kilobytes of data per second or a maximum trigger rate of 200 Hz. The multiprocessor-based online system consists of twenty VIP microprocessors and two VAX computers. Each component of the SAPHIR experiment has at least one program in the online system to maintain special functions for this specific component. All of these programs can receive event data without interfering with the transfer of events to mass storage for offline analysis. A special program, SOL, has been developed to serve as a user interface to the data acquisition system and as a status display for most of the programs of the online system. Using modern features like windowing and mouse control on a VAXstation, the SAPHIR online program SOL establishes an easy way of controlling the data acquisition system. (orig.)

  19. Independent Verification and Validation SAPHIRE Version 8 Final Report Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-04-01

    This report provides an evaluation of the SAPHIRE version 8 software product. SAPHIRE version 8 is being developed with a phased, or cyclic iterative, rapid application development methodology. Because of this approach, a similar phased approach has been taken for the IV&V activities on each vital software object. IV&V and Software Quality Assurance (SQA) activities occur throughout the entire development life cycle and therefore will be required through the full development of SAPHIRE version 8. The later phases of the software life cycle, the operation and maintenance phases, are not applicable in this effort, since the IV&V is being done prior to releasing Version 8.

  20. The photon detection system of the SAPHIR spectrometer

    International Nuclear Information System (INIS)

    Joepen, N.

    1990-09-01

    Worldwide, a new generation of electron accelerators with energies below 5 GeV and a high duty cycle up to 100% is being built or planned. The first machine of this kind is ELSA, the Electron Stretcher and Accelerator, at the Physics Institute of Bonn University. Due to the high duty cycle of ELSA, experiments with tagged photon beams and a large angular acceptance become possible. At present SAPHIR, a new magnetic detector especially laid out to detect multi-particle final states with good accuracy, is going into operation. Besides a large arrangement of drift chambers for a good momentum resolution, and a trigger and time-of-flight counter system for particle identification, one of the main features of SAPHIR is a good photon detection capability. This is accomplished by a large electromagnetic calorimeter consisting of 98 modules covering a detection area of about 16 m² in the forward direction. For the calorimeter a brass-gas-sandwich detector was developed. Its signal wires are strung perpendicular to the converter planes. The chambers are filled with a standard gas mixture Ar/CH₄ (90:10) at atmospheric pressure and operated at a considerably high voltage in the semi-proportional mode. A sample of nine shower counter modules was tested at the electron test beam of the Bonn 2.5 GeV electron synchrotron. An energy resolution of σ(E)/E = (13.55 ± 0.6)%/√E(GeV) for a single module was achieved. The incident angle of the electrons was varied between 0 and 45 degrees. No significant change of energy resolution or linearity was observed. Combining the information from wire and cathode signals, a position resolution of σ_x = 15 mm at Φ = 0° and σ_x = 19 mm at Φ = 45° (for E = 1 GeV) was reached. The second part of this paper gives a description of the shower counter arrangement in the SAPHIR detector. It requires a sophisticated control and calibration system, whose details are presented. Furthermore, some aspects of the calorimeter calibration procedure are discussed.
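
    Reading the quoted figure as the standard stochastic scaling of a sampling calorimeter, the relative resolution at a given energy follows directly (a worked form of the number above, assuming the usual 1/√E parameterization):

    ```latex
    \[
      \frac{\sigma(E)}{E} = \frac{(13.55 \pm 0.6)\%}{\sqrt{E/\mathrm{GeV}}}
      \;\;\Longrightarrow\;\;
      \frac{\sigma(E)}{E}\bigg|_{E=1\,\mathrm{GeV}} \approx 13.6\,\%,
      \qquad
      \frac{\sigma(E)}{E}\bigg|_{E=2\,\mathrm{GeV}} \approx 9.6\,\%.
    \]
    ```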

  1. Implementation of the FASTBUS data-acquisition system in the readout of the SAPHIR detector

    International Nuclear Information System (INIS)

    Empt, J.

    1993-12-01

    The magnetic detector SAPHIR is laid out to detect multi-particle final states with good accuracy; in particular, it is designed for a good photon detection capability. Therefore a large electromagnetic calorimeter was built, consisting of 98 modules covering a detection area of about 16 m² in the forward direction. For this calorimeter a brass-gas-sandwich detector was developed with signal wires perpendicular to the converter planes. For the data acquisition of a major part of this calorimeter a modular FASTBUS system is used. In this report the FASTBUS system and its installation in the SAPHIR Online Program are described. (orig.)

  2. Track finding and track reconstruction in the internal forward drift chamber of SAPHIR

    International Nuclear Information System (INIS)

    Umlauf, G.

    1993-03-01

    A track finding algorithm has been developed for the inner forward drift chamber of the SAPHIR detector (at ELSA in Bonn) using Principal Components Analysis as a tool for interpolating track coordinates. The drift chamber consists of twelve planar layers with six different inclinations and is operated in an inhomogeneous magnetic field. The task of track finding is basically split into a primary stage that defines track candidates without the use of drift-time information, and a second stage that serves to verify the track candidates and to resolve the intrinsic left-right ambiguities of the drift chamber signals. Tracks with at most three missing signals can be found. (orig.)
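
    As a sketch of the interpolation idea named here (the general technique, not the SAPHIR implementation): Principal Components Analysis is trained on complete track coordinate vectors, so a candidate's measured coordinates can be projected onto the leading components and the coordinates in the remaining layers predicted; a large residual rejects the candidate. The layer count, track model, and names below are illustrative.

    ```python
    # PCA as a track-coordinate interpolator (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(2)
    layers = np.linspace(0.0, 1.0, 12)              # 12 planar layers
    # Simulated tracks: gently curved trajectories plus measurement noise.
    a, b, c = rng.normal(size=(3, 500, 1))
    tracks = (a + b * layers + 0.3 * c * layers**2
              + rng.normal(0.0, 0.01, (500, 12)))

    mean = tracks.mean(axis=0)
    # Leading principal components of the simulated track sample.
    _, _, vt = np.linalg.svd(tracks - mean, full_matrices=False)
    components = vt[:3]                             # 3 components ~ (a, b, c)

    def interpolate(track, known):
        """Estimate all 12 coordinates from the layers listed in `known`."""
        A = components[:, known].T                  # model restricted to known layers
        coeff, *_ = np.linalg.lstsq(A, track[known] - mean[known], rcond=None)
        return mean + coeff @ components

    candidate = tracks[0]
    est = interpolate(candidate, known=list(range(0, 12, 2)))
    print("max residual:", float(np.abs(est - candidate).max()))
    ```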

  3. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 5.0: Data loading manual. Volume 10

    International Nuclear Information System (INIS)

    VanHorn, R.L.; Wolfram, L.M.; Fowler, R.D.; Beck, S.T.; Smith, C.L.

    1995-04-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) suite of programs can be used to organize and standardize, in an electronic format, information from probabilistic risk assessments or individual plant examinations. The Models and Results Database (MAR-D) program of the SAPHIRE suite serves as the repository for probabilistic risk assessment and individual plant examination data and information. This report demonstrates by examples the common electronic and manual methods used to load these types of data. It is not a stand-alone document but references documents that provide information relevant to the data loading process. This document provides more detailed discussion and instructions for using SAPHIRE 5.0 only where enough information on a specific topic is not provided by another available source

  4. The alarm system of the SAPHIR detector

    International Nuclear Information System (INIS)

    Schultz-Coulon, H.C.

    1993-06-01

    In order to obtain effective control of the different detector components, an alarm system was built and implemented into the data acquisition system of the SAPHIR experiment. It provides an easy way of indicating errors by either adequate library calls or an appropriate hardware signal, both leading to an active alarm. This allows a direct reaction to any error detected by one of the specific control systems. In addition, for selected kinds of errors the data run can be stopped automatically. The concept and construction of this system are described and some examples of its application are given. (orig.)

  5. Construction and calibration studies of the SAPHIR scintillation counters

    International Nuclear Information System (INIS)

    Kostrewa, D.

    1988-03-01

    For the scintillation counter system of the SAPHIR detector at the stretcher ring ELSA in Bonn, 50 time-of-flight counters and 12 trigger counters have been built, each of them with two photomultipliers, one at each side. A laser calibration system with a pulsed nitrogen laser as central light source has been optimized to monitor these photomultipliers. It was used to adjust the photomultipliers and to test their long- and short-term instabilities. (orig.)

  6. Track recognition in the central drift chamber of the SAPHIR detector at ELSA and first reconstruction of real tracks

    International Nuclear Information System (INIS)

    Korn, P.

    1991-02-01

    The FORTRAN program for pattern recognition in the central drift chamber of SAPHIR has been modified in order to find tracks with more than one missing wire signal, and has been optimized in resolving the left/right ambiguities. The second part of this report deals with the reconstruction of some real tracks (γ → e+e−) which were measured with SAPHIR. The efficiency of the central drift chamber and the space-to-drift-time relation are discussed. (orig.)

  7. Development of a FASTBUS data acquisition system for the SAPHIR calorimeter

    International Nuclear Information System (INIS)

    Klein, F.J.

    1992-01-01

    Due to the high duty cycle of the new electron accelerator ELSA at the Physics Institute of Bonn University, experiments with tagged photon beams and a large angular acceptance become possible. The new magnetic detector SAPHIR is laid out to detect multi-particle final states with good accuracy; in particular, it is designed for a good photon detection capability. For this purpose a large electromagnetic calorimeter was built, consisting of 98 modules covering a detection area of about 16 m^2 in the forward direction. For this calorimeter a brass-gas-sandwich detector was developed with signal wires perpendicular to the converter planes. The chambers are filled with a standard gas mixture of Ar/CH4 (90:10) at atmospheric pressure and operated at a considerably high voltage in the semi-proportional mode. A modified shower counter module, containing 20 μm thick signal wires, was tested at the electron test beam of the Bonn 2.5 GeV electron synchrotron. An energy resolution of σ(E)/E = (12.2 ± 0.5)%/√(E/GeV) was achieved. For data acquisition a modular FASTBUS system was used, which will be installed in the SAPHIR Online Program. (orig.) [de

  8. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE's Software Verification and Validation Plan (SVVP).

  9. TOPAS 2 - a high-resolution tagging system at the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Rappenecker, G.

    1989-02-01

    For the SAPHIR arrangement in Bonn a high-resolution tagging system has been developed, achieving an energy resolution of 2 MeV and covering the range of (0.94-0.34) E_0 in photon energy (E_0 above 1.0 GeV). Different counting gases (among them ArCH4 and ArC2H6) were compared with respect to performance, cluster size and coincidence width. (orig.)

  10. A new plant chamber facility, PLUS, coupled to the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2016-03-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOCs) can be studied in detail. In PLUS all important environmental parameters (e.g., temperature, photosynthetically active radiation (PAR), soil relative humidity (RH)) are well controlled. The gas exchange volume of 9.32 m^3, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed only to fluorinated ethylene propylene (FEP) Teflon film and other Teflon surfaces to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 light-emitting diode (LED) panels, which have an emission strength up to 800 µmol m^-2 s^-1. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and the transfer rate of volatile organic compounds (VOCs) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  11. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions, adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree editors used for event tree rules, recovery rules, and end state partitioning

  12. Simulation of the beam guiding of the SAPHIR experiment by means of a differential-equation model

    International Nuclear Information System (INIS)

    Greve, T.

    1991-08-01

    This paper shows the numerical simulation of a beam line by means of a model of differential equations, simulating the beam line from the Bonn Electron Stretcher Accelerator ELSA to the SAPHIR spectrometer. Furthermore, a method for calculating the initial values based on measurements of beam profiles is discussed. (orig.) [de

  13. Contribution to the microwave characterisation of superconductive materials by means of sapphire resonators

    Energy Technology Data Exchange (ETDEWEB)

    Hanus, Xavier

    1993-12-06

    The objective of this research thesis is to find a compact resonant structure which would allow the residual surface impedance of superconductive samples to be characterised simply, quickly and economically. The author first explains why he decided to use a sapphire single-crystal as the inner dielectric, given the performance reached by resonant structures equipped with such inner dielectrics and given constraints adopted from the start. He explains the origin of the microwave losses which appear in this type of resonant structure: the surface impedance as far as metallic losses are concerned, and the sapphire dielectric loss angle as far as dielectric losses are concerned. The experimental installation and the principle of microwave measurements are described. The performance of the different possible resonant structures against the starting criteria is presented, and the solution of the sapphire-cavity with a TE_011 resonant mode is derived. [French abstract, translated:] The goal of this study is to find a compact resonant structure allowing the residual surface impedance of superconducting samples to be characterised simply, quickly and economically. The implementation constraints and the performance reached by resonators with synthetic sapphires justify the choice of such a low-loss-angle dielectric. The evaluation of the experimental performance, supported by analytical models, allows different solutions to be rejected: closed resonators with thin sapphires are rejected because of poor metallic contacts, and open resonators with thin and thick sapphires are likewise rejected, even for resonance modes that are in principle confined, because of radiation losses. The only solution is therefore to use a TE_011 sapphire-cavity, which offers a naturally confined field configuration. Measurements on a first cavity in bulk niobium made it possible to select a sapphire obtained by …

  14. The muon trigger of the SAPHIR shower detector

    International Nuclear Information System (INIS)

    Rufeger-Hurek, H.

    1989-12-01

    The muon trigger system of the SAPHIR shower counter consists of 4 scintillation counters. The total trigger rate of cosmic muons is about 55 Hz, which is reduced to about 45 Hz by the selection algorithms. This rate of clean muon events allows simultaneous monitoring of the whole electronics system and calibration of the gas-sandwich detector by measuring the gas gain. The dependence of the signals on the geometry has been simulated with the help of a Monte Carlo program. The comparison of simulated and measured pulse heights shows that faults in the electronics as well as defects in the detector hardware, e.g. in the HV system, or temperature effects can be recognized at the level of a few percent. In addition, the muon signals are used to determine the calibration factor for each cathode channel individually. (orig.) [de

  15. First measurement of the reactions γp→K+Λ and γp→K+Σ0 with SAPHIR at ELSA

    International Nuclear Information System (INIS)

    Lindemann, L.

    1993-04-01

    This report can be subdivided into two main parts. The first part concerns the reconstruction program which has been developed to analyse the data taken with the large solid angle detector SAPHIR, in operation at the Bonn electron accelerator facility ELSA. A survey of this program is given, and some improvements as well as its efficiency on real data are discussed. The second part concerns the measurements of the reactions γp → K+Λ and γp → K+Σ0. The analysis of a sample of data taken with SAPHIR in June 1992 is discussed in detail. As a result of this analysis, total and differential cross sections as well as the recoil polarization for the two processes are presented. In particular, the first measurement of the Σ0 polarization in photoproduction can be reported. (orig.)

  16. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0, technical reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; Atwood, C.L.; Galyean, W.J.; Sattison, M.B.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume provides information on the principles used in the construction and operation of Version 5.0 of the Integrated Reliability and Risk Analysis System (IRRAS) and the System Analysis and Risk Assessment (SARA) system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms that these programs use to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that are appropriate under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that these programs can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by these programs to generate random basic event probabilities from various distributions. Further references are given, and a detailed example of the reduction and quantification of a simple fault tree is provided in an appendix
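
    One of the quantification formulas such a technical reference covers is the minimal cut set upper bound for the top event probability, P(top) ≈ 1 − Π_i (1 − P(C_i)), with P(C_i) the product of the basic event probabilities in cut set C_i. A minimal sketch with invented cut sets and probabilities (the generic formula, not SAPHIRE's exact implementation):

        from math import prod

        # Basic event probabilities (hypothetical values).
        p = {"PUMP_A": 3e-3, "PUMP_B": 3e-3, "VALVE_C": 1e-4, "DG_FAIL": 5e-3}

        # Minimal cut sets of a toy fault tree: the top event occurs if all events
        # in any one cut set occur.
        cut_sets = [{"PUMP_A", "PUMP_B"}, {"VALVE_C"}, {"PUMP_A", "DG_FAIL"}]

        def cut_set_prob(cs):
            return prod(p[e] for e in cs)

        # Minimal cut set upper bound on the top event probability.
        top = 1.0 - prod(1.0 - cut_set_prob(cs) for cs in cut_sets)
        print(f"P(top) ≈ {top:.3e}")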

  17. Methods improvements incorporated into the SAPHIRE ASP models

    International Nuclear Information System (INIS)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.; Smith, C.L.; Rasmuson, D.M.

    1994-01-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methodology, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements

  18. Methods improvements incorporated into the SAPHIRE ASP models

    International Nuclear Information System (INIS)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.

    1995-01-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements

  19. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Models and Results Database (MAR-D) reference manual. Volume 8

    International Nuclear Information System (INIS)

    Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The primary function of MAR-D is to create a data repository for completed PRAs and Individual Plant Examinations (IPEs) by providing input, conversion, and output capabilities for data used by IRRAS, SARA, SETS, and FRANTIC software. As probabilistic risk assessments and individual plant examinations are submitted to the NRC for review, MAR-D can be used to convert the models and results from the study for use with IRRAS and SARA. Then, these data can be easily accessed by future studies and will be in a form that will enhance the analysis process. This reference manual provides an overview of the functions available within MAR-D and step-by-step operating instructions

  20. Particle identification by time-of-flight measurement in the SAPHIR

    International Nuclear Information System (INIS)

    Hoffmann-Rothe, P.

    1993-02-01

    Using photoproduction data measured with the SAPHIR detector with different target materials (CH2 solid, H2 liquid, D2 liquid), a detailed investigation of the detector's performance in measuring the time of flight of charged particles and separating particles of different mass has been carried out. A FORTRAN program has been written which provides a calibration of the scintillator panels of the TOF hodoscopes, calculates correction factors for the time-walk effect and, finally, by combining the time of flight with the track momentum measurement, determines particle masses. The current configuration of the detector makes it possible to separate protons from pions up to a particle momentum of 1.6 GeV/c. Protons and kaons can be separated up to a momentum of 1.3 GeV/c, and kaons and pions up to a momentum of 0.85 GeV/c. (orig.) [de
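
    The mass determination described here combines momentum p, flight path L and time of flight t through m = p·√(1/β² − 1) with β = L/(c·t). A minimal sketch with illustrative numbers (the flight path and timing below are assumptions, not the SAPHIR values):

        import math

        C = 0.299792458  # speed of light in m/ns

        def mass_gev(p_gev: float, path_m: float, tof_ns: float) -> float:
            """Particle mass in GeV/c^2 from momentum (GeV/c), flight path (m) and time of flight (ns)."""
            beta = path_m / (C * tof_ns)
            if beta >= 1.0:
                raise ValueError("unphysical time of flight")
            return p_gev * math.sqrt(1.0 / beta**2 - 1.0)

        # A 1 GeV/c proton over a 3 m flight path needs about 13.7 ns:
        print(mass_gev(1.0, 3.0, 13.7))   # ≈ 0.94 GeV/c^2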

  1. Development and verification of a Leningrad NPP Unit 1 Living PSA model in the INL SAPHIRE code format for prompt operational safety level monitoring

    International Nuclear Information System (INIS)

    Bronislav, Vinnikov

    2007-01-01

    The first part of the paper presents results of work that was carried out in complete conformity with the Technical Assignment developed by the Leningrad Nuclear Power Plant. The initial scientific and technical information, contained in the In-Depth Safety Assessment Reports, was given to the author. This information included graphical fault trees of safety systems and auxiliary technical systems, event trees for the necessary number of initiating events, and information about the failure probabilities of basic components of the nuclear unit. On the basis of this information, fed into the US Idaho National Laboratory (INL) SAPHIRE code, we developed an electronic version of the database of failure probabilities of the components of technical systems. We then developed electronic versions of the necessary fault trees and event trees, and finally carried out the linkage of the event trees. This work has resulted in the Living PSA (LPSA - Living Probabilistic Safety Assessment) model of Leningrad NPP Unit 1. The LPSA model is completely adapted to be consistent with the INL SAPHIRE Risk Monitor. The second part of the paper presents an analysis of fire consequences in various places of Leningrad NPP Unit 1. The computations were carried out with the help of the LPSA model developed in the SAPHIRE code format. On the basis of the computations, the order of priority for the implementation of fire prevention measures was established. (author)

  2. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0. Volume 5, Systems Analysis and Risk Assessment (SARA) tutorial manual

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs) primarily for nuclear power plants. This volume is the tutorial manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons is provided that guides the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another and, in combination, lead the user through all aspects of SARA's sensitivity analysis capabilities

  3. Structural Analysis of PTM Hotspots (SAPH-ire) – A Quantitative Informatics Method Enabling the Discovery of Novel Regulatory Elements in Protein Families*

    Science.gov (United States)

    Dewhurst, Henry M.; Choudhury, Shilpa; Torres, Matthew P.

    2015-01-01

    Predicting the biological function potential of post-translational modifications (PTMs) is becoming increasingly important in light of the exponential increase in available PTM data from high-throughput proteomics. We developed structural analysis of PTM hotspots (SAPH-ire)—a quantitative PTM ranking method that integrates experimental PTM observations, sequence conservation, protein structure, and interaction data to allow rank order comparisons within or between protein families. Here, we applied SAPH-ire to the study of PTMs in diverse G protein families, a conserved and ubiquitous class of proteins essential for maintenance of intracellular structure (tubulins) and signal transduction (large and small Ras-like G proteins). A total of 1728 experimentally verified PTMs from eight unique G protein families were clustered into 451 unique hotspots, 51 of which have a known and cited biological function or response. Using customized software, the hotspots were analyzed in the context of 598 unique protein structures. By comparing distributions of hotspots with known versus unknown function, we show that SAPH-ire analysis is predictive for PTM biological function. Notably, SAPH-ire revealed high-ranking hotspots for which a functional impact has not yet been determined, including phosphorylation hotspots in the N-terminal tails of G protein gamma subunits—conserved protein structures never before reported as regulators of G protein coupled receptor signaling. To validate this prediction we used the yeast model system for G protein coupled receptor signaling, revealing that gamma subunit–N-terminal tail phosphorylation is activated in response to G protein coupled receptor stimulation and regulates protein stability in vivo. These results demonstrate the utility of integrating protein structural and sequence features into PTM prioritization schemes that can improve the analysis and functional power of modification-specific proteomics data. PMID:26070665

  4. TOPAS 1 - construction and test of a scintillation counter hodoscope for the tagging of bremsstrahlung photons for the SAPHIR detector

    International Nuclear Information System (INIS)

    Merkel, R.

    1989-09-01

    The development of a tagging hodoscope for the SAPHIR detector at the stretcher ring ELSA in Bonn is described. The hodoscope covers photon energies from 2.175 GeV upwards at an electron energy of E_0 = 3.500 GeV. 24 scintillation counters are used for the determination of the photon energy, giving a resolution of ΔE_γ = 25 MeV. The tagging method requires a good coincidence timing resolution τ between the tagging hodoscope and the detector for the photon-induced reactions in order to keep the accidental coincidences low. The timing information is given by 8 fast timing counters (40 mm thick), each covering 5 to 7 energy channels. Fluctuations of the timing signal which result from different impact locations on the timing counter, due to different light travelling distances, are corrected by the energy-defining counters. The timing component (8 timing counters) is completed and tested. The results of first measurements show an upper limit of σ = 250 ps for the resolution of 7 coincidences out of 45 possible channels in the tagging hodoscope. These results were obtained with a preliminary adjustment of the SAPHIR beam line and with a not yet optimized signal-to-noise ratio in the extracted beam. We hope to obtain σ < 200 ps under optimized conditions. (orig.)

  5. Measurement of the reaction γd → pnπ+π− at SAPHIR and investigation of the decay angular distribution of the Δ++(1232) resonance

    Energy Technology Data Exchange (ETDEWEB)

    Schuetz, P.

    1993-03-01

    SAPHIR, a new experiment at the Bonn electron stretcher ring ELSA, started taking data in spring 1992. It was set up for the investigation of photon-induced reactions with multiparticle final states. In the first part of this paper the special design of the target is described. It can be operated with liquefied hydrogen or deuterium and is placed in the middle of the central drift chamber. To protect the surrounding chamber in case of a fracture of the target cell, a safety system is installed. In addition, two independent methods of monitoring the cell are provided. The first measurement was performed with a deuterium target at a photon energy range of E_γ = 500-700 MeV. In the second part of this paper first results of an analysis of the decay angular distribution of the Δ++(1232) in the reaction γd → nΔ++π− are presented. They are compared to old data from a hydrogen bubble chamber experiment and are discussed on the basis of a spectator model. (orig.) [German abstract, translated:] This thesis describes the construction of a liquid-gas target developed specifically for use in the SAPHIR detector. Functions for monitoring the target cell are presented, together with a safety system protecting the central drift chamber that immediately surrounds the target. Simulation calculations were furthermore used to investigate the influence the design of the target scattering vessel can have on the measurement of different reactions; in 50% to 70% of the events, hits occurred in the aluminium supports of the vessel. This strong impairment can be avoided by a redesign of the vessel using, e.g., Rohazell as the vessel window; Rohazell combines high rigidity with a large radiation length. The redesign of the scattering vessel is currently in progress. The second part of the thesis describes one of the first …

  6. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2010-01-01

    NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytical methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.
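
    Such intercomparisons reduce to a linear fit of one instrument's readings against another's, with the slope's deviation from unity measuring the relative calibration bias. A minimal sketch on synthetic data (not the campaign data):

        import numpy as np

        rng = np.random.default_rng(1)

        no2_ref = rng.uniform(0, 75, 200)                    # reference instrument, ppbv
        no2_test = 1.02 * no2_ref + rng.normal(0, 0.5, 200)  # second instrument with a 2% bias

        slope, intercept = np.polyfit(no2_ref, no2_test, 1)
        r2 = np.corrcoef(no2_ref, no2_test)[0, 1] ** 2
        print(f"slope = {slope:.3f}, intercept = {intercept:.2f} ppbv, R^2 = {r2:.3f}")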

  7. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) Version 5.0. Fault tree, event tree, and piping ampersand instrumentation diagram (FEP) editors reference manual: Volume 7

    International Nuclear Information System (INIS)

    McKay, M.K.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors allow the user to graphically build and edit fault trees, event trees, and piping and instrumentation diagrams (P&IDs). The software is designed to enable the independent use of the graphical-based editors found in the Integrated Reliability and Risk Assessment System (IRRAS). FEP is comprised of three separate editors (Fault Tree, Event Tree, and Piping and Instrumentation Diagram) and a utility module. This reference manual provides a screen-by-screen guide to the entire FEP System

  8. Installing and Setting Up Git Software Tool on Windows | High-Performance Computing | NREL

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get git installed on Windows 7, and how to get things set up on NREL's …

  9. Simulation calculations on the construction of the energy-tagged photon beam as well as development and test of the side drift chambers of the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Jahnen, T.

    1990-01-01

    The SAPHIR detector is built up at the continuous photon beam of the Electron Stretcher and Accelerator ELSA in Bonn. The equipment is designed for investigations of reactions with more than two particles in the final state and for photon energies up to 3.5 GeV. A tagging system determines the energy of the bremsstrahlung photons, and a set-up of five large drift chambers measures the tracks of the charged particles. This work describes a program which was used to develop the best design of the tagging hodoscope. In a second part, the tests of the planar side chambers and their evaluation are described. These measurements were carried out to fix the gas filling and the parameters of the best working point. It is shown that the chambers can reach a resolution of σ ≤ 200 μm. (orig.) [de

  10. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2012-07-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s^-1 and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10^6 cm^-3 and a linear correlation coefficient of R^2 = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints of artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30-40% (median) larger than those by DOAS after MVK (20 ppbv) and …

  11. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
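
    As a hedged sketch of one of the analyses mentioned, unreachable code can be flagged by building a control-flow graph over the instructions and marking everything reachable from the entry point. The three-opcode toy "assembler" below is an assumption for illustration; the real tool set targets Intel 8086 and Motorola 68000 assembly:

        # Toy program: (label, opcode, argument); JMP is unconditional, JZ conditional.
        program = [
            ("start", "MOV", "ax,1"),
            (None,    "JMP", "end"),
            ("dead",  "ADD", "ax,2"),   # never reached
            ("end",   "RET", None),
        ]

        labels = {lab: i for i, (lab, _, _) in enumerate(program) if lab}

        def successors(i):
            _, op, arg = program[i]
            if op == "JMP":
                return [labels[arg]]
            succ = [i + 1] if i + 1 < len(program) and op != "RET" else []
            if op == "JZ":
                succ.append(labels[arg])
            return succ

        # Depth-first search from the entry point marks reachable instructions.
        reachable, stack = set(), [0]
        while stack:
            i = stack.pop()
            if i not in reachable:
                reachable.add(i)
                stack.extend(successors(i))

        for i, instr in enumerate(program):
            if i not in reachable:
                print("unreachable:", instr)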

  12. Measurement of the reaction γp → K0Σ+ for photon energies up to 2.65 GeV with the SAPHIR detector at ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Lawall, R.

    2004-01-01

    The reaction γp → K0Σ+ was measured with the SAPHIR detector at ELSA during the run periods 1997 and 1998. Results were obtained for cross sections in the photon energy range from threshold up to 2.65 GeV for all production angles, and for the Σ+ polarization. Emphasis has been put on the determination and reduction of the contributions of background reactions, and on the comparison with other measurements and theoretical predictions. (orig.)

  13. MCM generator: a Java-based tool for generating medical metadata.

    Science.gov (United States)

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need for a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as image, movie, and sound files.
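
    A hedged sketch of the kind of output such a generator produces: HTML META tags carrying Dublin Core elements, with a MeSH term as the subject. The element names follow the Dublin Core convention; the field values and the helper function are invented for illustration:

        from html import escape

        def dublin_core_meta(fields: dict) -> str:
            """Render Dublin Core metadata fields as HTML META tags."""
            return "\n".join(
                f'<meta name="DC.{escape(name)}" content="{escape(value)}">'
                for name, value in fields.items()
            )

        print(dublin_core_meta({
            "Title": "Management of Type 2 Diabetes",
            "Subject": "Diabetes Mellitus, Type 2",   # MeSH heading
            "Type": "Text.Clinical Guideline",
            "Creator": "Example Author",
        }))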

  14. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    International Nuclear Information System (INIS)

    Faghihi, F.; Ramezani, E.; Yousefpour, F.; Mirvakili, S.M.

    2008-01-01

    The main goal of this review paper is to analyze the total frequency of core damage of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems towards improving its design and operation. The PSA has been considered for the full-power state of the reactor, and this article presents a level-1 PSA analysis using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software. It is specifically designed to permit a listing of the potential accident sequences, compute their frequencies of occurrence and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the large fault tree/small event tree method. This level-1 PSA for IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation

  15. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    Energy Technology Data Exchange (ETDEWEB)

    Faghihi, F. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Research Center for Radiation Protection, Shiraz University, Shiraz (Iran, Islamic Republic of); Nuclear Safety Research Center, Shiraz University, Shiraz (Iran, Islamic Republic of)], E-mail: faghihif@shirazu.ac.ir; Ramezani, E. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Yousefpour, F. [Atomic Energy Organization of Iran (AEOI), Tehran (Iran, Islamic Republic of); Mirvakili, S.M. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of)

    2008-10-15

    The main goal of this review paper is to analyze the total frequency of core damage of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems towards improving its design and operation. The PSA has been considered for the full-power state of the reactor, and this article presents a level-1 PSA analysis using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software. It is specifically designed to permit a listing of the potential accident sequences, compute their frequencies of occurrence and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the large fault tree/small event tree method. This level-1 PSA for IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.

  16. Measurement of the reaction γd → pnπ+π− at SAPHIR and investigation of the decay angular distribution of the Δ++(1232) resonance

    International Nuclear Information System (INIS)

    Schuetz, P.

    1993-03-01

    SAPHIR, a new experiment at the Bonn electron stretcher ring ELSA, started taking data in spring 1992. It was set up for the investigation of photon-induced reactions with multiparticle final states. In the first part of this paper the special design of the target is described. It can be operated with liquefied hydrogen or deuterium and is placed in the middle of the central drift chamber. To protect the surrounding chamber in case of a fracture of the target cell, a safety system is installed. In addition, two independent methods of monitoring the cell are provided. The first measurement was performed with a deuterium target at a photon energy range of E_γ = 500-700 MeV. In the second part of this paper first results of an analysis of the decay angular distribution of the Δ++(1232) in the reaction γd → nΔ++π− are presented. They are compared to old data from a hydrogen bubble chamber experiment and are discussed on the basis of a spectator model. (orig.) [de

  17. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

    The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV over the full kinematical range. A considerable improvement of the drift chamber system was a precondition for attaining the necessary data rate in an acceptable time. The work focuses on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influence of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and differential cross sections of Φ meson production close to threshold were evaluated. (orig.)

  18. Nutrition screening tools: Does one size fit all? A systematic review of screening tools for the hospital setting

    NARCIS (Netherlands)

    van Bokhorst-de van der Schueren, M.A.E.; Guaitoli, P.R.; Jansma, E.P.; de Vet, H.C.W.

    2014-01-01

    Background & aims: Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. Methods: A systematic review of …

  19. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    Science.gov (United States)

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. A systematic review was performed of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for the hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity for outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well in establishing the patients' nutritional status. For the elderly, MNA performed fair to good; for adults, MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. Not one single screening or assessment tool is capable of adequate nutrition screening as well as predicting poor nutrition-related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required.

  20. A validated set of tool pictures with matched objects and non-objects for laterality research.

    Science.gov (United States)

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  1. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
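
    For intuition about the geometry behind the method's name: principal angles between two subspaces can be computed from the singular values of the product of their orthonormal bases. A generic numpy sketch on toy data, not the PAEA implementation:

        import numpy as np

        rng = np.random.default_rng(2)

        # Two subspaces of R^50, spanned by the columns of A and B (toy data).
        A = rng.normal(size=(50, 3))
        B = rng.normal(size=(50, 4))

        # Orthonormal bases via QR, then principal angles from singular values.
        Qa, _ = np.linalg.qr(A)
        Qb, _ = np.linalg.qr(B)
        s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
        angles = np.arccos(np.clip(s, -1.0, 1.0))  # radians, smallest angle first

        print(np.degrees(angles))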

  2. Validating the WHO maternal near miss tool: comparing high- and low-resource settings.

    Science.gov (United States)

    Witteveen, Tom; Bezstarosti, Hans; de Koning, Ilona; Nelissen, Ellen; Bloemenkamp, Kitty W; van Roosmalen, Jos; van den Akker, Thomas

    2017-06-19

    WHO proposed the WHO Maternal Near Miss (MNM) tool, classifying women according to several (potentially) life-threatening conditions, to monitor and improve the quality of obstetric care. The objective of this study is to analyse merged data from one high- and two low-resource settings where this tool was applied, and to test whether the tool may be suitable for comparing severe maternal outcome (SMO) between these settings. Using three cohort studies that included SMO cases during two-year time frames in the Netherlands, Tanzania and Malawi, we reassessed all SMO cases (as defined by the original studies) with the WHO MNM tool (five disease-based, four intervention-based and seven organ dysfunction-based criteria). Main outcome measures were prevalence of MNM criteria and case fatality rates (CFR). A total of 3172 women were studied: 2538 (80.0%) from the Netherlands, 248 (7.8%) from Tanzania and 386 (12.2%) from Malawi. Total SMO detection was 2767 (87.2%) for disease-based criteria, 2504 (78.9%) for intervention-based criteria and 1211 (38.2%) for organ dysfunction-based criteria. Including every woman who received ≥1 unit of blood in low-resource settings as life-threatening, as defined by organ dysfunction criteria, led to more equally distributed populations. In one third of all Dutch and Malawian maternal death cases, organ dysfunction criteria could not be identified from medical records. Applying solely organ dysfunction-based criteria may lead to underreporting of SMO. Therefore, a tool based on defining MNM only upon establishing organ failure is of limited use for comparing settings with varying resources. In low-resource settings, lowering the threshold of transfused units of blood leads to a higher detection rate of MNM. We recommend refined disease-based criteria, accompanied by a limited set of intervention- and organ dysfunction-based criteria, to set a measure of severity.

  3. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) project, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  4. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Directory of Open Access Journals (Sweden)

    A. W. Rollins

    2012-11-01

    N2O5 detection in the atmosphere has been accomplished using techniques which have been developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected by either cavity-based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios were simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternatively. Experiments focused on the investigation of potential interferences from, e.g., water vapour or aerosol, and on the investigation of the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (3 cavity ring-down (CRDS) instruments, 2 laser-induced fluorescence (LIF) instruments). The datasets are highly correlated, as indicated by the square of the linear correlation coefficient, R^2, whose values were larger than 0.96 for the entire datasets. N2O5 mixing ratios agree well within the combined accuracy of the measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changing is necessary to quantitatively sample N2O5 in some environments. The analysis of the data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides the effect of a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interference …

  5. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists is to compare lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene , with the web application available at https://asntech.shinyapps.io/intervene .
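
    The core computation behind the pairwise module is a matrix of pairwise intersection sizes over many sets. A hedged sketch of that computation on invented gene lists, independent of the tool's actual command line:

        from itertools import combinations

        # Toy gene lists (hypothetical); a pairwise-intersection matrix like this is
        # what gets rendered as a clustered heat map.
        sets = {
            "ChIP_A": {"TP53", "MYC", "EGFR", "BRCA1"},
            "ChIP_B": {"MYC", "EGFR", "KRAS"},
            "RNA_up": {"EGFR", "BRCA1", "KRAS", "PTEN"},
        }

        for (n1, s1), (n2, s2) in combinations(sets.items(), 2):
            print(f"{n1} ∩ {n2}: {len(s1 & s2)} genes -> {sorted(s1 & s2)}")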

  6. Total OH reactivity study from VOC photochemical oxidation in the SAPHIR chamber

    Science.gov (United States)

    Yu, Z.; Tillmann, R.; Hohaus, T.; Fuchs, H.; Novelli, A.; Wegener, R.; Kaminski, M.; Schmitt, S. H.; Wahner, A.; Kiendler-Scharr, A.

    2015-12-01

    It is well known that hydroxyl radicals (OH) act as the dominant reactive species in the degradation of VOCs in the atmosphere. In recent field studies, directly measured total OH reactivity often showed poor agreement with OH reactivity calculated from VOC measurements (e.g. Nölscher et al., 2013; Lu et al., 2012a). This "missing OH reactivity" is attributed to unaccounted biogenic VOC emissions and/or oxidation products. The comparison of total OH reactivity measured directly with that calculated from single-component measurements of VOCs and their oxidation products gives us further understanding of the sources of unmeasured reactive species in the atmosphere. It also allows determining the magnitude of the contributions of primary VOC emissions and their oxidation products to the missing OH reactivity. A series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, to explore in detail the photochemical degradation of VOCs (isoprene, β-pinene, limonene, and D6-benzene) by OH. The total OH reactivity was determined from the measurement of VOCs and their oxidation products by a Proton Transfer Reaction Time of Flight Mass Spectrometer (PTR-TOF-MS) with a GC/MS/FID system, and measured directly by laser-induced fluorescence (LIF) at the same time. The comparison between these two total OH reactivity measurements showed an increase in missing OH reactivity in the presence of oxidation products of VOCs, indicating a strong contribution of uncharacterized oxidation products to the missing OH reactivity.
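
    For readers unfamiliar with how the calculated total OH reactivity in such comparisons is obtained, the sketch below shows the underlying sum kOH = Σ k(OH+Xi)·[Xi]. The rate constants are approximate room-temperature literature values and the concentrations are hypothetical, so this is a schematic illustration rather than the campaign's analysis code.

    ```python
    # Schematic calculation of total OH reactivity from measured reactants:
    # k_OH = sum_i k_(OH+X_i) * [X_i], yielding units of s^-1.
    RATE_CONSTANTS = {      # cm^3 molecule^-1 s^-1, approximate values
        "isoprene": 1.0e-10,
        "CO":       2.4e-13,
        "CH4":      6.4e-15,
    }

    def calculated_oh_reactivity(concentrations):
        """Sum k_i * [X_i] over all species with known OH rate constants."""
        return sum(RATE_CONSTANTS[s] * c for s, c in concentrations.items())

    # Hypothetical mixing state, concentrations in molecules cm^-3.
    conc = {"isoprene": 2.5e10, "CO": 5.0e12, "CH4": 4.5e13}
    print(f"calculated OH reactivity: {calculated_oh_reactivity(conc):.2f} s^-1")
    # The "missing reactivity" is the directly measured k_OH minus this sum.
    ```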

  7. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    Science.gov (United States)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser-based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from them. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  8. Tools and approaches for simplifying serious games development in educational settings

    OpenAIRE

    Calvo, Antonio; Rotaru, Dan C.; Freire, Manuel; Fernandez-Manjon, Baltasar

    2016-01-01

    Serious Games can benefit from the commercial video games industry by taking advantage of current development tools. However, the economics and requirements of serious games and commercial games are very different. In this paper, we describe the factors that impact the total cost of ownership of serious games used in educational settings, review the specific requirements of games used as learning material, and analyze the different development tools available in the industry highlighting thei...

  9. A Python tool to set up relative free energy calculations in GROMACS.

    Science.gov (United States)

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
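
    The common-substructure step mentioned above can be illustrated with RDKit (assumed to be installed); this is a generic sketch of MCS-based atom mapping, not the actual code of alchemical-setup.py or LOMAP, and the two ligands are hypothetical.

    ```python
    # Generic sketch of the maximum-common-substructure (MCS) mapping that
    # relative free energy setups require; not alchemical-setup.py itself.
    from rdkit import Chem
    from rdkit.Chem import rdFMCS

    lig_a = Chem.MolFromSmiles("c1ccccc1CCO")  # hypothetical ligand A
    lig_b = Chem.MolFromSmiles("c1ccccc1CCN")  # hypothetical ligand B

    mcs = rdFMCS.FindMCS([lig_a, lig_b])          # shared core of both end states
    core = Chem.MolFromSmarts(mcs.smartsString)

    # Pair up core atom indices in A with those in B; these atoms would be
    # kept (not perturbed) in the alchemical transformation.
    atom_map = list(zip(lig_a.GetSubstructMatch(core),
                        lig_b.GetSubstructMatch(core)))
    print(f"{mcs.numAtoms} common atoms; index map A->B: {atom_map}")
    ```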

  10. Evaluating online diagnostic decision support tools for the clinical setting.

    Science.gov (United States)

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  11. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, which was conducted during the summer and fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make “non-biased” information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on other vendors’ IT equipment. However, the research found two companies with products that showed promise in performing automated assessments of IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple three-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors’ software in two different data centers with the objective to prove the concept, ascertain the extent of energy and computational assessment, ease of installation and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors – JouleX (expected to be completed in 2012) and Sentilla.

  12. Am I getting an accurate picture: a tool to assess clinical handover in remote settings?

    Directory of Open Access Journals (Sweden)

    Malcolm Moore

    2017-11-01

    Full Text Available Abstract Background Good clinical handover is critical to safe medical care. Little research has investigated handover in rural settings. In a remote setting where nurses and medical students give telephone handover to an aeromedical retrieval service, we developed a tool by which the receiving clinician might assess the handover, and investigated factors impacting on the reliability and validity of that assessment. Methods Researchers consulted with clinicians to develop an assessment tool, based on the ISBAR handover framework, combining validity evidence and the existing literature. The tool was applied ‘live’ by receiving clinicians and from recorded handovers by academic assessors. The tool’s performance was analysed using generalisability theory. Receiving clinicians and assessors provided feedback. Results Reliability for assessing a call was good (G = 0.73 with 4 assessments). The scale had a single-factor structure with good internal consistency (Cronbach’s alpha = 0.8). The group mean for the global score for nurses and students was 2.30 (SD 0.85) out of a maximum of 3.0, with no difference between these sub-groups. Conclusions We have developed and evaluated a tool to assess high-stakes handover in a remote setting. It showed good reliability and was easy for working clinicians to use. Further investigation and use are warranted beyond this setting.

  13. Instructor's Perceptions towards the Use of an Online Instructional Tool in an Academic English Setting in Kuwait

    Science.gov (United States)

    Erguvan, Deniz

    2014-01-01

    This study sets out to explore the faculty members' perceptions of a specific web-based instruction tool (Achieve3000) in a private higher education institute in Kuwait. The online tool provides highly differentiated instruction, which is initiated with a level set at the beginning of the term. The program is used in two consecutive courses as…

  14. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    Science.gov (United States)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools adequately assist consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. of how closely decisions made with a tool match consumers' stated preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights into the utility of both support tools. Although the cogito tool obtained slightly higher decision accuracy, both tools could benefit from additional enhancements. Details of the procedure developed and the results obtained from the evaluation are provided. Opportunities for future work are also discussed.
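
    As a rough illustration of the rough-set machinery involved, the toy sketch below brackets a target set of "accurate" decisions between its lower and upper approximations; the partition, user IDs and target set are invented, and the paper's actual procedure is more elaborate.

    ```python
    # Toy rough-set sketch: lower/upper approximations of a target set
    # with respect to an indiscernibility partition of users.
    def lower_upper(partition, target):
        lower, upper = set(), set()
        for block in partition:
            if block <= target:   # block lies entirely inside the target
                lower |= block
            if block & target:    # block at least touches the target
                upper |= block
        return lower, upper

    # Users who behaved indistinguishably in the tool form one block.
    partition = [{"u1", "u2"}, {"u3"}, {"u4", "u5"}]
    target = {"u1", "u2", "u4"}   # users whose choices matched their preferences

    low, up = lower_upper(partition, target)
    print("certainly accurate:", low)  # {'u1', 'u2'}
    print("possibly accurate:", up)    # {'u1', 'u2', 'u4', 'u5'}
    ```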

  15. A set of tools for determining the LAT performance in specific applications

    International Nuclear Information System (INIS)

    Lott, B.; Ballet, J.; Chiang, J.; Lonjou, V.; Funk, S.

    2007-01-01

    The poster presents a set of simple tools being developed to predict GLAST's performance for specific cases, such as the accumulation time needed to reach a given significance or statistical accuracy for a particular source. Different examples are given, such as the generation of a full-sky sensitivity map.

  16. Follow up: Compound data sets and software tools for chemoinformatics and medicinal chemistry applications: update and data transfer

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2014-01-01

    In 2012, we reported 30 compound data sets and/or programs developed in our laboratory in a data article and made them freely available to the scientific community to support chemoinformatics and computational medicinal chemistry applications. These data sets and computational tools were provided for download from our website. Since publication of this data article, we have generated 13 new data sets with which we further extend our collection of publicly available data and tools. Due to changes in web servers and website architectures, data accessibility has recently been limited at times. Therefore, we have also transferred our data sets and tools to a public repository to ensure full and stable accessibility. To aid in data selection, we have classified the data sets according to scientific subject areas. Herein, we describe new data sets, introduce the data organization scheme, summarize the database content and provide detailed access information in ZENODO (doi:10.5281/zenodo.8451 and doi:10.5281/zenodo.8455). PMID:25520777

  17. Older adult mistreatment risk screening: contribution to the validation of a screening tool in a domestic setting.

    Science.gov (United States)

    Lindenbach, Jeannette M; Larocque, Sylvie; Lavoie, Anne-Marise; Garceau, Marie-Luce

    2012-06-01

    The hidden nature of older adult mistreatment renders its detection in the domestic setting particularly challenging. A validated screening instrument that can provide a systematic assessment of risk factors can facilitate this detection. One such instrument, the "expanded Indicators of Abuse" tool, has been previously validated in the Hebrew language in a hospital setting. The present study has contributed to the validation of the "e-IOA" in an English-speaking community setting in Ontario, Canada. It consisted of two phases: (a) a content validity review and adaptation of the instrument by experts throughout Ontario, and (b) an inter-rater reliability assessment by home visiting nurses. The adaptation, the "Mistreatment of Older Adult Risk Factors" tool, offers a comprehensive tool for screening in the home setting. This instrument is significant to professional practice as practitioners working with older adults will be better equipped to assess for risk of mistreatment.

  18. Comparison of OH Reactivity Instruments in the Atmosphere Simulation Chamber SAPHIR.

    Science.gov (United States)

    Fuchs, H.; Novelli, A.; Rolletter, M.; Hofzumahaus, A.; Pfannerstill, E.; Edtbauer, A.; Kessel, S.; Williams, J.; Michoud, V.; Dusanter, S.; Locoge, N.; Zannoni, N.; Gros, V.; Truong, F.; Sarda Esteve, R.; Cryer, D. R.; Brumby, C.; Whalley, L.; Stone, D. J.; Seakins, P. W.; Heard, D. E.; Schoemaecker, C.; Blocquet, M.; Fittschen, C. M.; Thames, A. B.; Coudert, S.; Brune, W. H.; Batut, S.; Tatum Ernest, C.; Harder, H.; Elste, T.; Bohn, B.; Hohaus, T.; Holland, F.; Muller, J. B. A.; Li, X.; Rohrer, F.; Kubistin, D.; Kiendler-Scharr, A.; Tillmann, R.; Andres, S.; Wegener, R.; Yu, Z.; Zou, Q.; Wahner, A.

    2017-12-01

    Two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016 to compare hydroxyl (OH) radical reactivity (kOH) measurements. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapor, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements is higher for instruments directly detecting hydroxyl radicals (OH), whereas the indirect Comparative Reactivity Method (CRM) has a higher limit of detection of 2 s-1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapor or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences from the reference were observed by CRM instruments in the presence of terpenes and oxygenated organic compounds. In some of these experiments, only a small fraction of the reactivity is detected. The accuracy of CRM measurements is most likely limited by the corrections that need to be applied in order to account for known effects of, for example, deviations from pseudo-first order conditions, nitrogen oxides or water vapor on the measurement

  19. Improving beam set-up using an online beam optics tool

    International Nuclear Information System (INIS)

    Richter, S.; Barth, W.; Franczak, B.; Scheeler, U.; Wilms, D.

    2004-01-01

    The GSI accelerator facility [1] consists of the Universal Linear Accelerator (Unilac), the heavy ion synchrotron SIS, and the Experimental Storage Ring (ESR). Two Unilac injectors with three ion source terminals provide ion species from the lightest, such as hydrogen, up to uranium. The High Current Injector (HSI) for low charge state ion beams mostly provides highly intense but short pulses, whereas the High Charge State Injector (HLI) supplies long pulses with a high duty factor of up to 27%. Before entering the Alvarez section of the Unilac, the ion beam from the HSI is stripped in a supersonic gas jet. Up to three different ion species can be accelerated for up to five experiments in a time-sharing mode. Frequent changes of beam energy and intensity during a single beam time period may result in time-consuming set-up and tuning, especially of the beam transport lines. To shorten these changeover times, an online optics tool (MIRKO EXPERT) has been developed. Based on online emittance measurements at well-defined locations, the beam envelopes are calculated using the actual magnet settings. With this input, improved calculated magnet settings can be sent directly to the magnet power supplies. The program reads profile grid measurements, such that an automated beam alignment is established and steering times are minimized. Experiences with this tool will be reported. At the Unilac, a special focus is put on high current operation with short but intense beam pulses. Limitations like missing non-destructive beam diagnostics, insufficient longitudinal beam diagnostics, insufficient longitudinal beam matching, and the influence of the hard-edge model for magnetic fields will be discussed. Special attention will be paid to the limits due to high current effects with bunched beams. (author)

  20. Mathematical tools for data mining set theory, partial orders, combinatorics

    CERN Document Server

    Simovici, Dan A

    2014-01-01

    Data mining essentially relies on several mathematical disciplines, many of which are presented in this second edition of the book. Topics include partially ordered sets, combinatorics, general topology, metric spaces, linear spaces, and graph theory. To motivate the reader, a significant number of applications of these mathematical tools are included, ranging from association rules and clustering algorithms to classification, data constraints, and logical data analysis. The book is intended as a reference for researchers and graduate students. The current edition is a significant expansion of the first.

  1. Using Plickers as an Assessment Tool in Health and Physical Education Settings

    Science.gov (United States)

    Chng, Lena; Gurvitch, Rachel

    2018-01-01

    Written tests are one of the most common assessment tools classroom teachers use today. Despite their popularity, administering written tests or surveys, especially in health and physical education settings, is time-consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…

  2. Zebrafish Expression Ontology of Gene Sets (ZEOGS): A Tool to Analyze Enrichment of Zebrafish Anatomical Terms in Large Gene Sets

    Science.gov (United States)

    Marsico, Annalisa

    2013-01-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few to no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene

  3. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few to no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression
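
    The overrepresentation question ZEOGS answers is commonly posed as a hypergeometric test; the sketch below (using SciPy) illustrates that idea with invented counts, without claiming that ZEOGS computes its statistics exactly this way.

    ```python
    # Hypergeometric overrepresentation test for one anatomical term.
    from scipy.stats import hypergeom

    def term_enrichment_p(total_genes, term_genes, set_size, overlap):
        """P(overlap >= observed) when drawing set_size genes at random."""
        return hypergeom.sf(overlap - 1, total_genes, term_genes, set_size)

    # Invented numbers: 20,000 annotated genes, 150 expressed in a term,
    # and 12 of our 300 input genes annotated to it (expected ~2.25).
    p = term_enrichment_p(20_000, 150, 300, 12)
    print(f"enrichment p-value: {p:.3e}")  # small p => term overrepresented
    ```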

  4. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  5. Workplace wellness using online learning tools in a healthcare setting.

    Science.gov (United States)

    Blake, Holly; Gartshore, Emily

    2016-09-01

    The aim was to develop and evaluate an online learning tool for use with UK healthcare employees, healthcare educators and healthcare students, to increase knowledge of workplace wellness as an important public health issue. A ‘Workplace Wellness’ e-learning tool was developed and peer-reviewed by 14 topic experts. This focused on six key areas relating to workplace wellness: work-related stress, musculoskeletal disorders, diet and nutrition, physical activity, smoking and alcohol consumption. Each key area provided current evidence-based information on causes and consequences, access to UK government reports and national statistics, and guidance on actions that could be taken to improve health within a workplace setting. 188 users (93.1% female, age 18-60) completed online knowledge questionnaires before (n = 188) and after (n = 88) exposure to the online learning tool. Baseline knowledge of workplace wellness was poor (n = 188; mean accuracy 47.6%, s.d. 11.94). Knowledge significantly improved from baseline to post-intervention (mean accuracy = 77.5%, s.d. 13.71; t(75) = -14.801, p < 0.001). Participants responded positively to online learning, indicating scope for development of further online packages relating to other important health parameters. Copyright © 2016 Elsevier Ltd. All rights reserved.
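
    The pre/post comparison reported above is a paired t-test; a minimal sketch of such an analysis on synthetic scores (not the study's data) could look like this:

    ```python
    # Paired t-test on synthetic pre/post knowledge scores.
    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(0)
    pre = rng.normal(47.6, 11.9, size=76)        # baseline accuracy (%)
    post = pre + rng.normal(30.0, 8.0, size=76)  # post-intervention gain

    t, p = ttest_rel(pre, post)
    print(f"t({len(pre) - 1}) = {t:.2f}, p = {p:.2g}")
    ```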

  6. Validation of the TRUST tool in a Greek perioperative setting.

    Science.gov (United States)

    Chatzea, Vasiliki-Eirini; Sifaki-Pistolla, Dimitra; Dey, Nilanjan; Melidoniotis, Evangelos

    2017-06-01

    The aim of this study was to translate, culturally adapt and validate the TRUST questionnaire in a Greek perioperative setting. The TRUST questionnaire assesses the relationship between trust and performance. The study assessed the levels of trust and performance in the surgery and anaesthesiology department during a very stressful period for Greece (the economic crisis) and offered a user-friendly and robust assessment tool. The study concludes that the Greek version of the TRUST questionnaire is a reliable and valid instrument for measuring team performance among Greek perioperative teams. Copyright the Association for Perioperative Practice.

  7. Comparison of OH reactivity measurements in the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik; Novelli, Anna; Rolletter, Michael; Hofzumahaus, Andreas; Pfannerstill, Eva Y.; Kessel, Stephan; Edtbauer, Achim; Williams, Jonathan; Michoud, Vincent; Dusanter, Sebastien; Locoge, Nadine; Zannoni, Nora; Gros, Valerie; Truong, Francois; Sarda-Esteve, Roland; Cryer, Danny R.; Brumby, Charlotte A.; Whalley, Lisa K.; Stone, Daniel; Seakins, Paul W.; Heard, Dwayne E.; Schoemaecker, Coralie; Blocquet, Marion; Coudert, Sebastien; Batut, Sebastien; Fittschen, Christa; Thames, Alexander B.; Brune, William H.; Ernest, Cheryl; Harder, Hartwig; Muller, Jennifer B. A.; Elste, Thomas; Kubistin, Dagmar; Andres, Stefanie; Bohn, Birger; Hohaus, Thorsten; Holland, Frank; Li, Xin; Rohrer, Franz; Kiendler-Scharr, Astrid; Tillmann, Ralf; Wegener, Robert; Yu, Zhujun; Zou, Qi; Wahner, Andreas

    2017-10-01

    Hydroxyl (OH) radical reactivity (kOH) has been measured for 18 years with different measurement techniques. In order to compare the performances of instruments deployed in the field, two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. All types of instruments that are currently used for atmospheric measurements were used in one of the two campaigns. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapour, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements (limit of detection < 1 s-1 at a time resolution of 30 s to a few minutes) is higher for instruments directly detecting hydroxyl radicals, whereas the indirect Comparative Reactivity Method (CRM) has a higher limit of detection of 2 s-1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapour or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences from reference measurements or from calculated reactivity were observed by CRM instruments in the presence of terpenes and oxygenated organic compounds. In some of these experiments, only a small fraction of the reactivity is detected. The accuracy of CRM

  8. AORN Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings.

    Science.gov (United States)

    Hughes, Nancy L; Nelson, Audrey; Matz, Mary W; Lloyd, John

    2011-06-01

    Prolonged standing during surgical procedures poses a high risk of causing musculoskeletal disorders, including back, leg, and foot pain, which can be chronic or acute in nature. Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings provides recommendations for relieving the strain of prolonged standing, including the use of antifatigue mats, supportive footwear, and sit/stand stools, that are based on well-accepted ergonomic safety concepts, current research, and access to new and emerging technology. Published by Elsevier Inc.

  9. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN- based Modal Correlation Tools

    Science.gov (United States)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation entirely inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation of a NASA Langley Research Center (LaRC) low-aspect-ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.

  10. Development of a multilevel health and safety climate survey tool within a mining setting.

    Science.gov (United States)

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial, multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open-cut sites. A 67-item, 10-factor solution was obtained via exploratory factor analysis (EFA), representing prioritization of and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs indicated a high level of consistency in responses across the workforce, generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual levels. The survey tool demonstrated reliability and validity for use within an open-cut Australian mining setting and supports a multilevel, industry-specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and for improving the understanding of how health and safety climate interacts at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
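
    The internal reliability reported for each factor is typically Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score); the sketch below computes it for one synthetic six-item factor and is illustrative only.

    ```python
    # Cronbach's alpha for one scale: rows = respondents, columns = items.
    import numpy as np

    def cronbach_alpha(items):
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(302, 1))                    # shared attitude
    responses = latent + rng.normal(0.5, 0.7, (302, 6))   # six correlated items
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```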

  11. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of settings for generation of discharges in stable conditions of micro-EDM on the phenomenon of tool electrode wear. A stable sparking...... a condition for the minimum tool wear for this micro-EDM process configuration.

  12. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  13. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H.-P. Dorn

    2013-05-01

    Full Text Available The detection of atmospheric NO3 radicals is still challenging owing to their low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, new, sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: −1.1/2.6 pptv; min/max: −14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined
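
    The regression statistics quoted in such intercomparisons (slope, intercept and r2 against a reference instrument) can be reproduced schematically as below; the data are synthetic, and the campaign's actual analysis may well have used weighted or orthogonal regression rather than this ordinary least-squares sketch.

    ```python
    # Slope, intercept and r^2 of a candidate instrument vs. a reference.
    import numpy as np

    rng = np.random.default_rng(2)
    reference = rng.uniform(1, 300, 200)                         # NO3 in pptv
    candidate = 1.01 * reference + 1.1 + rng.normal(0, 3, 200)   # near-ideal

    slope, intercept = np.polyfit(reference, candidate, 1)
    r2 = np.corrcoef(reference, candidate)[0, 1] ** 2
    print(f"slope = {slope:.2f}, intercept = {intercept:.1f} pptv, r^2 = {r2:.3f}")
    ```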

  14. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    Science.gov (United States)

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  15. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    Science.gov (United States)

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization to allow action to reduce impairments through its Mental Health Gap Action Programme. The study assessed the feasibility of using developmental screening and monitoring tools for children aged 0-3 years by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and evaluate the feasibility of their use in low- and middle-income countries (LMICs). Key indicators to examine feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria. Three tools appear promising for use in identifying and monitoring young children with disabilities at the primary healthcare level in LMICs. Further research and development are needed to optimize these tools.

  16. Study of the photoproduction of the vector meson Φ(1020) and the hyperon Λ(1520) from the production threshold up to a photon energy of 2.65 GeV with SAPHIR

    International Nuclear Information System (INIS)

    Wiegers, B.

    2001-05-01

    The photoproduction of the vector meson φ(1020) and the hyperon Λ(1520) has been measured in the final state pK+K− from the respective thresholds up to 2.65 GeV using the high duty-factor electron accelerator ELSA and the 4π detector system SAPHIR. The t-dependence of φ(1020) production shows an exponential behavior, as expected for diffractive production. s-channel helicity conservation can be seen in the decay angular distribution in the helicity frame. The decay angular distribution in the Gottfried-Jackson frame is not compatible with the exchange of a Pomeron in the t-channel. For the first time, differential cross sections of Λ(1520) photoproduction from threshold have been measured. The production angular distribution and the decay angular distribution in the Gottfried-Jackson frame indicate K* exchange in the t-channel. (orig.)

  17. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    Directory of Open Access Journals (Sweden)

    Judith Kwasa

    Full Text Available To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW), which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is urgently needed. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using the Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = 0.03-0.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
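
    The agreement measure used here is the Kappa statistic; as a toy illustration only, Cohen's kappa for two raters over invented binary item ratings can be computed as follows.

    ```python
    # Cohen's kappa: chance-corrected agreement between two raters.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        pa, pb = Counter(rater_a), Counter(rater_b)
        expected = sum(pa[c] * pb[c] for c in pa) / n ** 2
        return (observed - expected) / (1 - expected)

    hcw    = ["yes", "no", "no", "yes", "no", "yes", "no", "no"]
    expert = ["yes", "no", "yes", "yes", "no", "no",  "no", "no"]
    print(f"kappa = {cohens_kappa(hcw, expert):.2f}")  # 0.47: moderate agreement
    ```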

  18. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vá zquez Reina, Amelio; Reid, Rollie Clay; Lichtman, Jeff W M D; Pfister, Hanspeter

    2010-01-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  20. Peac – A set of tools to quickly enable Proof on a cluster

    International Nuclear Information System (INIS)

    Ganis, G; Vala, M

    2012-01-01

    With the advent of the analysis phase of LHC data processing, interest in Proof technology has considerably increased. While setting up a simple Proof cluster for basic usage is reasonably straightforward, exploiting the several new functionalities added in recent times may be complicated. Peac, standing for Proof Enabled Analysis Cluster, is a set of tools aiming to facilitate the setup and management of a Proof cluster. Peac is based on the experience gained by setting up Proof for the Alice analysis facilities. It allows one to easily build and configure Root and the additional software needed on the cluster, and may serve as a distributor of binaries via Xrootd. Peac uses Proof-On-Demand (PoD) for resource management (starting and stopping daemons). Finally, Peac sets up and configures dataset management (using the Afdsmgrd daemon), as well as cluster monitoring (machine status and Proof query summaries) using MonAlisa. In this respect, a MonAlisa page has been dedicated to Peac users, so that a cluster managed by Peac can be automatically monitored. In this paper we present and describe the status and main components of Peac and show details about its usage.

  1. Comparison of OH reactivity measurements in the atmospheric simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2017-10-01

    Full Text Available Hydroxyl (OH radical reactivity (kOH has been measured for 18 years with different measurement techniques. In order to compare the performances of instruments deployed in the field, two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. All types of instruments that are currently used for atmospheric measurements were used in one of the two campaigns. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapour, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements (limit of detection < 1 s−1 at a time resolution of 30 s to a few minutes) is higher for instruments directly detecting hydroxyl radicals, whereas the indirect comparative reactivity method (CRM) has a higher limit of detection of 2 s−1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapour or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to reference measurements or to calculated reactivity were observed by CRM instruments in

  2. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  3. Investigation of OH Radical Regeneration from Isoprene Oxidation Across Different NOx Regimes in the Atmosphere Simulation Chamber SAPHIR

    Science.gov (United States)

    Novelli, A.; Bohn, B.; Dorn, H. P.; Häseler, R.; Hofzumahaus, A.; Kaminski, M.; Yu, Z.; Li, X.; Tillmann, R.; Wegener, R.; Fuchs, H.; Kiendler-Scharr, A.; Wahner, A.

    2017-12-01

    The hydroxyl radical (OH) is the dominant daytime oxidant in the troposphere. It starts the degradation of volatile organic compounds (VOC) originating from both anthropogenic and biogenic emissions. Hence, it is a crucial trace species in model simulations, as it has a large impact on many reactive trace gases. Many field campaigns performed in isoprene-dominated environments under low-NOx conditions have shown large discrepancies between the measured and the modelled OH radical concentrations. These results have contributed to the discovery of new regeneration paths for OH radicals from isoprene-OH second-generation products with maximum efficiency at low NO. The current chemical models (e.g. MCM 3.3.1) include this novel chemistry, allowing for an investigation of the validity of the OH regeneration under different chemical conditions. Over 11 experiments focusing on the OH oxidation of isoprene were performed at the SAPHIR chamber in the Forschungszentrum Jülich. Measurements of VOCs, NOx, O3 and HONO were performed together with the measurement of OH radicals (by both LIF-FAGE and DOAS) and OH reactivity. Within the simulation chamber, the NO mixing ratio was varied between 0.05 and 2 ppbv, allowing the investigation of both the "new" regeneration path for OH radicals and the well-known NO+HO2 mechanism. A comparison with MCM 3.3.1, which includes the upgraded LIM1 mechanism, showed very good agreement (within 10%) for the OH data at all concentrations of NOx investigated. Comparisons with different models, without LIM1 and with updated rates for the OH regeneration, will be presented together with a detailed analysis of the impact of this study on results from previous field campaigns.

  4. FunGeneNet: a web tool to estimate enrichment of functional interactions in experimental gene sets.

    Science.gov (United States)

    Tiys, Evgeny S; Ivanisenko, Timofey V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2018-02-09

    Estimation of functional connectivity in gene sets derived from genome-wide or other biological experiments is one of the essential tasks of bioinformatics. A promising approach for solving this problem is to compare gene networks built using experimental gene sets with random networks. One of the resources that make such an analysis possible is CrossTalkZ, which uses the FunCoup database. However, existing methods, including CrossTalkZ, do not take into account individual types of interactions, such as protein/protein interactions, expression regulation, transport regulation, catalytic reactions, etc., but rather work with generalized types characterizing the existence of any connection between network members. We developed the online tool FunGeneNet, which utilizes the ANDSystem and STRING to reconstruct gene networks using experimental gene sets and to estimate their difference from random networks. To compare the reconstructed networks with random ones, the node permutation algorithm implemented in CrossTalkZ was taken as a basis. To study the applicability of FunGeneNet, we analyzed the functional connectivity of networks constructed for gene sets involved in Gene Ontology biological processes. We showed that the method sensitivity exceeds 0.8 at a specificity of 0.95. We found that the significance level of the difference between gene networks of biological processes and random networks is determined by the type of connections considered between objects. At the same time, the highest reliability is achieved for the generalized form of connections that takes into account all the individual types of connections. Taking the thyroid cancer networks and the apoptosis network as examples, we demonstrate that key participants in these processes are involved in interactions of those types by which these networks differ from random ones. FunGeneNet is a web tool aimed at proving the functionality of networks in a wide range of sizes of
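
    The node-permutation idea described above can be sketched as follows; the network, gene names and input set are fabricated, and the real CrossTalkZ/FunGeneNet machinery (link-type handling, degree corrections) is considerably more sophisticated.

    ```python
    # Permutation test: is a gene set more internally connected than chance?
    import random

    network = {frozenset(e) for e in [("g1", "g2"), ("g2", "g3"), ("g1", "g3"),
                                      ("g4", "g5"), ("g5", "g6"), ("g7", "g8")]}
    all_genes = sorted({g for e in network for g in e})

    def internal_edges(gene_set):
        return sum(1 for e in network if e <= gene_set)

    def permutation_z(gene_set, n_perm=10_000, seed=3):
        rng = random.Random(seed)
        observed = internal_edges(gene_set)
        null = [internal_edges(frozenset(rng.sample(all_genes, len(gene_set))))
                for _ in range(n_perm)]
        mean = sum(null) / n_perm
        var = sum((x - mean) ** 2 for x in null) / (n_perm - 1)
        return (observed - mean) / var ** 0.5

    # A tightly connected triangle scores far above the random expectation.
    print(f"z = {permutation_z(frozenset(['g1', 'g2', 'g3'])):.1f}")
    ```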

  5. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    LENUS (Irish Health Repository)

    Hennerby, Cathy

    2012-02-01

    AIM: This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. BACKGROUND: The increased number of registered general agency nurses working in an acute children's hospital alerted concerns around their competency in working with children. These concerns were initially raised via informal complaints about 'near misses'

  6. Use of a tool-set by Pan troglodytes troglodytes to obtain termites (Macrotermes) in the periphery of the Dja Biosphere Reserve, southeast Cameroon.

    Science.gov (United States)

    Deblauwe, Isra; Guislain, Patrick; Dupain, Jef; Van Elsacker, Linda

    2006-12-01

    At the northern periphery of the Dja Biosphere Reserve (southeastern Cameroon) we recorded a new use of a tool-set by Pan troglodytes troglodytes to prey on Macrotermes muelleri, M. renouxi, M. lilljeborgi, and M. nobilis. We recovered 79 puncturing sticks and 47 fishing probes at 17 termite nests between 2002 and 2005. The mean length of the puncturing sticks (n = 77) and fishing probes (n = 45) was 52 cm and 56 cm, respectively, and the mean diameter was 9 mm and 4.5 mm, respectively. Sixty-eight percent of 138 chimpanzee fecal samples contained major soldiers of four Macrotermes species. The chimpanzees in southeastern Cameroon appeared to be selective in their choice of plant material to make their tools. The tools found at our study site resemble those from other sites in this region. However, in southeastern Cameroon only one tool-set type was found, whereas two tool-set types have been reported in Congo. Our study suggests that, along with the different vegetation types and the availability of plant material around termite nests, the nest and gallery structure and foraging behavior of the different Macrotermes spp. at all Central African sites must be investigated before we can attribute differences in tool-use behavior to culture. (c) 2006 Wiley-Liss, Inc.

  7. Google Sets, Google Suggest, and Google Search History: Three More Tools for the Reference Librarian's Bag of Tricks

    OpenAIRE

    Cirasella, Jill

    2008-01-01

    This article examines the features, quirks, and uses of Google Sets, Google Suggest, and Google Search History and argues that these three lesser-known Google tools warrant inclusion in the resourceful reference librarian’s bag of tricks.

  8. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings, where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. The objective was to determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design, in locations with a high prevalence of acute and chronic malnutrition. A total of 453,990 children met the inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and none of the exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and database-derived models based on MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, the proportion of estimates accurate to within ± 25% and ± 10% of true weight, the weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. The standard deviation of the mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); the proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
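
    The accuracy and precision measures reported above (mean percentage difference, its standard deviation, and the share of estimates within ±10% and ±25% of true weight) are simple to compute from paired observations; a minimal sketch with hypothetical weights follows.

```python
def estimation_metrics(true_kg, est_kg):
    """Accuracy/precision summary for paired true vs. estimated weights."""
    pct = [100.0 * (e - t) / t for t, e in zip(true_kg, est_kg)]
    n = len(pct)
    mean_pd = sum(pct) / n                                            # accuracy (bias)
    sd_pd = (sum((p - mean_pd) ** 2 for p in pct) / (n - 1)) ** 0.5   # precision
    share = lambda tol: 100.0 * sum(abs(p) <= tol for p in pct) / n
    return {"mean_pct_diff": mean_pd, "sd_pct_diff": sd_pd,
            "within_10pct": share(10), "within_25pct": share(25)}

# hypothetical paired weights (kg)
print(estimation_metrics([10.0, 12.5, 14.0], [9.6, 13.1, 14.9]))
```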

  9. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  10. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    Science.gov (United States)

    Hennerby, Cathy; Joyce, Pauline

    2011-03-01

    This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. The increased number of registered general agency nurses working in an acute children's hospital alerted concerns around their competency in working with children. These concerns were initially raised via informal complaints about 'near misses', parental dissatisfaction, perceived competency weaknesses and the rising cost associated with their use. Young's (2009; Journal of Organisational Change, 22, 524-548) nine-stage change framework was used to guide the implementation of the competency assessment tool within a paediatric acute care setting. The ongoing success of the initiative, from a nurse manager's perspective, relies on structured communication with the agency provider before employing competent agency nurses. Sustainability of the change will depend on nurse managers' persistence in attending to the concerns of those resisting the change while simultaneously supporting those championing the change. These key communication and supporting roles highlight the pivotal role held by nurse managers, as gatekeepers, in safeguarding children while in hospital. Leadership qualities of nurse managers will also be challenged in continuing to manage and drive the change where resistance might prevail. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  11. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  12. Investigation of the oxidation of methyl vinyl ketone (MVK) by OH radicals in the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik; Albrecht, Sascha; Acir, Ismail-Hakki; Bohn, Birger; Breitenlechner, Martin; Dorn, Hans-Peter; Gkatzelis, Georgios I.; Hofzumahaus, Andreas; Holland, Frank; Kaminski, Martin; Keutsch, Frank N.; Novelli, Anna; Reimer, David; Rohrer, Franz; Tillmann, Ralf; Vereecken, Luc; Wegener, Robert; Zaytsev, Alexander; Kiendler-Scharr, Astrid; Wahner, Andreas

    2018-06-01

    The photooxidation of methyl vinyl ketone (MVK) was investigated in the atmospheric simulation chamber SAPHIR for conditions at which organic peroxy radicals (RO2) mainly reacted with NO (high-NO case) and for conditions at which other reaction channels could compete (low-NO case). Measured trace gas concentrations were compared to concentration time series calculated with the Master Chemical Mechanism (MCM version 3.3.1). Product yields of methylglyoxal and glycolaldehyde were determined from measurements. For the high-NO case, the methylglyoxal yield was (19 ± 3)% and the glycolaldehyde yield was (65 ± 14)%, consistent with recent literature studies. For the low-NO case, the methylglyoxal yield was reduced to (5 ± 2)% because other RO2 reaction channels that do not form methylglyoxal became important. Consistent with literature data, the glycolaldehyde yield of (37 ± 9)% determined in the experiment was not reduced as much as implemented in the MCM, suggesting additional reaction channels producing glycolaldehyde. At the same time, direct quantification of OH radicals in the experiments shows the need for enhanced OH radical production at low-NO conditions, similar to previous studies investigating the oxidation of the parent VOC isoprene and of methacrolein, the second major oxidation product of isoprene. For MVK the model-measurement discrepancy was up to a factor of 2. Product yields and OH observations were consistent with assumptions of additional RO2 plus HO2 reaction channels as proposed in the literature for the major RO2 species formed from the reaction of MVK with OH. However, this study shows that HO2 radical concentrations are also underestimated by the model, suggesting that additional OH is not directly produced from RO2 radical reactions, but indirectly via increased HO2. Quantum chemical calculations show that HO2 could be produced from a fast 1,4-H shift of the second most important MVK-derived RO2 species (reaction rate constant 0
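
    As an illustration of how a product yield such as those above can be estimated from chamber time series, the sketch below simply ratios product formed to precursor consumed. Real chamber analyses additionally correct for dilution, wall losses, and secondary chemistry of the product; the numbers here are hypothetical.

```python
def molar_yield(precursor_ppb, product_ppb):
    """First-order estimate: yield = Δ[product] / Δ[precursor consumed]."""
    consumed = precursor_ppb[0] - precursor_ppb[-1]
    formed = product_ppb[-1] - product_ppb[0]
    return formed / consumed

# hypothetical MVK decay and glycolaldehyde growth (ppbv)
print(molar_yield([10.0, 7.0, 5.0], [0.0, 1.1, 1.8]))  # 0.36, i.e. a 36 % yield
```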

  13. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  14. A Standardized Needs Assessment Tool to Inform the Curriculum Development Process for Pediatric Resuscitation Simulation-Based Education in Resource-Limited Settings

    Directory of Open Access Journals (Sweden)

    Nicole Shilkofski

    2018-02-01

    Full Text Available Introduction: Under-five mortality rates (UFMR) remain high for children in low- and middle-income countries (LMICs) in the developing world. Education for practitioners in these environments is a key factor to improve outcomes that will address United Nations Sustainable Development Goals 3 and 10 (good health and well-being, and reduced inequalities). In order to appropriately contextualize a curriculum using simulation, it is necessary to first conduct a needs assessment of the target learner population. The World Health Organization (WHO) has published a tool to assess capacity for emergency and surgical care in LMICs that is adaptable to this goal. Materials and methods: The WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care was modified to assess pediatric resuscitation capacity in clinical settings in two LMICs: Uganda and Myanmar. Modifications included assessment of self-identified learning needs, current practices, and perceived epidemiology of disease burden in each clinical setting, in addition to assessment of pediatric resuscitation capacity in regard to infrastructure, procedures, equipment, and supplies. The modified tool was administered to 94 respondents from the two settings who were target learners of a proposed simulation-based curriculum in pediatric and neonatal resuscitation. Results: Infectious diseases (respiratory illnesses and diarrheal disease) were cited as the most common causes of pediatric deaths in both countries. Self-identified learning needs included knowledge and skill development in pediatric airway/breathing topics, as well as general resuscitation topics such as CPR and fluid resuscitation in shock. Equipment and supply availability varied substantially between settings, and critical shortages were identified in each setting. Current practices and procedures were often limited by equipment availability or infrastructural considerations. Discussion and conclusion: Epidemiology of disease

  15. Idea: an integrated set of tools for sustainable nuclear decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, M.; Centner, B.; Vanderperre, S.; Wacquier, W.

    2008-01-01

    Decommissioning of nuclear installations constitutes an important challenge and must prove to the public that the whole nuclear life cycle is fully mastered by the nuclear industry. This could lead to easier public acceptance of the construction of new nuclear power plants. When ceasing operation, owners and operators of nuclear installations look for solutions to assess and keep decommissioning costs at a reasonable level, to fully characterise waste streams (in particular radiological inventories of difficult-to-measure radionuclides), and to reduce personnel exposure during decommissioning activities, taking into account several project-, site- and country-specific constraints. In response to this need, Tractebel Engineering has developed IDEA (Integrated DEcommissioning Application), an integrated set of computer tools to support the engineering activities to be carried out in the frame of a decommissioning project. IDEA provides optimized solutions from an economical, environmental, social and safety perspective. (authors)

  16. Piloting a programme tool to evaluate malaria case investigation and reactive case detection activities: results from 3 settings in the Asia Pacific.

    Science.gov (United States)

    Cotter, Chris; Sudathip, Prayuth; Herdiana, Herdiana; Cao, Yuanyuan; Liu, Yaobao; Luo, Alex; Ranasinghe, Neil; Bennett, Adam; Cao, Jun; Gosling, Roly D

    2017-08-22

    Case investigation and reactive case detection (RACD) activities are widely used in low-transmission settings to determine the suspected origin of infection and to identify and treat malaria infections near the index patient household. Case investigation and RACD activities are time- and resource-intensive, include methodologies that vary across eliminating settings, and have no standardized metrics or tools available to monitor and evaluate them. In response to this gap, a simple programme tool was developed for the monitoring and evaluation (M&E) of RACD activities and piloted by national malaria programmes. During the development phase, four modules of the RACD M&E tool were created to assess and evaluate key case investigation and RACD activities and costs. A pilot phase was then carried out by programme implementers between 2013 and 2015, during which malaria surveillance teams in three different settings (China, Indonesia, Thailand) piloted the tool over a period of 3 months each. This study describes summary results of the pilots and the feasibility and impact of the tool on programmes. All three study areas implemented the RACD M&E tool modules, and pilot users reported that the tool and evaluation process were helpful in identifying gaps in RACD programme activities. In the 45 health facilities evaluated, 71.8% (97/135; min 35.3-max 100.0%) of the proper notification and reporting forms and 20.0% (27/135; min 0.0-max 100.0%) of standard operating procedures (SOPs) were available to support malaria elimination activities. The tool highlighted gaps in the reporting of key data indicators on completeness for malaria case reporting (98.8%; min 93.3-max 100.0%), case investigations (65.6%; min 61.8-max 78.4%) and RACD activities (70.0%; min 64.7-max 100.0%). Evaluation of the SOPs showed that knowledge and practices of malaria personnel varied within and between study areas. Average monthly costs for conducting case investigation and RACD activities showed variation between study

  17. First records of tool-set use for ant-dipping by Eastern chimpanzees (Pan troglodytes schweinfurthii) in the Kalinzu Forest Reserve, Uganda.

    Science.gov (United States)

    Hashimoto, Chie; Isaji, Mina; Koops, Kathelijne; Furuichi, Takeshi

    2015-10-01

    Chimpanzees at numerous study sites are known to prey on army ants by using a single wand to dip into the ant nest or column. However, in Goualougo (Republic of Congo) in Central Africa, chimpanzees use a different technique, use of a woody sapling to perforate the ant nest, then use of a herb stem as dipping tool to harvest the army ants. Use of a tool set has also been found in Guinea, West Africa: at Seringbara in the Nimba Mountains and at nearby Bossou. There are, however, no reports for chimpanzees in East Africa. We observed use of such a tool set in Kalinzu, Uganda, for the first time by Eastern chimpanzees. This behavior was observed among one group of chimpanzees at Kalinzu (S-group) but not among the adjacent group (M-group) with partly overlapping ranging areas despite the fact that the latter group has been under intensive observation since 1997. In Uganda, ant-dipping has not been observed in the northern three sites (Budongo, Semliki, and Kibale) but has been observed or seems to occur in the southern sites (Kalinzu and Bwindi), which suggests that ant-dipping was invented by and spread from the southern region after the northern and southern forest blocks became separated. Use of a tool-set by only one group at Kalinzu further suggests that this behavior was recently invented and has not yet spread to the other group via migrating females.

  18. Measurement of the reactions γp→K+Λ and γp→K+Σ0 for photon energies up to 2.6 GeV with the SAPHIR detector at ELSA

    International Nuclear Information System (INIS)

    Glander, K.H.

    2003-02-01

    The reactions γp→K⁺Λ and γp→K⁺Σ⁰ were measured in the energy range from threshold up to a photon energy of 2.6 GeV. The data were taken with the SAPHIR detector at the electron stretcher facility ELSA. Results on cross sections and hyperon polarizations are presented as a function of kaon production angle and photon energy. The total cross section for Λ production shows a strong threshold enhancement, whereas the Σ⁰ data have a maximum at about E_γ = 1.45 GeV. Cross sections, together with their angular decompositions into Legendre polynomials, suggest contributions from resonance production for both reactions. The K⁺Λ differential cross section is enhanced for backward-produced kaons at E_γ ≈ 1.45 GeV. This might be interpreted as a contribution of the so-called missing resonance D₁₃(1895). In general, the induced polarization of Λ has negative values in the kaon forward direction and positive values in the backward direction, with a magnitude that varies with energy. The polarization of Σ⁰ follows a similar angular and energy dependence to that of Λ, but with opposite sign. (orig.)

  19. Developmental Screening Tools: Feasibility of Use at Primary Healthcare Level in Low- and Middle-income Settings

    OpenAIRE

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-01-01

    ABSTRACT An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization to allow action to reduce impairments through the Gap Action Programme on mental health. The study identified the feasibility of using developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted...

  20. Blended Learning Tools in Geosciences: A New Set of Online Tools to Help Students Master Skills

    Science.gov (United States)

    Cull, S.; Spohrer, J.; Natarajan, S.; Chin, M.

    2013-12-01

    In most geoscience courses, students are expected to develop specific skills. To master these skills, students need to practice them repeatedly. Unfortunately, few geoscience courses have enough class time to allow students sufficient in-class practice, or enough instructor attention and time to provide fast feedback. To address this, we have developed an online tool called an Instant Feedback Practice (IFP). IFPs are low-risk, high-frequency exercises that allow students to practice skills repeatedly throughout a semester, both in class and at home. After class, students log onto a course management system (like Moodle or Blackboard) and click on that day's IFP exercise. The exercise might be visually identifying a set of minerals that they're practicing. After answering each question, the IFP tells them if they got it right or wrong. If they got it wrong, they try again until they get it right. There is no penalty: students receive the full score for finishing. The goal is low-stakes practice. By completing dozens of these practices throughout the semester, students have many, many opportunities to practice mineral identification with quick feedback. Students can also complete IFPs during class in groups and teams, with in-lab hand samples or specimens. IFPs can also be used to gauge student skill levels as the semester progresses, as they can be set up to provide the instructor feedback on specific skills or students. When IFPs were developed for and implemented in a majors-level mineralogy class, students reported that in-class and online IFPs were by far the most useful technique they used to master mineral hand sample identification. Final grades in the course were significantly higher than historical norms, supporting students' anecdotal assessment of the impact of IFPs on their learning.
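
    The mechanic described above (unlimited retries, immediate feedback, full credit for completion) is simple enough to sketch. The item bank below is hypothetical and stands in for the exercise hosted on the course management system.

```python
# hypothetical one-item bank: (prompt, accepted answer)
ITEMS = [("Green, glassy, often granular - which mineral? ", "olivine")]

def run_ifp(items):
    """Instant Feedback Practice: retry each item until correct; finishing earns full credit."""
    for prompt, answer in items:
        while input(prompt).strip().lower() != answer:
            print("Not quite - try again.")   # immediate feedback, no penalty
        print("Correct!")
    return 1.0  # full score simply for completing the practice

if __name__ == "__main__":
    run_ifp(ITEMS)
```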

  1. Soft sets combined with interval valued intuitionistic fuzzy sets of type-2 and rough sets

    Directory of Open Access Journals (Sweden)

    Anjan Mukherjee

    2015-03-01

    Full Text Available Fuzzy set theory, rough set theory and soft set theory are all mathematical tools for dealing with uncertainties. The concept of type-2 fuzzy sets was introduced by Zadeh in 1975 and was extended to interval-valued intuitionistic fuzzy sets of type-2 by the authors. This paper is devoted to discussion of the combinations of interval-valued intuitionistic fuzzy sets of type-2, soft sets and rough sets. Three different types of new hybrid models, namely interval-valued intuitionistic fuzzy soft sets of type-2, soft rough interval-valued intuitionistic fuzzy sets of type-2, and soft interval-valued intuitionistic fuzzy rough sets of type-2, are proposed and their properties are derived.
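
    For orientation, the standard (Atanassov) definitions underlying these hybrid models can be stated compactly; a minimal LaTeX rendering of the ordinary and interval-valued cases follows (the type-2 extension then replaces the membership values themselves by fuzzy sets over [0,1]).

```latex
% Intuitionistic fuzzy set over a universe X:
A = \{\, \langle x, \mu_A(x), \nu_A(x) \rangle : x \in X \,\}, \qquad
0 \le \mu_A(x) + \nu_A(x) \le 1,
% with hesitation margin
\pi_A(x) = 1 - \mu_A(x) - \nu_A(x).
% Interval-valued variant: \mu_A(x), \nu_A(x) \subseteq [0,1] are closed
% intervals with \sup \mu_A(x) + \sup \nu_A(x) \le 1.
```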

  2. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
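
    GeneAnalytics' scoring algorithms are proprietary, but the generic statistic behind most gene set enrichment tools is the hypergeometric tail probability of the overlap between a query set and an annotated set. A minimal sketch (the gene counts are hypothetical):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """P(overlap >= k) when n query genes are drawn from N genes, K of which are annotated."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# e.g. 20,000 genes, a 150-gene pathway, a 300-gene query set, 12 genes overlapping
print(enrichment_p(20000, 150, 300, 12))
```

    A small tail probability flags the pathway as enriched; production tools add multiple-testing correction and evidence-based weighting on top of this basic idea.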

  3. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Science.gov (United States)

    Siegelmann-Danieli, Nava; Farkash, Ariel; Katzir, Itzhak; Vesterman Landes, Janet; Rotem Rabinovich, Hadas; Lomnicky, Yossef; Carmeli, Boaz; Parush-Shear-Yashuv, Naama

    2016-01-01

    Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological), per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months and median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p …). Multivariate analyses (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton-pump inhibitors. Our tool provided insights that confirmed/complemented information gained from randomized clinical trials. Prospective tool implementation is warranted.

  4. Rough Sets as a Knowledge Discovery and Classification Tool for the Diagnosis of Students with Learning Disabilities

    Directory of Open Access Journals (Sweden)

    Yu-Chi Lin

    2011-02-01

    Full Text Available Due to the implicit characteristics of learning disabilities (LDs), the diagnosis of students with learning disabilities has long been a difficult issue. Artificial intelligence techniques like artificial neural networks (ANN) and support vector machines (SVM) have been applied to the LD diagnosis problem with satisfactory outcomes. However, special education teachers or professionals tend to be skeptical of these kinds of black-box predictors. In this study, we apply rough set theory (RST), which can not only perform as a classifier but may also produce meaningful explanations or rules, to the LD diagnosis application. Our experiments indicate that the RST approach is competitive as a tool for feature selection, and it performs better in terms of prediction accuracy than other rule-based algorithms such as decision tree and RIPPER algorithms. We also propose to mix samples collected from sources with different LD diagnosis procedures and criteria. By pre-processing these mixed samples with simple and readily available clustering algorithms, we are able to improve the quality and support of the rules generated by the RST. Overall, our study shows that the rough set approach, as a classification and knowledge discovery tool, may have great potential in playing an essential role in LD diagnosis.
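
    The rule-friendly behaviour the authors value comes from rough set approximations: under an indiscernibility partition of the students, a diagnosis class is bracketed by a lower approximation (certain members) and an upper approximation (possible members), and rules are read off these regions. A generic sketch with hypothetical data:

```python
def rough_approximations(partition, target):
    """Lower/upper approximations of `target` w.r.t. an indiscernibility partition."""
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= target:        # block lies entirely inside the concept
            lower |= block
        if block & target:         # block overlaps the concept
            upper |= block
    return lower, upper

# students grouped by identical attribute values; concept = diagnosed with LD
blocks = [{"s1", "s2"}, {"s3"}, {"s4", "s5"}]
print(rough_approximations(blocks, {"s1", "s2", "s4"}))
# lower = {s1, s2}; upper adds {s4, s5}, whose block straddles the boundary
```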

  5. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    Science.gov (United States)

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing in both urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool were assessed. The CDS tool was part of an mHealth system comprising a mobile application, consisting of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. An agile development process and user-centred design approach yielded key features of the mobile application fitted to the requirements of the end users and environment. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease-of-use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be at high CVD risk and referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized

  6. Reliability and validity of a novel tool to comprehensively assess food and beverage marketing in recreational sport settings.

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Mâsse, Louise C; Storey, Kate; Kirk, Sara F L; Raine, Kim D

    2018-05-31

    Current methods for evaluating food marketing to children often study a single marketing channel or approach. As the World Health Organization urges the removal of unhealthy food marketing in children's settings, methods that comprehensively explore the exposure and power of food marketing within a setting, from multiple marketing channels and approaches, are needed. The purpose of this study was to test the inter-rater reliability and the validity of a novel settings-based food marketing audit tool. The Food and beverage Marketing Assessment Tool for Settings (FoodMATS) was developed and its psychometric properties evaluated in five public recreation and sport facilities (sites); it was subsequently used in 51 sites across Canada for a cross-sectional analysis of food marketing. Raters recorded the count of food marketing occasions, the presence of child-targeted and sports-related marketing techniques, and the physical size of marketing occasions. Marketing occasions were classified by healthfulness. Inter-rater reliability was tested using Cohen's kappa (κ) and intra-class correlations (ICC). FoodMATS scores for each site were calculated using an algorithm that represented the theoretical impact of the marketing environment on food preferences, purchases, and consumption. Higher FoodMATS scores represented sites with higher exposure to, and more powerful (unhealthy, child-targeted, sports-related, large) food marketing. Validity of the scoring algorithm was tested through (1) Pearson's correlations between FoodMATS scores and facility sponsorship dollars, and (2) sequential multiple regression for predicting "Least Healthy" food sales from FoodMATS scores. Inter-rater reliability was very good to excellent (κ = 0.88-1.00, p …). As a comprehensive tool for assessing food marketing in recreation facilities, the FoodMATS provides a novel means to comprehensively track changes in food marketing environments that can assist in developing and monitoring the impact of policies and interventions.
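
    Inter-rater reliability here is summarized with Cohen's κ, which corrects the raw agreement between two raters for the agreement expected by chance. A minimal sketch of the calculation, with hypothetical healthfulness ratings:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items (nominal categories)."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    p_exp = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

a = ["healthy", "unhealthy", "unhealthy", "healthy", "unhealthy"]
b = ["healthy", "unhealthy", "healthy", "healthy", "unhealthy"]
print(cohens_kappa(a, b))  # ~0.62 for these hypothetical ratings
```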

  7. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments and analysis of engineering design and design problem-solving methods, and the developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  8. Mixed methods evaluation of a quality improvement and audit tool for nurse-to-nurse bedside clinical handover in ward settings.

    Science.gov (United States)

    Redley, Bernice; Waugh, Rachael

    2018-04-01

    Nurse bedside handover quality is influenced by complex interactions related to the content, the processes used and the work environment. Audit tools are seldom tested in 'real' settings. To examine the reliability, validity and usability of a quality improvement tool for audit of nurse bedside handover. Naturalistic, descriptive, mixed-methods. Six inpatient wards at a single large not-for-profit private health service in Victoria, Australia. Five nurse experts and 104 nurses involved in 199 change-of-shift bedside handovers. A focus group with experts and a pilot test were used to examine content and face validity, and usability of the handover audit tool. The tool was examined for inter-rater reliability and usability using observation audits of handovers across six wards. Data were collected in 2013-2014. Two independent observers for 72 audits demonstrated acceptable inter-observer agreement for 27 (77%) items. Reliability was weak for items examining the handover environment. Seventeen items were not observed, reflecting gaps in practices. Across 199 observation audits, gaps in nurse bedside handover practice most often related to process and environment items, rather than content items. Usability was impacted by high observer burden, familiarity and non-specific illustrative behaviours. The reliability and validity of most items to audit handover content was acceptable. Gaps in practices for process and environment items were identified. Context-specific exemplars and reducing the items used at each handover audit can enhance usability. Further research is needed to develop context-specific exemplars and undertake additional reliability testing using a wide range of handover settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    Directory of Open Access Journals (Sweden)

    Sandeep R Panta

    2016-03-01

    Full Text Available In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans, using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed stochastic neighbor embedding (t-SNE) algorithm, which reduces the number of dimensions for each scan in the input data set to two dimensions while preserving the local structure of the data sets. Finally, we interactively display the output of this approach via a web page based on the data-driven documents (D3) JavaScript library. Two distinct approaches were used to visualize the data. In the first approach, we computed multiple quality control (QC) values from pre-processed data, which were used as inputs to the t-SNE algorithm. This approach helps in assessing the quality of each data set relative to others. In the second case, computed variables of interest (e.g. brain volume or voxel values from segmented gray matter images) were used as inputs to the t-SNE algorithm. This approach helps in identifying interesting patterns in the data sets. We demonstrate these approaches using multiple examples, including (1) quality control measures calculated from phantom data over time, (2) quality control data from human functional MRI data across various studies, scanners and sites, and (3) volumetric and density measures from human structural MRI data across various studies, scanners and sites. Results from (1) and (2) show the potential of our approach to combine t-SNE data reduction with interactive color coding of variables of interest to quickly identify visually unique clusters of data (i.e. data sets with poor QC, clustering of data by site). Results from (3) demonstrate
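
    The dimensionality-reduction step described above is available off the shelf; the sketch below runs scikit-learn's t-SNE on a hypothetical matrix of per-scan QC values. This is not the COINS pipeline, and the file names are placeholders; the 2-D output is what a D3 scatter plot would consume.

```python
import numpy as np
from sklearn.manifold import TSNE

# hypothetical input: one row per MRI scan, one column per QC metric
qc = np.loadtxt("qc_metrics.csv", delimiter=",")

# embed to 2-D while preserving local neighbourhood structure
embedding = TSNE(n_components=2, perplexity=30.0, random_state=0).fit_transform(qc)

# two columns (x, y), ready to feed an interactive D3 scatter plot
np.savetxt("embedding_2d.csv", embedding, delimiter=",")
```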

  10. The Distraction in Action Tool©: Feasibility and Usability in Clinical Settings.

    Science.gov (United States)

    Hanrahan, Kirsten; Kleiber, Charmaine; Miller, Ben J; Davis, Heather; McCarthy, Ann Marie

    2017-11-10

    Distraction is a relatively simple, evidence-based intervention to minimize child distress during medical procedures. Timely on-site interventions that instruct parents on distraction coaching are needed. The purpose of this study was to test the feasibility and usability of the Distraction in Action Tool© (DAT©), which 1) predicts child risk for distress with a needle stick and 2) provides individualized instructions for parents on how to be a distraction coach for their child in clinical settings. A mixed-methods descriptive design was used to test feasibility and usability of DAT in the Emergency Department and a Phlebotomy Lab at a large Midwest Academic Medical Center. Twenty parents of children ages 4-10 years requiring venipuncture, and clinicians performing 13 of those procedures, participated. Participants completed an evaluation and participated in a brief interview. The average age of the children was 6.8 years, and 80% of parent participants were mothers. Most parents reported the DAT was not difficult to use (84.2%), understandable (100%), and that they had a positive experience (89.5%). Clinicians thought DAT was helpful (100%) and did not cause a meaningful delay in workflow (92%). DAT can be used by parents and clinicians to assess children's risk for procedure-related distress and learn distraction techniques to help children during needle stick procedures. DAT for parents is being disseminated via social media and an open-access website. Further research is needed to disseminate and implement DAT in community healthcare settings. Copyright © 2017. Published by Elsevier Inc.

  11. SAPHIRE technical reference manual: IRRAS/SARA Version 4.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Atwood, C.L.; Sattison, M.B.; Rasmuson, D.M.

    1993-01-01

    This report provides information on the principles used in the construction and operation of Version 4.0 of the Integrated Reliability and Risk Analysis System (IRRAS) and the System Analysis and Risk Assessment (SARA) system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. The report then describes the algorithms that these programs use to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that are appropriate under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that these programs can calculate. The report gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by these programs to generate random basic event probabilities from various distributions. Further references are given, and a detailed example of the reduction and quantification of a simple fault tree is provided in an appendix
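
    Of the formulas the report describes, the step from minimal cut sets to a top event probability is conventionally approximated by the minimal cut set upper bound. A small illustration (the formula is the standard one; the cut sets and probabilities are hypothetical, and, as in the standard approximation, dependence through shared basic events across cut sets is neglected):

```python
def mcs_upper_bound(cut_sets, p):
    """Minimal cut set upper bound: P(top) <= 1 - prod_i (1 - P(C_i))."""
    complement = 1.0
    for cs in cut_sets:
        q = 1.0
        for event in cs:
            q *= p[event]          # cut set probability = product of its basic events
        complement *= 1.0 - q
    return 1.0 - complement

# two minimal cut sets, {A, B} and {C}
p = {"A": 1e-2, "B": 1e-2, "C": 1e-3}
print(mcs_upper_bound([{"A", "B"}, {"C"}], p))  # ~1.1e-3
```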

  12. Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program.

    Science.gov (United States)

    Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita

    2016-03-23

    Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaption and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010) was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

  13. Evaluating the Auto-MODS Assay, a Novel Tool for Tuberculosis Diagnosis for Use in Resource-Limited Settings

    Science.gov (United States)

    Wang, Linwei; Mohammad, Sohaib H.; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2014-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. PMID:25378569
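
    The headline accuracy figures can be reproduced directly from the counts reported above; a two-line check:

```python
tp, fn = 212, 9    # culture-positive samples that Auto-MODS called positive / negative
tn, fp = 135, 4    # culture-negative samples that Auto-MODS called negative / positive
print(f"sensitivity = {tp / (tp + fn):.1%}, specificity = {tn / (tn + fp):.1%}")
# sensitivity = 95.9%, specificity = 97.1%
```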

  14. Chemical analysis of particulate and gaseous products from the monoterpene oxidation in the SAPHIR chamber during the EUCAARI campaign 2008

    Science.gov (United States)

    Kahnt, A.; Iinuma, Y.; Herrmann, H.; Mentel, T. F.; Fisseha, R.; Kiendler-Scharr, A.

    2009-04-01

    The atmospheric oxidation of monoterpenes leads to multifunctional products with lower vapour pressures. These products condense onto and coagulate with existing particles, leading to particle formation and growth. In order to obtain better insights into the mechanisms and the importance of sources of organic aerosol, a mixture of monoterpenes was oxidised in the SAPHIR outdoor chamber during the EUCAARI campaign in 2008. The mixture was made of α-pinene, β-pinene, limonene, 3-carene and ocimene, representing a typical monoterpene emission from a boreal forest. In addition, two sesquiterpenes (α-farnesene and caryophyllene) were reacted together with the monoterpene mixture in some experiments. The VOC (volatile organic compound) mixture was reacted under tropospheric oxidation and light conditions on a prolonged time scale of two days. In the present study, special emphasis is put on the detection of carbonyl compounds from the off-line analysis of filter and denuder samples collected during the campaign in 2008. The oxidation products which contain carbonyl groups are important first stable intermediates in monoterpene and sesquiterpene oxidation. They react further with atmospheric oxidants to form less volatile acidic compounds, contributing to secondary organic aerosol (SOA). Commonly used methods for the analysis of carbonyl compounds involve derivatisation steps prior to separation and subsequent UV or MS detection. In the present study, 2,4-dinitrophenylhydrazine (DNPH) was used to derivatise the extracted filter and denuder samples. The DNPH converts aldehyde and keto groups to stable hydrazones, which can be purified afterwards using a solid phase extraction (SPE) cartridge. The derivatised samples were analysed with HPLC/ESI-TOFMS, which allowed us to determine the exact chemical formula of unknown products. In addition to known carbonyl compounds from monoterpene oxidation, such as pinonaldehyde and nopinone, previously unreported molecular masses

  15. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Oczkowski, Simon J; Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J

    2016-01-01

    Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25-4.26, p = 0.007, low quality evidence), the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43-2.59, p < 0.001), and concordance between care desired and care received by patients (RR 1.17, 95% CI 1.05-1.30, p = 0.004, low quality evidence, 2 RCTs). The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between
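
    The pooled effects above are risk ratios with 95% confidence intervals. For a single 2×2 comparison, the standard log-scale calculation looks like the sketch below; the counts are hypothetical, and a real meta-analysis pools many such studies with weighting.

```python
from math import exp, log, sqrt

def risk_ratio_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Risk ratio and 95% CI on the log scale for a single 2x2 table."""
    rr = (events_t / n_t) / (events_c / n_c)
    se = sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# hypothetical trial: 60/100 completed an advance directive with the tool vs 35/100 without
print(risk_ratio_ci(60, 100, 35, 100))  # RR ~1.71, 95% CI ~(1.26, 2.34)
```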

  16. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, within the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list for the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  17. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    Science.gov (United States)

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology to evaluate the binary rating items. Reliability was assessed by comparing the ratings of 2 observers (1 rating in real time and 1 after review of a video recording). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items, and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery is a high-risk area for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool (TT) with the National Health System hospital discharge registry, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational, descriptive, retrospective study of patients admitted to the general surgery department of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS records for the same patients. Once the AE were identified, they were classified according to the harm caused and the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools, and the Hanley and McNeil test was used to compare the two tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS; these differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  19. Double π production on the deuteron with the energy-tagged photon beam of the spectrometer facility for photon-induced reactions

    International Nuclear Information System (INIS)

    Merkel, R.

    1992-11-01

    Within the framework of this thesis the tagging system TOPAS 1 was completed, including all aspects of hardware, software, and calibration procedures. In addition, TOPAS 1 was successfully integrated into SAPHIR, adding an indispensable tool for physical measurements. An initial analysis of double-pion production on the deuteron demonstrated the basic functioning and usability of the tagging system for measuring total cross sections, including their dependence on photon energy. (orig.) [de]

  20. Set-Pi: Set Membership pi-Calculus

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Mödersheim, Sebastian Alexander; Nielson, Flemming

    2015-01-01

    Communication protocols often rely on stateful mechanisms to ensure certain security properties. For example, counters and timestamps can be used to ensure authentication, or the security of communication can depend on whether a particular key is registered to a server or has been revoked. ProVerif, like other state-of-the-art tools for protocol analysis, achieves good performance by converting a formal protocol specification into a set of Horn clauses that represent a monotonically growing set of facts that a Dolev-Yao attacker can derive from the system. Since this set of facts is not state...... method with three examples, a simple authentication protocol based on counters, a key registration protocol, and a model of the Yubikey security device.

  1. Approaches, tools and methods used for setting priorities in health research in the 21(st) century.

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more

  2. Approaches, tools and methods used for setting priorities in health research in the 21st century

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-01-01

    Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001–2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion (“consultation process”) but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face–to–face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed–indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well–defined structure – such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix – it is likely that the Delphi method and non–replicable consultation processes will gradually be

  3. Reducing Information Overload in Large Seismic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research
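
    The dendrogram tool's core idea, hierarchical clustering of waveforms by correlation distance, can be illustrated in a few lines. A minimal sketch with synthetic waveforms, assuming NumPy/SciPy rather than the MatSeis implementation:

    ```python
    # Sketch of dendrogram-style clustering of waveforms by correlation
    # distance, in the spirit of the MatSeis dendrogram tool.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 256)
    # Two hypothetical event families plus noise
    waveforms = np.array(
        [np.sin(2 * np.pi * f * t) + 0.2 * rng.standard_normal(t.size)
         for f in (5, 5, 5, 9, 9)])

    # Distance = 1 - correlation; single/complete/etc. mirror the tool's options
    corr = np.corrcoef(waveforms)
    dist = 1 - corr[np.triu_indices_from(corr, k=1)]  # condensed form
    tree = linkage(dist, method="complete")
    print(fcluster(tree, t=0.5, criterion="distance"))  # cluster labels
    ```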

  4. APMS: An Integrated Set of Tools for Measuring Safety

    Science.gov (United States)

    Statler, Irving C.; Reynard, William D. (Technical Monitor)

    1996-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through

  5. Photovoice as a Pedagogical Tool: Exploring Personal and Professional Values with Female Muslim Social Work Students in an Intercultural Classroom Setting

    Science.gov (United States)

    Bromfield, Nicole F.; Capous-Desyllas, Moshoula

    2017-01-01

    This article explores a classroom project in which we used photovoice as a pedagogical tool to enhance personal and professional self-awareness among female, Muslim, social work students in an intercultural classroom setting located in the Arabian Gulf. We begin with an overview and discussion of arts-based approaches to education and then provide…

  6. Breastfeeding assessment tools

    International Nuclear Information System (INIS)

    Bizouerne, Cécile; Kerac, Marko; Macgrath, Marie

    2014-01-01

    Full text: Breastfeeding plays a major role in reducing the global burden of child mortality and under-nutrition. Whilst many programmes aim to support breastfeeding and prevent feeding problems from occurring, interventions are also needed once problems have developed. In this situation, accurate assessment of a problem is critical to inform prognosis and enable tailored, appropriate treatment. The presentation will describe a review aiming to identify breastfeeding assessment tools/checklists for use in assessing malnourished infants in resource-poor settings. The literature review identified 24 breastfeeding assessment tools, and 41 validation studies. The evidence underpinning most of the tools was mainly of low quality, and was generated in high-income countries and hospital settings. The presentation will describe the main findings of the literature review and propose recommendations for improving existing tools in order to appropriately assess malnourished infants and enable early, appropriate intervention and treatment of malnutrition. (author)

  7. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Directory of Open Access Journals (Sweden)

    Nava Siegelmann-Danieli

    Full Text Available Randomized clinical trials constitute the gold-standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information including comorbidities and use of drugs (oncological/non-oncological) per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months; median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p<0.002); notably, these patients were older. Patients who received both FP-O- and FP-I-based regimens achieved numerically longer OS vs. those who received only one of these regimens (22.1 [19.9-24.0] vs. 18.9 [15.5-21.9] months). Among patients assessed for wild-type KRAS and treated with subsequent anti-EGFR agent, OS was 25.4 months and 18.7 months for 124 treated vs. 37 non-treated patients (non-significant). Cox analysis (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes including concurrent use of diuretics and proton
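
    Median OS figures like those above are typically read off a Kaplan-Meier curve. A minimal sketch assuming the third-party lifelines package, with fabricated survival times rather than the study's data:

    ```python
    # Minimal Kaplan-Meier sketch for median overall survival; numbers
    # are random placeholders, not the Maccabi cohort.
    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(1)
    os_months = rng.exponential(scale=20.5, size=200)   # survival times
    observed = rng.random(200) < 0.7                    # True = death observed

    kmf = KaplanMeierFitter()
    kmf.fit(os_months, event_observed=observed, label="all patients")
    print(f"median OS = {kmf.median_survival_time_:.1f} months")
    ```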

  8. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (act), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression ... be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots ...
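
    The first analysis level, a correlation list of co-expressed genes, amounts to ranking genes by their Pearson correlation with a driver gene across arrays. A minimal sketch on synthetic data (not act's actual pipeline):

    ```python
    # Sketch: rank genes by Pearson correlation with a gene of interest
    # across microarray experiments (synthetic expression matrix).
    import numpy as np

    rng = np.random.default_rng(2)
    n_genes, n_arrays = 1000, 50
    expr = rng.standard_normal((n_genes, n_arrays))     # rows = genes
    driver = expr[0]                                    # gene of interest

    r = np.array([np.corrcoef(driver, g)[0, 1] for g in expr])
    top = np.argsort(-r)[1:11]                          # skip the driver itself
    print("top co-expressed gene indices:", top)
    ```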

  9. Separate tools or tool kits: an exploratory study of engineers' preferences

    NARCIS (Netherlands)

    Vliegen, Ingrid; Kleingeld, P.A.M.; van Houtum, Geert-Jan

    2010-01-01

    This paper describes an exploratory study of aspects that should be taken into consideration when optimizing tool kits, i.e. cases containing sets of tools that are used by service engineers for corrective maintenance. The study was carried out among service engineers of an Original Equipment

  10. De-MetaST-BLAST: a tool for the validation of degenerate primer sets and data mining of publicly available metagenomes.

    Directory of Open Access Journals (Sweden)

    Christopher A Gulvik

    Full Text Available Development and use of primer sets to amplify nucleic acid sequences of interest is fundamental to studies spanning many life science disciplines. As such, the validation of primer sets is essential. Several computer programs have been created to aid in the initial selection of primer sequences that may or may not require multiple nucleotide combinations (i.e., degeneracies). Conversely, validation of primer specificity has remained largely unchanged for several decades, and there are currently few available programs that allow for an evaluation of primers containing degenerate nucleotide bases. To alleviate this gap, we developed the program De-MetaST, which performs an in silico amplification using user-defined nucleotide sequence dataset(s) and primer sequences that may contain degenerate bases. The program returns an output file that contains the in silico amplicons. When De-MetaST is paired with NCBI's BLAST (De-MetaST-BLAST), the program also returns the top 10 nr NCBI database hits for each recovered in silico amplicon. While the original motivation for development of this search tool was degenerate primer validation using the wealth of nucleotide sequences available in environmental metagenome and metatranscriptome databases, this search tool has potential utility in many data mining applications.
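
    The core of in-silico amplification with degenerate primers is expanding IUPAC degeneracy codes into regular-expression character classes and scanning for the forward primer followed by the reverse complement of the reverse primer. A simplified sketch of this idea, not De-MetaST's actual implementation:

    ```python
    # Sketch of in-silico amplification with degenerate primers
    # (simplified; the real tool handles far more cases).
    import re

    IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
             "R": "[AG]", "Y": "[CT]", "S": "[CG]", "W": "[AT]",
             "K": "[GT]", "M": "[AC]", "B": "[CGT]", "D": "[AGT]",
             "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]"}

    def to_regex(primer):
        return "".join(IUPAC[base] for base in primer.upper())

    def revcomp(seq):
        comp = str.maketrans("ACGTRYSWKMBDHVN", "TGCAYRSWMKVHDBN")
        return seq.translate(comp)[::-1]

    def amplicons(template, fwd, rev, max_len=2000):
        """Return in-silico amplicons (forward primer .. reverse primer site)."""
        pattern = re.compile(
            f"({to_regex(fwd)}.{{1,{max_len}}}?{to_regex(revcomp(rev))})")
        return [m.group(1) for m in pattern.finditer(template.upper())]

    print(amplicons("ttGAATCCaaaaaaaaaaGGATCC", "GARTCC", "GGATCY"))
    ```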

  11. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    International Nuclear Information System (INIS)

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-01

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the framework of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.
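
    A risk analysis matrix of the kind described reduces, at its simplest, to scoring each potential impact by likelihood and severity and ranking by their product. A toy sketch with invented health areas and scores:

    ```python
    # Toy risk-analysis matrix for ranking potential health impacts;
    # the health areas and scores are illustrative, not from the study.
    impacts = [
        # (environmental health area, likelihood 1-5, severity 1-5)
        ("vector-related disease",        4, 4),
        ("food- and water-borne disease", 3, 4),
        ("road traffic injuries",         2, 5),
        ("psychosocial stress",           3, 2),
    ]
    ranked = sorted(impacts, key=lambda x: x[1] * x[2], reverse=True)
    for area, likelihood, severity in ranked:
        print(f"{area:32s} risk score = {likelihood * severity}")
    ```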

  12. Integrated Variable-Fidelity Tool Set for Modeling and Simulation of Aeroservothermoelasticity-Propulsion (ASTE-P) Effects for Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  13. Integrated Variable-Fidelity Tool Set For Modeling and Simulation of Aeroservothermoelasticity -Propulsion (ASTE-P) Effects For Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  14. Identification and implementation of end-user needs during development of a state-of-the-art modeling tool-set - 59069

    International Nuclear Information System (INIS)

    Seitz, Roger; Williamson, Mark; Gerdes, Kurt; Freshley, Mark; Dixon, Paul; Collazo, Yvette T.; Hubbard, Susan

    2012-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM tool-sets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the 'user needs interface' task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool. The ASCEM team engaged a variety of interested parties early in the development

  15. Selection of the optimal set of revenue management tools in hotels

    OpenAIRE

    Korzh, Nataliia; Onyshchuk, Natalia

    2017-01-01

    The object of research is the scientific category «revenue management» and its tools, which, with the growth in the number of on-line sales channels for hotel services, have become decisive in the struggle for survival. The large number of revenue management tools associated with the on-line booking regime act as small data and give only scattered information about the state of the market. One of the most problematic areas is the formation of perspective analytics using existing too...

  16. TAM 2.0: tool for MicroRNA set analysis.

    Science.gov (United States)

    Li, Jianwei; Han, Xiaofen; Wan, Yanping; Zhang, Shan; Zhao, Yingshu; Fan, Rui; Cui, Qinghua; Zhou, Yuan

    2018-06-06

    With the rapid accumulation of high-throughput microRNA (miRNA) expression profiles, an up-to-date resource for analyzing the functional and disease associations of miRNAs is increasingly in demand. We here describe the updated server TAM 2.0 for miRNA set enrichment analysis. Through manual curation of over 9000 papers, a more than two-fold growth of reference miRNA sets has been achieved in comparison with the previous TAM, covering 9945 and 1584 newly collected miRNA-disease and miRNA-function associations, respectively. Moreover, TAM 2.0 allows users not only to test the functional and disease annotations of miRNAs by overrepresentation analysis, but also to compare the input de-regulated miRNAs with those de-regulated in other disease conditions via correlation analysis. Finally, functions for miRNA set query and result visualization are also enabled in the TAM 2.0 server to facilitate the community. The TAM 2.0 web server is freely accessible at http://www.scse.hebut.edu.cn/tam/ or http://www.lirmed.com/tam2/.
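
    Overrepresentation analysis of a miRNA set is conventionally a one-sided hypergeometric test of the overlap between the user's list and an annotated reference set. A sketch with illustrative counts (not TAM's data), assuming SciPy:

    ```python
    # Overrepresentation analysis in the style of miRNA set enrichment:
    # a one-sided hypergeometric test with illustrative numbers.
    from scipy.stats import hypergeom

    N = 2000   # background miRNAs in the annotation
    K = 50     # miRNAs annotated to the function/disease of interest
    n = 40     # size of the user's de-regulated miRNA list
    k = 8      # overlap between the list and the annotation

    # P(overlap >= k) under random sampling without replacement
    p = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p-value = {p:.3g}")
    ```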

  17. Validation of the Australian Midwifery Standards Assessment Tool (AMSAT): A tool to assess midwifery competence.

    Science.gov (United States)

    Sweet, Linda; Bazargan, Maryam; McKellar, Lois; Gray, Joanne; Henderson, Amanda

    2018-02-01

    There is no current validated clinical assessment tool to measure the attainment of midwifery student competence in the midwifery practice setting. The lack of a valid assessment tool has led to a proliferation of tools and inconsistency in assessment of, and feedback on, student learning. This research aimed to develop and validate a tool to assess competence of midwifery students in practice-based settings. A mixed-methods approach was used and the study was implemented in two phases. Phase one involved the development of the AMSAT tool with qualitative feedback from midwifery academics, midwife assessors of students, and midwifery students. In phase two the newly developed AMSAT tool was piloted across a range of midwifery practice settings, ANOVA was used to compare scores across year levels, and feedback was obtained from assessors. Analysis of 150 AMSAT forms indicates the AMSAT is: reliable (Cronbach alpha greater than 0.9); valid (data extraction loaded predominantly onto one factor); and sensitive (scores indicating level of proficiency increased across the three years). Feedback evaluation forms (n=83) suggest acceptance of this tool for the purpose of both assessing and providing feedback on midwifery students' practice performance and competence. The AMSAT is a valid, reliable and acceptable midwifery assessment tool that enables consistent assessment of midwifery student competence. This assists benchmarking across midwifery education programs. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
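
    Cronbach's alpha, the internal-consistency statistic reported for the AMSAT, is straightforward to compute from a respondents-by-items matrix. A minimal sketch on simulated placeholder scores:

    ```python
    # Minimal Cronbach's alpha; the scores below are simulated binary
    # items driven by a latent ability, not real AMSAT forms.
    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = students, columns = checklist items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(3)
    ability = rng.normal(size=(150, 1))                  # latent proficiency
    scores = (ability + rng.normal(size=(150, 20)) > 0).astype(int)
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```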

  18. Validation of Nurse Practitioner Primary Care Organizational Climate Questionnaire: A New Tool to Study Nurse Practitioner Practice Settings.

    Science.gov (United States)

    Poghosyan, Lusine; Chaplin, William F; Shaffer, Jonathan A

    2017-04-01

    Favorable organizational climate in primary care settings is necessary to expand the nurse practitioner (NP) workforce and promote their practice. Only one NP-specific tool, the Nurse Practitioner Primary Care Organizational Climate Questionnaire (NP-PCOCQ), measures NP organizational climate. We confirmed NP-PCOCQ's factor structure and established its predictive validity. A cross-sectional survey design was used to collect data from 314 NPs in Massachusetts in 2012. Confirmatory factor analysis and regression models were used. The 4-factor model characterized NP-PCOCQ. The NP-PCOCQ score predicted job satisfaction (beta = .36; p organizational climate in their clinics. Further testing of NP-PCOCQ is needed.

  19. Measuring social exclusion in healthcare settings: a scoping review.

    Science.gov (United States)

    O'Donnell, Patrick; O'Donovan, Diarmuid; Elmusharaf, Khalifa

    2018-02-02

    Social exclusion is a concept that has been widely debated in recent years; a particular focus of the discussion has been its significance in relation to health. The meanings of the phrase "social exclusion", and the closely associated term "social inclusion", are contested in the literature. Both of these concepts are important in relation to health and the area of primary healthcare in particular. Thus, several tools for the measurement of social exclusion or social inclusion status in health care settings have been developed. A scoping review of the peer-reviewed and grey literature was conducted to examine tools developed since 2000 that measure social exclusion or social inclusion. We focused on those measurement tools developed for use with individual patients in healthcare settings. Efforts were made to obtain a copy of each of the original tools, and all relevant background literature. All tools retrieved were compared in tables, and the specific domains that were included in each measure were tabulated. Twenty-two measurement tools were included in the final scoping review. The majority of these had been specifically developed for the measurement of social inclusion or social exclusion, but a small number were created for the measurement of other closely aligned concepts. The majority of the tools included were constructed for engaging with patients in mental health settings. The tools varied greatly in their design, the scoring systems and the ways they were administered. The domains covered by these tools varied widely and some of the tools were quite narrow in the areas of focus. A review of the definitions of both social inclusion and social exclusion also revealed the variations among the explanations of these complex concepts. There are several definitions of both social inclusion and social exclusion in use and they differ greatly in scope. While there are many tools that have been developed for measuring these concepts in healthcare settings, these

  20. Using the Lives Saved Tool (LiST) to Model mHealth Impact on Neonatal Survival in Resource-Limited Settings

    Science.gov (United States)

    Jo, Youngji; Labrique, Alain B.; Lefevre, Amnesty E.; Mehl, Garrett; Pfaff, Teresa; Walker, Neff; Friberg, Ingrid K.

    2014-01-01

    While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale up efforts and investment in ways to help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST)–an evidence-based modeling software–to identify priority areas for maternal and neonatal health services, by formulating six individual and combined interventions scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives. PMID:25014008

  1. Using the lives saved tool (LiST) to model mHealth impact on neonatal survival in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Youngji Jo

    Full Text Available While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale up efforts and investment in ways to help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST--an evidence-based modeling software--to identify priority areas for maternal and neonatal health services, by formulating six individual and combined interventions scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives.

  2. Volume Sculpting Using the Level-Set Method

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2002-01-01

    In this paper, we propose the use of the Level-Set Method as the underlying technology of a volume sculpting system. The main motivation is that this leads to a very generic technique for deformation of volumetric solids. In addition, our method preserves a distance field volume representation....... A scaling window is used to adapt the Level-Set Method to local deformations and to allow the user to control the intensity of the tool. Level-Set based tools have been implemented in an interactive sculpting system, and we show sculptures created using the system....
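
    The underlying mechanics can be sketched as evolving a signed distance field under a localized speed function. The toy update below omits the upwind differencing, reinitialisation, and adaptive windowing a real sculpting system would need:

    ```python
    # Highly simplified level-set deformation step: a signed distance
    # field phi is evolved by a tool speed inside a local window.
    import numpy as np

    n = 64
    ax = np.linspace(-1, 1, n)
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    phi = np.sqrt(x**2 + y**2 + z**2) - 0.6        # sphere; phi < 0 inside

    # Gaussian tool window centred on the surface; positive speed carves
    window = np.exp(-((x - 0.6)**2 + y**2 + z**2) / 0.02)
    dt, steps = 0.5, 10
    for _ in range(steps):
        gx, gy, gz = np.gradient(phi, ax[1] - ax[0])
        grad_norm = np.sqrt(gx**2 + gy**2 + gz**2)
        phi += dt * window * grad_norm              # phi grows -> surface recedes
    print("volume fraction inside:", (phi < 0).mean())
    ```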

  3. The e-Reader — an Educational or an Entertainment Tool? e-Readers in an Academic Setting

    Directory of Open Access Journals (Sweden)

    Peter Ahlroos

    2012-01-01

    Full Text Available In this paper the authors will discuss a pilot project conducted at the Tritonia Academic Library, Vaasa, in Finland, from September 2010 until May 2011. The project was designed to investigate the application of e-readers in academic settings and to learn how teachers and students experience the use of e-readers in academic education. Four groups of students and one group of teachers used Kindle readers for varied periods of time in different courses. The course material and the textbooks were downloaded on the e-readers. The feedback from the participants was collected through questionnaires and teacher interviews. The results suggest that the e-reader is a future tool for learning, though some features need to be improved before e-readers can really enable efficient learning and researching.

  4. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2010-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  5. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2011-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  6. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  7. Non-commutative tools for topological insulators

    International Nuclear Information System (INIS)

    Prodan, Emil

    2010-01-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.

  8. Hydraulic release oil tool

    International Nuclear Information System (INIS)

    Mims, M.G.; Mueller, M.D.; Ehlinger, J.C.

    1992-01-01

    This patent describes a hydraulic release tool. It comprises a setting assembly; a coupling member for coupling to drill string or petroleum production components, the coupling member having a plurality of sockets for receiving the dogs in the extended position and attaching the coupling member to the setting assembly, whereby the setting assembly couples to the coupling member by engagement of the dogs in the sockets and releases from and disengages the coupling member on movement of the piston from its setting position to its release position in response to a pressure in the body exceeding the predetermined pressure; and a relief port from outside the body into its bore, and means to prevent communication between the relief port and the bore of the body axially of the piston when the piston is in the setting position and to establish such communication upon movement of the piston from the setting position to the release position and reduce the pressure in the body bore axially of the piston, whereby the reduction of the pressure signals that the tool has released the coupling member

  9. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and 4 tools-the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and Malnutrition Universal Screening Tool (MUST)-received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  10. Self-organising maps and correlation analysis as a tool to explore patterns in excitation-emission matrix data sets and to discriminate dissolved organic matter fluorescence components.

    Science.gov (United States)

    Ejarque-Gonzalez, Elisabet; Butturini, Andrea

    2014-01-01

    Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples, as well as to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
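
    The SOM step itself, mapping each flattened EEM to a best-matching unit on a small grid, can be sketched with the third-party minisom package. The spectra below are random stand-ins, and the component-plane correlation stage is omitted:

    ```python
    # Sketch of SOM clustering of flattened EEMs, assuming the
    # third-party minisom package; the spectra are random stand-ins.
    import numpy as np
    from minisom import MiniSom

    rng = np.random.default_rng(4)
    eems = rng.random((120, 20 * 30))        # 120 samples, flattened 20x30 EEMs

    som = MiniSom(6, 6, eems.shape[1], sigma=1.0, learning_rate=0.5,
                  random_seed=0)
    som.train_random(eems, 5000)

    # Each sample maps to its best-matching unit; co-located samples cluster
    bmus = [som.winner(s) for s in eems]
    print("first five BMUs:", bmus[:5])
    ```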

  11. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in't Veld, M.A.A.; Boogaard, S.A.A. van den

    2008-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation - meetings - by means of a set

  12. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in 't Veld, M. M.A.; Boogaard, S.A.A. van den

    2007-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation -meetings- by means of a set of

  13. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  14. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    Science.gov (United States)

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3 and 4 year olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  15. HEDIS Limited Data Set

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Healthcare Effectiveness Data and Information Set (HEDIS) is a tool used by more than 90 percent of America's health plans to measure performance on important...

  16. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  17. Development of the Quality Improvement Minimum Quality Criteria Set (QI-MQCS): a tool for critical appraisal of quality improvement intervention publications.

    Science.gov (United States)

    Hempel, Susanne; Shekelle, Paul G; Liu, Jodi L; Sherwood Danz, Margie; Foy, Robbie; Lim, Yee-Wei; Motala, Aneesa; Rubenstein, Lisa V

    2015-12-01

    Valid, reliable critical appraisal tools advance quality improvement (QI) intervention impacts by helping stakeholders identify higher quality studies. QI approaches are diverse and differ from clinical interventions. Widely used critical appraisal instruments do not take unique QI features into account and existing QI tools (eg, Standards for QI Reporting Excellence) are intended for publication guidance rather than critical appraisal. This study developed and psychometrically tested a critical appraisal instrument, the QI Minimum Quality Criteria Set (QI-MQCS) for assessing QI-specific features of QI publications. Approaches to developing the tool and ensuring validity included a literature review, in-person and online survey expert panel input, and application to empirical examples. We investigated psychometric properties in a set of diverse QI publications (N=54) by analysing reliability measures and item endorsement rates and explored sources of disagreement between reviewers. The QI-MQCS includes 16 content domains to evaluate QI intervention publications: Organisational Motivation, Intervention Rationale, Intervention Description, Organisational Characteristics, Implementation, Study Design, Comparator Description, Data Sources, Timing, Adherence/Fidelity, Health Outcomes, Organisational Readiness, Penetration/Reach, Sustainability, Spread and Limitations. Median inter-rater agreement for QI-MQCS items was κ 0.57 (83% agreement). Item statistics indicated sufficient ability to differentiate between publications (median quality criteria met 67%). Internal consistency measures indicated coherence without excessive conceptual overlap (absolute mean interitem correlation=0.19). The critical appraisal instrument is accompanied by a user manual detailing What to consider, Where to look and How to rate. We developed a ready-to-use, valid and reliable critical appraisal instrument applicable to healthcare QI intervention publications, but recognise scope for

  18. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Science.gov (United States)

    Rostami, Paryaneh; Ashcroft, Darren M; Tully, Mary P

    2018-01-01

    Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however, a number of

  19. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Directory of Open Access Journals (Sweden)

    Paryaneh Rostami

    Full Text Available Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives.Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory.Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported.Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however

  20. A strategy to improve priority setting in developing countries.

    Science.gov (United States)

    Kapiriri, Lydia; Martin, Douglas K

    2007-09-01

    Because the demand for health services outstrips the available resources, priority setting is one of the most difficult issues faced by health policy makers, particularly those in developing countries. Priority setting in developing countries is fraught with uncertainty due to lack of credible information, weak priority setting institutions, and unclear priority setting processes. Efforts to improve priority setting in these contexts have focused on providing information and tools. In this paper we argue that priority setting is a value laden and political process, and although important, the available information and tools are not sufficient to address the priority setting challenges in developing countries. Additional complementary efforts are required. Hence, a strategy to improve priority setting in developing countries should also include: (i) capturing current priority setting practices, (ii) improving the legitimacy and capacity of institutions that set priorities, and (iii) developing fair priority setting processes.

  1. Computability and Representations of the Zero Set

    NARCIS (Netherlands)

    P.J. Collins (Pieter)

    2008-01-01

    In this note we give a new representation for closed sets under which the robust zero set of a function is computable. We call this representation the component cover representation. The computation of the zero set is based on topological index theory, the most powerful tool for finding

  2. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    Science.gov (United States)

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived

  3. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  4. Spectral dimension in causal set quantum gravity

    International Nuclear Information System (INIS)

    Eichhorn, Astrid; Mizera, Sebastian

    2014-01-01

    We evaluate the spectral dimension in causal set quantum gravity by simulating random walks on causal sets. In contrast to other approaches to quantum gravity, we find an increasing spectral dimension at small scales. This observation can be connected to the nonlocality of causal set theory that is deeply rooted in its fundamentally Lorentzian nature. Based on its large-scale behaviour, we conjecture that the spectral dimension can serve as a tool to distinguish causal sets that approximate manifolds from those that do not. As a new tool to probe quantum spacetime in different quantum gravity approaches, we introduce a novel dimensional estimator, the causal spectral dimension, based on the meeting probability of two random walkers, which respect the causal structure of the quantum spacetime. We discuss a causal-set example, where the spectral dimension and the causal spectral dimension differ, due to the existence of a preferred foliation. (paper)
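    The scaling behaviour referenced above can be made concrete with a toy computation. The sketch below is not the authors' code; a periodic 2D lattice stands in for a causal set, whose link matrix would be used instead. It estimates a spectral dimension from the return probability of random walkers, using P_return(t) ~ t^(-d_s/2); for this lattice the estimate should come out near 2.

        import numpy as np

        rng = np.random.default_rng(0)
        L, steps, walkers = 30, 200, 5000   # lattice side, walk length, sample size

        def step(x, y):
            dx, dy = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
            return (x + dx) % L, (y + dy) % L

        returns = np.zeros(steps + 1)
        for _ in range(walkers):
            pos = start = (0, 0)
            for t in range(1, steps + 1):
                pos = step(*pos)
                if pos == start:
                    returns[t] += 1
        P = returns / walkers

        # Fit log P(t) = -(d_s/2) log t + c over even step counts only
        # (odd-step returns vanish on this bipartite lattice).
        ts = np.arange(2, steps + 1, 2)
        mask = P[ts] > 0
        slope, _ = np.polyfit(np.log(ts[mask]), np.log(P[ts][mask]), 1)
        print("estimated spectral dimension:", -2.0 * slope)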

  5. Protein-energy malnutrition in the rehabilitation setting: Evidence to improve identification.

    Science.gov (United States)

    Marshall, Skye

    2016-04-01

    Methods of identifying malnutrition in the rehabilitation setting require further examination so that patient outcomes may be improved. The purpose of this narrative review was to: (1) examine the defining characteristics of malnutrition, starvation, sarcopenia and cachexia; (2) review the validity of nutrition screening tools and nutrition assessment tools in the rehabilitation setting; and (3) determine the prevalence of malnutrition in the rehabilitation setting by geographical region and method of diagnosis. A narrative review was conducted drawing upon international literature. Starvation represents one form of malnutrition. Inadequate energy and protein intake is the critical factor in the aetiology of malnutrition, which is distinct from sarcopenia and cachexia. Eight nutrition screening tools and two nutrition assessment tools have been evaluated for criterion validity in the rehabilitation setting, and consideration must be given to the resources of the facility and the patient group in order to select the appropriate tool. The prevalence of malnutrition in the rehabilitation setting ranges from 14% to 65% worldwide, with the highest prevalence reported in rural, European and Australian settings. Malnutrition is highly prevalent in the rehabilitation setting, and consideration must be given to the patient group when determining the most appropriate method of identification so that resources may be used efficaciously and the chance of misdiagnosis minimised. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Electronic Mail in Academic Settings: A Multipurpose Communications Tool.

    Science.gov (United States)

    D'Souza, Patricia Veasey

    1992-01-01

    Explores possible uses of electronic mail in three areas of the academic setting: instruction, research, and administration. Electronic mail is defined, the components needed to get started with electronic mail are discussed, and uses and benefits of electronic mail in diverse educational environments are suggested. (12 references) (DB)

  7. Food marketing in recreational sport settings in Canada: a cross-sectional audit in different policy environments using the Food and beverage Marketing Assessment Tool for Settings (FoodMATS).

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Storey, Kate; Mâsse, Louise C; Kirk, Sara F L; Raine, Kim D

    2018-05-31

    Children's recreational sport settings typically sell energy-dense, low-nutrient products; however, it is unknown whether the same types of food and beverages are also marketed in these settings. Understanding food marketing in sports settings is important because the food industry often uses the promotion of physical activity to justify their products. This study aimed to document the 'exposure' and 'power' of food marketing present in public recreation facilities in Canada and assess differences between provinces with and without voluntary provincial nutrition guidelines for recreation facilities. Food marketing was measured in 51 sites using the Food and beverage Marketing Assessment Tool for Settings (FoodMATS). The frequency and repetition ('exposure') of food marketing and the presence of select marketing techniques, including child-targeted, sports-related, size, and healthfulness ('power'), were assessed. Differences in 'exposure' and 'power' characteristics between sites in three guideline provinces (n = 34) and a non-guideline province (n = 17) were assessed using Pearson's chi-squared tests of homogeneity and Mann-Whitney U tests. Ninety-eight percent of sites had food marketing present. The frequency of food marketing per site did not differ between guideline and non-guideline provinces (median = 29; p = 0.576). Sites from guideline provinces had a significantly lower proportion of food marketing occasions that were "Least Healthy" (47.9%) than sites from the non-guideline province (73.5%). The presence of child-targeted and sports-related food marketing techniques was significantly higher in sites from guideline provinces (9.5% and 10.9%, respectively) than in the non-guideline province (1.9% and 4.5%, respectively). Having voluntary provincial nutrition guidelines that recommend provision of healthier foods was not related to the frequency of food marketing in recreation facilities but was associated with less frequent marketing of unhealthy foods.

  8. MAGMA: generalized gene-set analysis of GWAS data.

    Science.gov (United States)

    de Leeuw, Christiaan A; Mooij, Joris M; Heskes, Tom; Posthuma, Danielle

    2015-04-01

    By aggregating data for complex traits in a biologically meaningful way, gene and gene-set analysis constitute a valuable addition to single-marker analysis. However, although various methods for gene and gene-set analysis currently exist, they generally suffer from a number of issues. Statistical power for most methods is strongly affected by linkage disequilibrium between markers, multi-marker associations are often hard to detect, and the reliance on permutation to compute p-values tends to make the analysis computationally very expensive. To address these issues we have developed MAGMA, a novel tool for gene and gene-set analysis. The gene analysis is based on a multiple regression model, to provide better statistical performance. The gene-set analysis is built as a separate layer around the gene analysis for additional flexibility. This gene-set analysis also uses a regression structure to allow generalization to analysis of continuous properties of genes and simultaneous analysis of multiple gene sets and other gene properties. Simulations and an analysis of Crohn's Disease data are used to evaluate the performance of MAGMA and to compare it to a number of other gene and gene-set analysis tools. The results show that MAGMA has significantly more power than other tools for both the gene and the gene-set analysis, identifying more genes and gene sets associated with Crohn's Disease while maintaining a correct type 1 error rate. Moreover, the MAGMA analysis of the Crohn's Disease data was found to be considerably faster as well.
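    The gene analysis described above is, at its core, a joint multiple regression of the phenotype on all SNPs in a gene, tested with an F-test, which is what makes it robust to linkage disequilibrium among markers. A minimal illustrative sketch on simulated toy data (not MAGMA itself):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n, m = 1000, 8                                        # individuals, SNPs in the gene
        geno = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 minor-allele counts
        pheno = 0.3 * geno[:, 0] + rng.normal(size=n)         # one causal SNP

        X = np.column_stack([np.ones(n), geno])               # intercept + all SNPs jointly
        beta, *_ = np.linalg.lstsq(X, pheno, rcond=None)
        rss = float(((pheno - X @ beta) ** 2).sum())          # full-model residuals
        tss = float(((pheno - pheno.mean()) ** 2).sum())      # intercept-only residuals
        F = ((tss - rss) / m) / (rss / (n - m - 1))           # joint gene-level F-statistic
        print("gene p-value:", stats.f.sf(F, m, n - m - 1))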

  9. Formula student suspension setup and laptime simulation tool

    NARCIS (Netherlands)

    van den Heuvel, E.; Besselink, I.J.M.; Nijmeijer, H.

    2013-01-01

    In motorsports, time is usually limited. With the use of dedicated tools for measuring wheel alignment, camber, ride heights, etc., setting up the car can be done quickly and consistently. With the setup sequence and tools described in this report, the time it takes to set up the car has been reduced.

  10. Development of a tool to measure person-centered maternity care in developing settings: validation in a rural and urban Kenyan population.

    Science.gov (United States)

    Afulani, Patience A; Diamond-Smith, Nadia; Golub, Ginger; Sudhinaraset, May

    2017-09-22

    Person-centered reproductive health care is recognized as critical to improving reproductive health outcomes. Yet, little research exists on how to operationalize it. We extend the literature in this area by developing and validating a tool to measure person-centered maternity care. We describe the process of developing the tool and present the results of psychometric analyses to assess its validity and reliability in a rural and urban setting in Kenya. We followed standard procedures for scale development. First, we reviewed the literature to define our construct and identify domains, and developed items to measure each domain. Next, we conducted expert reviews to assess content validity; and cognitive interviews with potential respondents to assess clarity, appropriateness, and relevance of the questions. The questions were then refined and administered in surveys; and survey results used to assess construct and criterion validity and reliability. The exploratory factor analysis yielded one dominant factor in both the rural and urban settings. Three factors with eigenvalues greater than one were identified for the rural sample and four factors identified for the urban sample. Thirty of the 38 items administered in the survey were retained based on the factor loadings and correlation between the items. Twenty-five items load very well onto a single factor in both the rural and urban sample, with five items loading well in either the rural or urban sample, but not in both samples. These 30 items also load on three sub-scales that we created to measure dignified and respectful care, communication and autonomy, and supportive care. Cronbach's alpha for the main scale is greater than 0.8 in both samples, and those for the sub-scales are between 0.6 and 0.8. The main scale and sub-scales are correlated with global measures of satisfaction with maternity services, suggesting criterion validity. We present a 30-item scale with three sub-scales to measure person
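    The reliability figures quoted above follow the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A small self-contained sketch with simulated Likert-style responses (illustrative only, not the study's data):

        import numpy as np

        def cronbach_alpha(items):
            """items: respondents x items score matrix."""
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(500, 1))                          # shared construct
        responses = latent + rng.normal(scale=0.8, size=(500, 30))  # 30-item scale
        print(f"alpha = {cronbach_alpha(responses):.2f}")           # high for a coherent scale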

  11. Automated Experiments on Ad Privacy Settings

    Directory of Open Access Journals (Sweden)

    Datta Amit

    2015-04-01

    Full Text Available To partly address people’s concerns over web tracking, Google has created the Ad Settings webpage to provide information about and some choice over the profiles Google creates on users. We present AdFisher, an automated tool that explores how user behaviors, Google’s ads, and Ad Settings interact. AdFisher can run browser-based experiments and analyze data using machine learning and significance tests. Our tool uses a rigorous experimental design and statistical analysis to ensure the statistical soundness of our results. We use AdFisher to find that the Ad Settings page was opaque about some features of a user’s profile, that it does provide some choice on ads, and that these choices can lead to seemingly discriminatory ads. In particular, we found that visiting webpages associated with substance abuse changed the ads shown but not the settings page. We also found that setting the gender to female resulted in getting fewer instances of an ad related to high-paying jobs than setting it to male. We cannot determine who caused these findings due to our limited visibility into the ad ecosystem, which includes Google, advertisers, websites, and users. Nevertheless, these results can form the starting point for deeper investigations by either the companies themselves or by regulatory bodies.
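    The statistical machinery AdFisher automates can be illustrated with a permutation test on ad counts between two treatment groups of browser agents; the data and group labels below are simulated stand-ins, not AdFisher output.

        import numpy as np

        rng = np.random.default_rng(3)

        def permutation_p(a, b, n_perm=10000):
            """Two-sided permutation test on the difference of group means."""
            observed = abs(a.mean() - b.mean())
            pooled = np.concatenate([a, b])
            hits = 0
            for _ in range(n_perm):
                rng.shuffle(pooled)
                diff = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
                hits += diff >= observed
            return (hits + 1) / (n_perm + 1)

        ads_group_a = rng.poisson(1.8, size=100)   # ad impressions per agent (simulated)
        ads_group_b = rng.poisson(1.2, size=100)
        print("p =", permutation_p(ads_group_a, ads_group_b))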

  12. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Science.gov (United States)

    Alves, Mara L.; Brites, Cláudia; Paulo, Manuel; Carbas, Bruna; Belo, Maria; Mendes-Moreira, Pedro M. R.; Brites, Carla; Bronze, Maria do Rosário; Gunjača, Jerko; Šatović, Zlatko; Vaz Patto, Maria C.

    2017-01-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as a limited set of quality traits assessed and the lack of an accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers' maize populations

  13. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Directory of Open Access Journals (Sweden)

    Mara L. Alves

    2017-12-01

    Full Text Available Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as a limited set of quality traits assessed and the lack of an accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers

  14. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  15. Ceramic-bonded abrasive grinding tools

    Science.gov (United States)

    Holcombe, Jr., Cressie E.; Gorin, Andrew H.; Seals, Roland D.

    1994-01-01

    Abrasive grains such as boron carbide, silicon carbide, alumina, diamond, cubic boron nitride, and mullite are combined with a cement primarily comprised of zinc oxide and a reactive liquid setting agent and solidified into abrasive grinding tools. Such grinding tools are particularly suitable for grinding and polishing stone, such as marble and granite.

  16. Ceramic-bonded abrasive grinding tools

    Science.gov (United States)

    Holcombe, C.E. Jr.; Gorin, A.H.; Seals, R.D.

    1994-11-22

    Abrasive grains such as boron carbide, silicon carbide, alumina, diamond, cubic boron nitride, and mullite are combined with a cement primarily comprised of zinc oxide and a reactive liquid setting agent and solidified into abrasive grinding tools. Such grinding tools are particularly suitable for grinding and polishing stone, such as marble and granite.

  17. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand.

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-12-01

    Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally-produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Three different community settings were selected and were randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. Significant reduction in the pupae per person index in the intervention clusters when compared to the control ones was used as a proxy to determine the impact of this programme. Our community-based dengue vector control programme demonstrated a significant reduction in the pupae per person index during entomological surveys which were conducted at two-month intervals from May 2010 for a total of six months in the intervention and control clusters. The programme also raised awareness in applying eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. An eco-friendly dengue vector control programme was successfully implemented in

  18. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-01-01

    Background Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally-produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Methodology Three different community settings were selected and were randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. Significant reduction in the pupae per person index in the intervention clusters when compared to the control ones was used as a proxy to determine the impact of this programme. Results Our community-based dengue vector control programme demonstrated a significant reduction in the pupae per person index during entomological surveys which were conducted at two-month intervals from May 2010 for a total of six months in the intervention and control clusters. The programme also raised awareness in applying eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. Conclusion An eco-friendly dengue vector control

  19. SplitDist—Calculating Split-Distances for Sets of Trees

    DEFF Research Database (Denmark)

    Mailund, T

    2004-01-01

    We present a tool for comparing a set of input trees, calculating for each pair of trees the split-distances, i.e., the number of splits in one tree not present in the other.
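    The computation itself is simple set arithmetic once each unrooted tree is reduced to its set of non-trivial leaf bipartitions; a toy sketch with hypothetical splits (not the SplitDist implementation):

        # Each split is encoded as a frozenset of the leaves on one side,
        # canonicalised so that every split names the side containing leaf "A".
        def split_distance(splits_a, splits_b):
            """Return (splits only in tree A, splits only in tree B)."""
            return len(splits_a - splits_b), len(splits_b - splits_a)

        tree1 = {frozenset({"A", "B"}), frozenset({"A", "B", "C"})}
        tree2 = {frozenset({"A", "B"}), frozenset({"A", "B", "D"})}
        print(split_distance(tree1, tree2))   # (1, 1): one split unique to each tree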

  20. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  1. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasedena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  2. Tool set for distributed real-time machine control

    Science.gov (United States)

    Carrott, Andrew J.; Wright, Christopher D.; West, Andrew A.; Harrison, Robert; Weston, Richard H.

    1997-01-01

    Demands for increased control capabilities require next generation manufacturing machines to comprise intelligent building elements, physically located at the point where the control functionality is required. Networks of modular intelligent controllers are increasingly designed into manufacturing machines and usable standards are slowly emerging. To implement a control system using off-the-shelf intelligent devices from multi-vendor sources requires a number of well defined activities, including (a) the specification and selection of interoperable control system components, (b) device independent application programming and (c) device configuration, management, monitoring and control. This paper briefly discusses the support for the above machine lifecycle activities through the development of an integrated computing environment populated with an extendable software toolset. The toolset supports machine builder activities such as initial control logic specification, logic analysis, machine modeling, mechanical verification, application programming, automatic code generation, simulation/test, version control, distributed run-time support and documentation. The environment itself consists of system management tools and a distributed object-oriented database which provides storage for the outputs from machine lifecycle activities and specific target control solutions.

  3. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...

  4. CSS Preprocessing: Tools and Automation Techniques

    Directory of Open Access Journals (Sweden)

    Ricardo Queirós

    2018-01-01

    Full Text Available Cascading Style Sheets (CSS) is a W3C specification for a style sheet language used for describing the presentation of a document written in a markup language, more precisely, for styling Web documents. However, in the last few years, the landscape for CSS development has changed dramatically with the appearance of several languages and tools aiming to help developers build clean, modular and performance-aware CSS. These new approaches give developers mechanisms to preprocess CSS rules through the use of programming constructs, defined as CSS preprocessors, with the ultimate goal of bringing those missing constructs to the CSS realm and fostering structured programming of stylesheets. At the same time, a new set of tools appeared, defined as postprocessors, for extension and automation purposes covering a broad set of features ranging from identifying unused and duplicate code to applying vendor prefixes. With all these tools and techniques in hand, developers need to provide a consistent workflow to foster CSS modular coding. This paper aims to present an introductory survey on CSS processors. The survey gathers information on a specific set of processors, categorizes them and compares their features regarding a set of predefined criteria such as: maturity, coverage and performance. Finally, we propose a basic set of best practices in order to set up a simple and pragmatic styling code workflow.

  5. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and the outcomes of its application to a set of five BPMN modeling tools. We report on various

  6. A systematic and practical method for selecting systems engineering tools

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2017-01-01

    The complexity of many types of systems has grown considerably over the last decades, so using appropriate systems engineering tools becomes increasingly important. Starting the tool selection process can be intimidating because organizations often only have a vague idea about what they need. The method described here is based on analyses of the actual needs and the available tools. Grouping needs into categories allows us to obtain a comprehensive set of requirements for the tools. The entire model-based systems engineering discipline was categorized for a modeling tool case to enable development of a tool specification, which has been in successful operation since 2013 at GN Hearing. We further utilized the method to select a set of tools that we used on pilot cases at GN Hearing for modeling, simulating and formally verifying embedded systems.

  7. Construct Maps as a Foundation for Standard Setting

    Science.gov (United States)

    Wyse, Adam E.

    2013-01-01

    Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…

  8. Applications of Soft Sets in K-Algebras

    Directory of Open Access Journals (Sweden)

    N. O. Alshehri

    2013-01-01

    Full Text Available In 1999, Molodtsov introduced the concept of soft set theory as a general mathematical tool for dealing with uncertainty and vagueness. In this paper, we apply the concept of soft sets to K-algebras and investigate some properties of Abelian soft K-algebras. We also introduce the concept of soft intersection K-algebras and investigate some of their properties.

  9. An evaluation system of the setting up of predictive maintenance programmes

    International Nuclear Information System (INIS)

    Carnero, MaCarmen

    2006-01-01

    Predictive Maintenance can provide an increase in safety, quality and availability in industrial plants. However, setting up a Predictive Maintenance Programme is a strategic decision that until now has lacked analysis of the questions related to its establishment, management and control. In this paper, an evaluation system is proposed that supports the decision on the feasibility of setting up such a programme. The evaluation system uses a combination of operational research tools: the Analytic Hierarchy Process, decision rules and Bayesian tools. This system is a support tool available to the managers of Predictive Maintenance Programmes, which can both increase the number of Predictive Maintenance Programmes set up and help avoid the failure of these programmes. The evaluation system has been tested in a petrochemical plant and in the food industry.

  10. Developing a verification tool for calculations dissemination through COBAYA

    International Nuclear Information System (INIS)

    Sabater Alcaraz, A.; Rucabado Rucabado, G.; Cuervo Gomez, D.; Garcia Herranz, N.

    2014-01-01

    The development of a software tool that automates the comparison of results against previous versions of the code and against results from reference models is crucial when implementing new functionalities in the code. The work presented here comprises the generation of this tool and of the set of reference cases that make up the aforementioned matrix. (Author)

  11. setsApp: Set operations for Cytoscape Nodes and Edges [v1; ref status: indexed, http://f1000r.es/3ml]

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2014-07-01

    Full Text Available setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
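    The operations setsApp exposes map directly onto ordinary set arithmetic; the toy lines below mirror the described workflow on node identifiers outside Cytoscape (the gene names are made up for illustration).

        upregulated = {"TP53", "MYC", "EGFR"}
        in_pathway = {"EGFR", "KRAS", "TP53"}

        print(upregulated | in_pathway)   # union: nodes in either set
        print(upregulated & in_pathway)   # intersection: nodes in both sets
        print(upregulated - in_pathway)   # difference: upregulated, not in pathway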

  12. Compilation Tool Chains and Intermediate Representations

    DEFF Research Database (Denmark)

    Mottin, Julien; Pacull, François; Keryell, Ronan

    2014-01-01

    In SMECY, we believe that an efficient tool chain could only be defined when the type of parallelism required by an application domain and the hardware architecture is fixed. Furthermore, we believe that once a set of tools is available, it is possible with reasonable effort to change hardware ar...

  13. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (the least absolute shrinkage and selection operator, LASSO, and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
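    Two of the building blocks named above, PCA for a quick quality-control view of samples and LASSO for variable selection, can be sketched outside Galaxy with scikit-learn on simulated metabolite features (illustrative only, not SECIMTools code):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(4)
        X = rng.normal(size=(40, 200))        # 40 samples x 200 metabolite features
        y = np.repeat([0.0, 1.0], 20)         # two treatment groups
        X[y == 1, :5] += 1.5                  # plant a group effect on 5 features

        scores = PCA(n_components=2).fit_transform(X)   # sample QC coordinates
        print("PC1/PC2 of first sample:", scores[0])

        lasso = LassoCV(cv=5).fit(X, y)                 # sparse feature selection
        print("selected features:", np.flatnonzero(lasso.coef_))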

  14. Development of the policy indicator checklist: a tool to identify and measure policies for calorie-dense foods and sugar-sweetened beverages across multiple settings.

    Science.gov (United States)

    Lee, Rebecca E; Hallett, Allen M; Parker, Nathan; Kudia, Ousswa; Kao, Dennis; Modelska, Maria; Rifai, Hanadi; O'Connor, Daniel P

    2015-05-01

    We developed the policy indicator checklist (PIC) to identify and measure policies for calorie-dense foods and sugar-sweetened beverages to determine how policies are clustered across multiple settings. In 2012 and 2013 we used existing literature, policy documents, government recommendations, and instruments to identify key policies. We then developed the PIC to examine the policy environments across 3 settings (communities, schools, and early care and education centers) in 8 communities participating in the Childhood Obesity Research Demonstration Project. Principal components analysis revealed 5 components related to calorie-dense food policies and 4 components related to sugar-sweetened beverage policies. Communities with higher youth and racial/ethnic minority populations tended to have fewer and weaker policy environments concerning calorie-dense foods and healthy foods and beverages. The PIC was a helpful tool to identify policies that promote healthy food environments across multiple settings and to measure and compare the overall policy environments across communities. There is need for improved coordination across settings, particularly in areas with greater concentration of youths and racial/ethnic minority populations. Policies to support healthy eating are not equally distributed across communities, and disparities continue to exist in nutrition policies.

  15. Faculty Use of Tablet PCs in Teacher Education and K-12 Settings

    Science.gov (United States)

    Steinweg, Sue Byrd; Williams, Sarah Carver; Stapleton, Joy Neal

    2010-01-01

    As new technological tools emerge almost daily, students in public school and university settings are becoming increasingly technologically savvy. Faculty members in both settings have the opportunity to explore tools that have the potential to be valuable resources in a variety of educational environments. The Tablet PC is an example of one such…

  16. HIFSuite: Tools for HDL Code Conversion and Manipulation

    Directory of Open Access Journals (Sweden)

    Bombieri Nicola

    2010-01-01

    Full Text Available HIFSuite is a set of tools and application programming interfaces (APIs) that provide support for modeling and verification of HW/SW systems. The core of HIFSuite is the HDL Intermediate Format (HIF) language, upon which a set of front-end and back-end tools have been developed to allow the conversion of HDL code into HIF code and vice versa. HIFSuite allows designers to manipulate and integrate heterogeneous components implemented by using different hardware description languages (HDLs). Moreover, HIFSuite includes tools, which rely on HIF APIs, for manipulating HIF descriptions in order to support code abstraction/refinement and postrefinement verification.

  17. Hop pellets as an interesting source of antioxidant active compounds

    Directory of Open Access Journals (Sweden)

    Andrea Holubková

    2013-02-01

    Full Text Available Hop is a plant used by humankind for thousands of years. This plant is one of the main and indispensable raw materials for beer production, and it is used in the preparation of various dishes in the cuisine. Hop is also used to inhibit bacterial contamination. Hop extracts are used in medicine for their sedative, antiseptic and antioxidant properties, as a part of many phytopharmaceuticals. The present paper focuses on the extraction of polyphenolic compounds from 4 samples of hop pellets of the varieties Aurora, Saaz, Lublin and Saphir, on the analysis of bioactive substances (polyphenolics and flavonoids) in the prepared extracts, and on the determination of antioxidant activity. The highest content of polyphenolic substances was determined in the samples Lublin (153.06 mg gallic acid equivalents (GAE)/g) and Saaz (151.87 mg GAE/g). The amount of flavonoids in the samples was, in descending order, Saaz > Saphir > Aurora > Lublin. Hop, as a plant, is known for its high content of antioxidant-active substances. Antioxidant activity was determined using three independent spectrophotometric methods: radical scavenging assays using the 2,2′-azino-bis-3-ethylbenzthiazoline-6-sulphonic acid (ABTS) and 1,1-diphenyl-2-picrylhydrazyl (DPPH) radicals, and the ferric reducing antioxidant power (FRAP) assay. The sample Aurora showed the highest ability to scavenge the ABTS radical cation. Antioxidant activity continued to decline in the order Saphir > Lublin > Saaz. The same trend was also observed using the FRAP assay. The most effective DPPH radical scavenging activity was shown by the samples Saaz and Saphir (p>0.05). doi:10.5219/270

  18. Powered mobility intervention: understanding the position of tool use learning as part of implementing the ALP tool.

    Science.gov (United States)

    Nilsson, Lisbeth; Durkin, Josephine

    2017-10-01

    To explore the knowledge necessary for adoption and implementation of the Assessment of Learning Powered mobility use (ALP) tool in different practice settings, for both adults and children. To consult with a diverse population of professionals working with adults and children, in different countries and various settings, who were learning about or using the ALP tool, as part of exploring and implementing research findings. Classical grounded theory with a rigorous comparative analysis of data from informants, together with reflections on our own rich experiences of powered mobility practice and comparisons with the literature. A core category, learning tool use, and a new theory of cognizing tool use, with its interdependent properties of motivation, confidence, permissiveness, attentiveness and co-construction, have emerged, which explain in greater depth what enables the application of the ALP tool. The scientific knowledge base on tool use learning and the new theory convey the information necessary for practitioners to understand how to apply the learning approach of the ALP tool, in order to enable tool use learning through powered mobility practice as a therapeutic intervention in its own right. This opens up the possibility for more children and adults to have access to learning through powered mobility practice. Implications for rehabilitation: Tool use learning through powered mobility practice is a therapeutic intervention in its own right. Powered mobility practice can be used as a rehabilitation tool with individuals who may not need to become powered wheelchair users. Motivation, confidence, permissiveness, attentiveness and co-construction are key properties for enabling the application of the learning approach of the ALP tool. Labelling and the use of language, together with honing observational skills through viewing video footage, are key to developing successful learning partnerships.

  19. BURT: back up and restore tool

    Energy Technology Data Exchange (ETDEWEB)

    Karonis, N.T.

    1994-11-01

    BURT is just one of the tools in the Experimental Physics Industrial Control System (EPICS). In this document we address the problem of backing up and restoring sets of values in databases whose values are continuously changing. In doing so, we present the Back Up and Restore Tool (BURT). In this presentation we provide a theoretical framework that defines the problem and lays the foundation for its solution. BURT is a tool designed and implemented with respect to that theoretical framework. It is not necessary for users of BURT to have an understanding of that framework. It was included in this document only for the purpose of completeness. BURT's basic purpose is to back up sets of values so that they can be later restored. Each time a back up is requested, a new ASCII file is generated. Further, the data values are stored as ASCII strings and therefore not compressed. Both of these facts conspire against BURT as a candidate for an archiver. Users who need an archiver should use a different tool, the Archiver.
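    The core back-up/restore cycle described above, one new ASCII file per snapshot with values stored as strings, can be sketched in a few lines. This is a minimal illustration of the idea, not BURT itself, and the channel names are hypothetical:

        import time

        def backup(values, path):
            """Write one named snapshot of the current values as ASCII."""
            with open(path, "w") as f:
                f.write(f"# snapshot {time.strftime('%Y-%m-%d %H:%M:%S')}\n")
                for name, value in values.items():
                    f.write(f"{name} {value}\n")   # values kept as ASCII strings

        def restore(path):
            """Read a snapshot back into a name -> string mapping."""
            values = {}
            with open(path) as f:
                for line in f:
                    if not line.startswith("#"):
                        name, value = line.split(maxsplit=1)
                        values[name] = value.strip()
            return values

        backup({"beam:current": 101.3, "magnet:setpoint": 4.2}, "snapshot.txt")
        print(restore("snapshot.txt"))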

  20. Performative Tools and Collaborative Learning

    DEFF Research Database (Denmark)

    Minder, Bettina; Lassen, Astrid Heidemann

    The use of performative tools can support collaborative learning across knowledge domains (i.e. science and practice), because they create new spaces for dialog. However, so far innovation literature provides few answers to the important discussion of how to describe the effects and requirements of performative tools used in transdisciplinary events for collaborative learning. The results of this single case study add to extant knowledge and learning literature by providing the reader with a rich description of the characteristics and learning functions of performative tools in transdisciplinary events, and a description of how they interrelate with the specific setting of such an event. Furthermore, they complement previous findings by relating performative tools to collaborative learning for knowledge-intensive ideas.

  1. Hypogeal geological survey in the "Grotta del Re Tiberio" natural cave (Apennines, Italy): a valid tool for reconstructing the structural setting

    Science.gov (United States)

    Ghiselli, Alice; Merazzi, Marzio; Strini, Andrea; Margutti, Roberto; Mercuriali, Michele

    2011-06-01

    As karst systems are natural windows to the underground, speleology, combined with geological surveys, can be a useful tool for helping understand the geological evolution of karst areas. In order to enhance the reconstruction of the structural setting in a gypsum karst area (Vena del Gesso, Romagna Apennines), a detailed analysis has been carried out on hypogeal data. Structural features (faults, fractures, tectonic foliations, bedding) have been mapped in the "Grotta del Re Tiberio" cave, in the nearby gypsum quarry tunnels and open pit benches. Five fracture systems and six fault systems have been identified. The fault systems have been further analyzed through stereographic projections and geometric-kinematic evaluations in order to reconstruct the relative chronology of these structures. This analysis led to the detection of two deformation phases. The results permitted linking of the hypogeal data with the surface data both at a local and regional scale. At the local scale, fracture data collected in the underground have been compared with previous authors' surface data coming from the quarry area. The two data sets show a very good correspondence, as every underground fracture system matches one of the surface fracture systems. Moreover, in the cave, a larger number of fractures belonging to each system could be mapped. At the regional scale, the two deformation phases detected can be integrated in the structural setting of the study area, thereby enhancing the tectonic interpretation of the area (e.g., structures belonging to a new deformation phase, not reported before, have been identified underground). The detailed structural hypogeal survey has, thus, provided very useful data, both by integrating the existing information and revealing new data not detected at the surface. In particular, some small structures (e.g., displacement markers and short fractures) are better preserved in the hypogeal environment than on the surface where the outcropping

  2. Setting research priorities by applying the combined approach matrix.

    Science.gov (United States)

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it could be applied in different settings, giving examples and describing challenges encountered in the process of setting research priorities and providing recommendations for further work in this field. The construct and design of the CAM are explained along with the different steps needed, including the planning and organization of a priority-setting exercise and how it could be applied in different settings. The application of the CAM is illustrated using three examples: the first concerns setting research priorities for a global programme, the second describes application at the country level and the third setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  3. Coupling of 3-D core computational codes and a reactor simulation software for the computation of PWR reactivity accidents induced by thermal-hydraulic transients

    International Nuclear Information System (INIS)

    Raymond, P.; Caruge, D.; Paik, H.J.

    1994-01-01

    The French CEA has recently developed a set of new computer codes for reactor physics computations called the Saphir system. It includes CRONOS-2, a three-dimensional neutronics code; FLICA-4, a three-dimensional core thermal-hydraulics code; and FLICA-S, a code for thermal-hydraulic transients in the primary loops. These codes are coupled and applied to analyze a severe reactivity accident induced by a thermal-hydraulic transient: the steam-line break accident in a pressurized water reactor, followed until soluble boron begins to accumulate in the core. The coupling of these codes has proved to be numerically stable. 15 figs., 7 refs

  4. Seven Basic Tools of Quality Control: An Appropriate Tools for Solving Quality Problems in the Organizations

    OpenAIRE

    Neyestani, Behnam

    2017-01-01

    Dr. Kaoru Ishikawa was the first total quality management guru, and he has been associated with the development and advocacy of using the seven quality control (QC) tools in organizations for problem solving and process improvement. The seven old quality control tools are a set of QC tools that can be used for improving the performance of production processes, from the first step of producing a product or service to the last stage of production. So, the general purpose of this paper was to...

  5. Results of the mock-up experiment on partial LOCA

    International Nuclear Information System (INIS)

    Dreier, J.; Winkler, H.

    1985-01-01

    A mockup experiment has been performed to verify the heat transfer model for a partial loss of coolant accident in the swimming pool reactor SAPHIR. Three coolant channels with the same dimensions as in a SAPHIR fuel element were simulated using four electrically heated plates. For a water level such that the heated plates are partially submerged, plate temperatures remain below 160 deg. C for plate powers of up to 650 W. For water levels low enough to just block the channels, plate temperatures of 400 deg. C are reached for plate powers as low as 60 W. Details of the experiment and further results are discussed. (author)

  6. Results of the mockup experiment on partial LOCA

    International Nuclear Information System (INIS)

    Dreier, J.; Winkler, H.

    1985-01-01

    A mockup experiment has been performed to verify the heat transfer model for a partial loss of coolant accident in the swimming pool reactor SAPHIR. Three coolant channels with the same dimensions as in a SAPHIR fuel element were simulated using four electrically heated plates. For a water level such that the heated plates are partially submerged, plate temperatures remain below 160 °C for plate powers of up to 650 W. For water levels low enough to just block the channels, plate temperatures of 400 °C are reached for plate powers as low as 60 W. Details of the experiment and further results are discussed.

  7. Hierarchical sets: analyzing pangenome structure through scalable set visualizations

    Science.gov (United States)

    2017-01-01

    Motivation: The increase in available microbial genome sequences has resulted in an increase in the size of the pangenomes being analyzed. Current pangenome visualizations are not intended for the pangenome sizes possible today, and new approaches are necessary in order to convert the increase in available information into an increase in knowledge. As the pangenome data structure is essentially a collection of sets, we explore the potential for scalable set visualization as a tool for pangenome analysis. Results: We present a new hierarchical clustering algorithm based on set arithmetics that optimizes the intersection sizes along the branches. The intersection and union sizes along the hierarchy are visualized using a composite dendrogram and icicle plot, which, in pangenome context, shows the evolution of pangenome and core size along the evolutionary hierarchy. Outlying elements, i.e. elements whose presence patterns do not correspond with the hierarchy, can be visualized using hierarchical edge bundles. When applied to pangenome data this plot shows putative horizontal gene transfers between the genomes and can highlight relationships between genomes that are not represented by the hierarchy. We illustrate the utility of hierarchical sets by applying it to a pangenome based on 113 Escherichia and Shigella genomes and find it provides a powerful addition to pangenome analysis. Availability and Implementation: The described clustering algorithm and visualizations are implemented in the hierarchicalSets R package available from CRAN (https://cran.r-project.org/web/packages/hierarchicalSets). Contact: thomasp85@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28130242
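    The clustering idea, merging so that intersection sizes stay large along the branches, can be shown with a greedy toy version (illustrative only; the hierarchicalSets package implements this more carefully): each cluster tracks a core (intersection) and a pangenome (union), and at every step the pair with the largest core intersection is merged.

        from itertools import combinations

        def hierarchical_sets(genomes):
            # each cluster carries (core, pan) gene sets
            clusters = {name: (set(g), set(g)) for name, g in genomes.items()}
            while len(clusters) > 1:
                # pick the pair of clusters with the largest core intersection
                a, b = max(combinations(clusters, 2),
                           key=lambda p: len(clusters[p[0]][0] & clusters[p[1]][0]))
                core_a, pan_a = clusters.pop(a)
                core_b, pan_b = clusters.pop(b)
                core, pan = core_a & core_b, pan_a | pan_b
                print(f"merge {a}+{b}: core={len(core)}, pangenome={len(pan)}")
                clusters[f"({a},{b})"] = (core, pan)

        hierarchical_sets({
            "E1": {"g1", "g2", "g3", "g4"},
            "E2": {"g1", "g2", "g3", "g5"},
            "S1": {"g1", "g2", "g6"},
        })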

  8. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    Science.gov (United States)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  9. Tools for Supporting Distributed Agile Project Planning

    Science.gov (United States)

    Wang, Xin; Maurer, Frank; Morgan, Robert; Oliveira, Josyleuda

    Agile project planning plays an important part in agile software development. In distributed settings, project planning is severely impacted by the lack of face-to-face communication and the inability to share paper index cards amongst all meeting participants. To address these issues, several distributed agile planning tools were developed. The tools vary in features, functions and running platforms. In this chapter, we first summarize the requirements for distributed agile planning. Then we give an overview of existing agile planning tools. We also evaluate existing tools based on the tool requirements. Finally, we present some practical advice for both designers and users of distributed agile planning tools.

  10. Physics Mining of Multi-Source Data Sets

    Science.gov (United States)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures of environmental parameters than ever before by fusing synoptic imagery and time-series measurements. These techniques are general, relevant to observational data including raster, vector, and scalar, and can be applied in all Earth- and environmental-science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.

  11. Proteome-wide Structural Analysis of PTM Hotspots Reveals Regulatory Elements Predicted to Impact Biological Function and Disease*

    Science.gov (United States)

    Dewhurst, Henry; Sundararaman, Niveda

    2016-01-01

    Post-translational modifications (PTMs) regulate protein behavior through modulation of protein-protein interactions, enzymatic activity, and protein stability essential in the translation of genotype to phenotype in eukaryotes. Currently, less than 4% of all eukaryotic PTMs are reported to have biological function - a statistic that continues to decrease with an increasing rate of PTM detection. Previously, we developed SAPH-ire (Structural Analysis of PTM Hotspots) - a method for the prioritization of PTM function potential that has been used effectively to reveal novel PTM regulatory elements in discrete protein families (Dewhurst et al., 2015). Here, we apply SAPH-ire to the set of eukaryotic protein families containing experimental PTM and 3D structure data - capturing 1,325 protein families with 50,839 unique PTM sites organized into 31,747 modified alignment positions (MAPs), of which 2010 (∼6%) possess known biological function. Here, we show that using an artificial neural network model (SAPH-ire NN) trained to identify MAP hotspots with biological function results in prediction outcomes that far surpass the use of single hotspot features, including nearest neighbor PTM clustering methods. We find the greatest enhancement in prediction for positions with PTM counts of five or less, which represent 98% of all MAPs in the eukaryotic proteome and 90% of all MAPs found to have biological function. Analysis of the top 1092 MAP hotspots revealed 267 of truly unknown function (containing 5443 distinct PTMs). Of these, 165 hotspots could be mapped to human KEGG pathways for normal and/or disease physiology. Many high-ranking hotspots were also found to be disease-associated pathogenic sites of amino acid substitution despite the lack of observable PTM in the human protein family member. Taken together, these experiments demonstrate that the functional relevance of a PTM can be predicted very effectively by neural network models, revealing a large but testable

  12. Proteome-wide Structural Analysis of PTM Hotspots Reveals Regulatory Elements Predicted to Impact Biological Function and Disease.

    Science.gov (United States)

    Torres, Matthew P; Dewhurst, Henry; Sundararaman, Niveda

    2016-11-01

    Post-translational modifications (PTMs) regulate protein behavior through modulation of protein-protein interactions, enzymatic activity, and protein stability essential in the translation of genotype to phenotype in eukaryotes. Currently, less than 4% of all eukaryotic PTMs are reported to have biological function - a statistic that continues to decrease with an increasing rate of PTM detection. Previously, we developed SAPH-ire (Structural Analysis of PTM Hotspots) - a method for the prioritization of PTM function potential that has been used effectively to reveal novel PTM regulatory elements in discrete protein families (Dewhurst et al., 2015). Here, we apply SAPH-ire to the set of eukaryotic protein families containing experimental PTM and 3D structure data - capturing 1,325 protein families with 50,839 unique PTM sites organized into 31,747 modified alignment positions (MAPs), of which 2010 (∼6%) possess known biological function. Here, we show that using an artificial neural network model (SAPH-ire NN) trained to identify MAP hotspots with biological function results in prediction outcomes that far surpass the use of single hotspot features, including nearest neighbor PTM clustering methods. We find the greatest enhancement in prediction for positions with PTM counts of five or less, which represent 98% of all MAPs in the eukaryotic proteome and 90% of all MAPs found to have biological function. Analysis of the top 1092 MAP hotspots revealed 267 of truly unknown function (containing 5443 distinct PTMs). Of these, 165 hotspots could be mapped to human KEGG pathways for normal and/or disease physiology. Many high-ranking hotspots were also found to be disease-associated pathogenic sites of amino acid substitution despite the lack of observable PTM in the human protein family member. Taken together, these experiments demonstrate that the functional relevance of a PTM can be predicted very effectively by neural network models, revealing a large but testable

  13. A simulator tool set for evaluating HEVC/SHVC streaming

    Science.gov (United States)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy to use platform for evaluating video transmission and adaptation proposals on large scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure that currently adopted proposals for scalable and range extensions to

  14. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is to both provide an industrial-strength tool set for VDM and also to provide an environment...

  15. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  16. The Vicinity of Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    Program documentation plays a vital role in almost all programming processes. Program documentation flows between separate tools of a modularized environment, and in between the components of an integrated development environment as well. In this paper we discuss the flow of program documentation between program development tools. In the central part of the paper we introduce a mapping of documentation flow between program development tools. In addition we discuss a set of locally developed tools which are related to program documentation. The use of test cases as examples in an interface documentation tool is a noteworthy and valuable contribution to the documentation flow. As an additional contribution we identify several circular relationships which illustrate feedback of documentation to the program editor from other tools in the development environment.

  17. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  18. Set-Up and Punchline as Figure and Ground

    DEFF Research Database (Denmark)

    Keisalo, Marianna Päivikki

    the two that cannot be resolved by appeal to either set-up or punchline, but traps thought between them in an ‘epistemological problem’ as comedian Louis CK put it. For comedians, set-ups and punchlines are basic tools, practical and concrete ways to create and organize material. They are also familiar...

  19. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    Science.gov (United States)

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  20. Studies of prehistoric flint tools by PIXE

    International Nuclear Information System (INIS)

    Smit, Z.

    2002-01-01

    The trace elements preserved on sharp edges of stone tools may provide some information about the worked material, which in turn may serve for the reconstruction of the user's way of life. Since the amount of the deposited worked material is minute, it can only be detected by sensitive fluorescence techniques, induced by electrons in electron microscopes, or by light ions from particle accelerators (PIXE). The trace element deposition was studied by PIXE for a set of experimental tools used for working bone and wood, and for a set of archaeological artefacts dating from the late Paleolithic to the Neolithic period. (author)

  1. Open-source tools for data mining.

    Science.gov (United States)

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  2. Method for automation of tool preproduction

    Science.gov (United States)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool preproduction is the creation or selection of a tool design that secures high process efficiency, tool availability, and quality of the machined surfaces with minimal means and resources. Selecting the appropriate tool design among the set of variants takes the people engaged in tool preparation considerable time. Software has been developed to solve this problem; it helps to create, systematize, and comparatively analyze tool designs in order to identify the rational variant under given production conditions. Systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and to obtain a significant annual saving.

  3. setsApp for Cytoscape: Set operations for Cytoscape Nodes and Edges [v2; ref status: indexed, http://f1000r.es/5lz

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2015-08-01

    Full Text Available setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. Automatic set partitioning and layout functions are also provided. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
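
    The operations setsApp provides are ordinary set algebra. As a minimal illustration, the same union, intersection and difference operations look like this in Python on hypothetical node identifiers (setsApp itself operates on Cytoscape node and edge objects):

```python
# Minimal illustration of the set operations setsApp exposes (union,
# difference, intersection), sketched on hypothetical node identifiers
# rather than actual Cytoscape node objects.
upregulated = {"TP53", "MDM2", "CDKN1A"}
bound_by_p53 = {"MDM2", "CDKN1A", "BAX"}

print(sorted(upregulated | bound_by_p53))  # union: nodes in either set
print(sorted(upregulated & bound_by_p53))  # intersection: nodes in both sets
print(sorted(upregulated - bound_by_p53))  # difference: nodes only in the first
```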

  4. Updating risk prediction tools: a case study in prostate cancer.

    Science.gov (United States)

    Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M

    2012-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured on an external case-control study performed in Texas, USA. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
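
    The core of such an update is Bayes rule in odds form: the risk from the original tool supplies the prior odds, and the new marker contributes a likelihood ratio estimated from the external study. A minimal sketch follows, assuming a single new marker that is normally distributed within cases and within controls; all parameter values are hypothetical, not those of the Prostate Cancer Prevention Trial calculator:

```python
# Minimal sketch of Bayes-rule updating of a risk prediction tool.
# Assumes the new marker is normally distributed within cases and within
# controls; all parameter values below are hypothetical.
from math import exp, sqrt, pi

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def updated_risk(prior_risk, marker_value, case_params, control_params):
    """Posterior odds = prior odds * likelihood ratio of the new marker."""
    prior_odds = prior_risk / (1.0 - prior_risk)
    lr = (normal_pdf(marker_value, *case_params)
          / normal_pdf(marker_value, *control_params))
    posterior_odds = prior_odds * lr
    return posterior_odds / (1.0 + posterior_odds)

# Prior risk of 0.25 from the original calculator; hypothetical %freePSA
# distributions (mean, sd) for cases vs. controls from an external study.
print(updated_risk(0.25, marker_value=12.0,
                   case_params=(10.0, 4.0), control_params=(18.0, 6.0)))
```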

  5. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug' has been improved, which was designed for the debugging of programs for scientific computing. Two points were improved: (1) shortening the elapsed time required to get the appropriate data to visualize; (2) adding new functions that enable comparing and/or combining a set of visualized data originating from two or more different programs. As for the elapsed time for getting data, the improved version of 'vdebug' achieved a shortening of over a hundred times with dbx and pdbx on the SX-4, and of over ten times with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that we could easily check the consistency between the computational results obtained at each calculation step on two different computers: SP and ONYX. In this report, we illustrate how the tool 'vdebug' has been improved with an example. (author)

  6. Teleconferencing in Medical Education: A Useful Tool

    Directory of Open Access Journals (Sweden)

    Lamba Pankaj

    2011-08-01

    Full Text Available Education and healthcare are basic needs for human development. Technological innovation has broadened the access to higher quality healthcare and education without regard to time, distance or geopolitical boundaries. Distance learning has gained popularity as a means of learning in recent years due to widely distributed learners, busy schedules and rising travel costs. Teleconferencing is also a very useful tool as a distance learning method. Teleconferencing is a real-time and live interactive programme in which one set of participants are at one or more locations and the other set of participants are at another. The teleconference allows for interaction, including audio and/or video, and possibly other modalities, between at least two sites. Various methods are available for setting up a teleconferencing unit. A detailed review of the trend in the use of teleconferencing in medical education was conducted using Medline and a literature search. Teleconferencing was found to be a very useful tool in continuing medical education (CME), postgraduate medical education, undergraduate medical education, telementoring and many other situations. The use of teleconferencing in medical education has many advantages including savings in terms of travel costs and time. It gives access to the best educational resources and experience without any limitations of boundaries of distance and time. It encourages two-way interactions and facilitates learning in adults. Despite having some pitfalls in its implementation it is now being seen as an important tool in facilitating learning in medicine and many medical schools and institutions are adapting this novel tool.

  7. Studying the Complex Expression Dependences between Sets of Coexpressed Genes

    Directory of Open Access Journals (Sweden)

    Mario Huerta

    2014-01-01

    Full Text Available Organisms simplify the orchestration of gene expression by coregulating genes whose products function together in the cell. The use of clustering methods to obtain sets of coexpressed genes from expression arrays is very common; nevertheless there are no appropriate tools to study the expression networks among these sets of coexpressed genes. The aim of the developed tools is to allow studying the complex expression dependences that exist between sets of coexpressed genes. For this purpose, we start detecting the nonlinear expression relationships between pairs of genes, plus the coexpressed genes. Next, we form networks among sets of coexpressed genes that maintain nonlinear expression dependences between all of them. The expression relationship between the sets of coexpressed genes is defined by the expression relationship between the skeletons of these sets, where this skeleton represents the coexpressed genes with a well-defined nonlinear expression relationship with the skeleton of the other sets. As a result, we can study the nonlinear expression relationships between a target gene and other sets of coexpressed genes, or start the study from the skeleton of the sets, to study the complex relationships of activation and deactivation between the sets of coexpressed genes that carry out the different cellular processes present in the expression experiments.

  8. Long Term Care Minimum Data Set (MDS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Long-Term Care Minimum Data Set (MDS) is a standardized, primary screening and assessment tool of health status that forms the foundation of the comprehensive...

  9. Diagnostic accuracy of WHO verbal autopsy tool for ascertaining causes of neonatal deaths in the urban setting of Pakistan: a hospital-based prospective study.

    Science.gov (United States)

    Soofi, Sajid Bashir; Ariff, Shabina; Khan, Ubaidullah; Turab, Ali; Khan, Gul Nawaz; Habib, Atif; Sadiq, Kamran; Suhag, Zamir; Bhatti, Zaid; Ahmed, Imran; Bhal, Rajiv; Bhutta, Zulfiqar Ahmed

    2015-10-05

    Globally, clinical certification of the cause of neonatal death is not commonly available in developing countries. Under such circumstances it is imperative to use the available WHO verbal autopsy tool to ascertain causes of death for strategic health planning in countries where resources are limited and the burden of neonatal death is high. The study explores the diagnostic accuracy of the WHO revised verbal autopsy tool for ascertaining the causes of neonatal deaths against a reference standard diagnosis obtained from standardized clinical and supportive hospital data. All neonatal deaths were recruited between August 2006 and February 2008 from two tertiary teaching hospitals in Sindh Province, Pakistan. The reference standard cause of death was established by two senior pediatricians within 2 days of occurrence of death using the International Cause of Death coding system. For verbal autopsy, a trained female community health worker interviewed the mother or caretaker of the deceased within 2-6 weeks of death using a modified WHO verbal autopsy tool. Cause of death was assigned by 2 trained pediatricians. The performance was assessed in terms of sensitivity and specificity. Out of 626 neonatal deaths, cause-specific mortality fractions for neonatal deaths were almost similar in both verbal autopsy and reference standard diagnosis. Sensitivity of verbal autopsy was more than 93% for diagnosing prematurity and 83.5% for birth asphyxia. However, the verbal autopsy did not have acceptable accuracy for diagnosing congenital malformation (57%). The specificity for all five major causes of neonatal deaths was greater than 90%. The WHO revised verbal autopsy tool had reasonable validity in determining causes of neonatal deaths. The tool can be used in resource-limited community-based settings where the neonatal mortality rate is high and death certificates from hospitals are not available.
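
    The sensitivity and specificity reported above are the usual confusion-matrix ratios. A minimal sketch with hypothetical counts, chosen only to reproduce figures of the same order as those reported:

```python
# Minimal sketch of the diagnostic-accuracy measures used above, computed
# from a 2x2 confusion matrix; all counts below are hypothetical.
def sensitivity(tp, fn):
    """Fraction of true cases the verbal autopsy correctly assigns."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-cases the verbal autopsy correctly rules out."""
    return tn / (tn + fp)

# Hypothetical counts for one cause of death (e.g. birth asphyxia):
tp, fn, tn, fp = 167, 33, 390, 36
print(f"sensitivity = {sensitivity(tp, fn):.1%}")   # 83.5%
print(f"specificity = {specificity(tn, fp):.1%}")   # 91.5%
```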

  10. NIKE: a new clinical tool for establishing levels of indications for cataract surgery.

    Science.gov (United States)

    Lundström, Mats; Albrecht, Susanne; Håkansson, Ingemar; Lorefors, Ragnhild; Ohlsson, Sven; Polland, Werner; Schmid, Andrea; Svensson, Göran; Wendel, Eva

    2006-08-01

    The purpose of this study was to construct a new clinical tool for establishing levels of indications for cataract surgery, and to validate this tool. Teams from nine eye clinics reached an agreement about the need to develop a clinical tool for setting levels of indications for cataract surgery and about the items that should be included in the tool. The tool was to be called 'NIKE' (Nationell Indikationsmodell för Kataraktextraktion). The Canadian Cataract Priority Criteria Tool served as a model for the NIKE tool, which was modified for Swedish conditions. Items included in the tool were visual acuity of both eyes, patients' perceived difficulties in day-to-day life, cataract symptoms, the ability to live independently, and medical/ophthalmic reasons for surgery. The tool was validated and tested in 343 cataract surgery patients. Validity, stability and reliability were tested and the outcome of surgery was studied in relation to the indication setting. Four indication groups (IGs) were suggested. The group with the greatest indications for surgery was named group 1 and that with the lowest, group 4. Validity was proved to be good. Surgery had the greatest impact on the group with the highest indications for surgery. Test-retest reliability test and interexaminer tests of indication settings showed statistically significant intraclass correlations (intraclass correlation coefficients [ICCs] 0.526 and 0.923, respectively). A new clinical tool for indication setting in cataract surgery is presented. This tool, the NIKE, takes into account both visual acuity and the patient's perceived problems in day-to-day life because of cataract. The tool seems to be stable and reliable and neutral towards different examiners.

  11. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  12. Disease management index of potential years of life lost as a tool for setting priorities in national disease control using OECD health data.

    Science.gov (United States)

    Jang, Sung-In; Nam, Jung-Mo; Choi, Jongwon; Park, Eun-Cheol

    2014-03-01

    Limited healthcare resources make it necessary to maximize efficiency in disease management at the country level by priority-setting according to disease burden. To make the best priority settings, it is necessary to measure health status and have standards for its judgment, as well as to consider disease management trends among nations. We used 17 International Classification of Diseases (ICD) categories of potential years of life lost (YPLL) from Organization for Economic Co-operation and Development (OECD) health data for 2012, 37 disease-diagnosis YPLL from OECD health data for 2009 across 22 countries, and disability-adjusted life years (DALY) from the World Health Organization (WHO). We set a range of -1 to 1 for each YPLL per disease in a nation (position value for relative comparison, PARC). Changes over 5 years were also accounted for in the disease management index (DMI). In terms of ICD categories, the DMI indicated specific areas for priority setting for different countries with regard to managing disease treatment and diagnosis. Our study suggests that the DMI is a realistic index that reflects trend changes over the past 5 years to the present state, and that PARC is an easy index for identifying relative status. Moreover, unlike existing indices, DMI and PARC make it easy to conduct multiple comparisons among countries and diseases. DMI and PARC are therefore useful tools for policy implications and for future studies incorporating them and other existing indexes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
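
    Read this way, PARC appears to be a relative position score: a country's YPLL for a disease rescaled into the range -1 to 1 against the values of the compared countries. The sketch below rests on that assumption (the exact published formula may differ), and all figures in it are hypothetical:

```python
# Sketch of a position value for relative comparison (PARC): a country's
# YPLL rescaled into [-1, 1] against the range observed across compared
# countries. This reading is an assumption for illustration; the exact
# published formula may differ. All figures are hypothetical.
def parc(value, values):
    lo, hi = min(values), max(values)
    mid = (lo + hi) / 2.0
    half_range = (hi - lo) / 2.0
    return (value - mid) / half_range  # -1 = lowest YPLL, +1 = highest

ypll_per_100k = {"A": 210.0, "B": 450.0, "C": 300.0, "D": 380.0}
scores = {c: parc(v, ypll_per_100k.values()) for c, v in ypll_per_100k.items()}
print(scores)  # country A -> -1.0, country B -> +1.0
```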

  13. Novel molecular diagnostic tools for malaria elimination: a review of options from the point of view of high-throughput and applicability in resource limited settings.

    Science.gov (United States)

    Britton, Sumudu; Cheng, Qin; McCarthy, James S

    2016-02-16

    As malaria transmission continues to decrease, an increasing number of countries will enter pre-elimination and elimination. To interrupt transmission, changes in control strategies are likely to require more accurate identification of all carriers of Plasmodium parasites, both symptomatic and asymptomatic, using diagnostic tools that are highly sensitive, high throughput and with fast turnaround times preferably performed in local health service settings. Currently available immunochromatographic lateral flow rapid diagnostic tests and field microscopy are unlikely to consistently detect infections at parasite densities less than 100 parasites/µL making them insufficiently sensitive for detecting all carriers. Molecular diagnostic platforms, such as PCR and LAMP, are currently available in reference laboratories, but at a cost both financially and in turnaround time. This review describes the recent progress in developing molecular diagnostic tools in terms of their capacity for high throughput and potential for performance in non-reference laboratories for malaria elimination.

  14. Integration of g4tools in Geant4

    International Nuclear Information System (INIS)

    Hřivnáčová, Ivana

    2014-01-01

    g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows one to create and manipulate histograms and ntuples, and to write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.

  15. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  16. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings

  17. Reachable Sets of Hidden CPS Sensor Attacks : Analysis and Synthesis Tools

    NARCIS (Netherlands)

    Murguia, Carlos; van de Wouw, N.; Ruths, Justin; Dochain, Denis; Henrion, Didier; Peaucelle, Dimitri

    2017-01-01

    For given system dynamics, control structure, and fault/attack detection procedure, we provide mathematical tools, in terms of Linear Matrix Inequalities (LMIs), for characterizing and minimizing the set of states that sensor attacks can induce in the system while keeping the alarm rate of the

  18. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.

  19. Is Google Trends a reliable tool for digital epidemiology? Insights from different clinical settings.

    Science.gov (United States)

    Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe

    2017-09-01

    Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, for both common diseases with lower media coverage and less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella Pneumophila pneumonia", and "Ebola fever"), which recently received major focus by the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colic, epistaxis and mushroom poisoning. Only when searching for the term "mushroom" alone did the Google Trends search generate a seasonal pattern that almost overlaps with the epidemiological profile, but this was probably mostly due to searches for harvesting and cooking rather than for poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella Pneumophila pneumonia and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or relatively rare diseases with a higher audience. Overall, Google Trends seems to be more influenced by media clamor than by true epidemiological burden. Copyright © 2017 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  20. A Set of Annotation Interfaces for Alignment of Parallel Corpora

    Directory of Open Access Journals (Sweden)

    Singh Anil Kumar

    2014-09-01

    Full Text Available Annotation interfaces for parallel corpora which fit in well with other tools can be very useful. We describe a set of annotation interfaces which fulfill this criterion. This set includes a sentence alignment interface, two different word or word group alignment interfaces and an initial version of a parallel syntactic annotation alignment interface. These tools can be used for manual alignment, or they can be used to correct automatic alignments. Manual alignment can be performed in combination with certain kinds of linguistic annotation. Most of these interfaces use a representation called the Shakti Standard Format that has been found to be very robust and has been used for large and successful projects. It ties together the different interfaces, so that the data created by them is portable across all tools which support this representation. The existence of a query language for data stored in this representation makes it possible to build tools that allow easy search and modification of annotated parallel data.

  1. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  2. The histogramming tool hparse

    International Nuclear Information System (INIS)

    Nikulin, V.; Shabratova, G.

    2005-01-01

    A general-purpose package aimed at simplifying the histogramming in data analysis is described. The proposed dedicated language for writing histogramming scripts provides an effective and flexible tool for the definition of a complicated histogram set. The script is more transparent and much easier to maintain than the corresponding C++ code. In TTree analysis it can be a good complement to the TTreeViewer class: the TTreeViewer is used for the choice of the required histogram/cut set, while hparse enables one to generate code for systematic analysis.

  3. Health system context and implementation of evidence-based practices-development and validation of the Context Assessment for Community Health (COACH) tool for low- and middle-income settings.

    Science.gov (United States)

    Bergström, Anna; Skeen, Sarah; Duc, Duong M; Blandon, Elmer Zelaya; Estabrooks, Carole; Gustavsson, Petter; Hoa, Dinh Thi Phuong; Källestål, Carina; Målqvist, Mats; Nga, Nguyen Thu; Persson, Lars-Åke; Pervin, Jesmin; Peterson, Stefan; Rahman, Anisur; Selling, Katarina; Squires, Janet E; Tomlinson, Mark; Waiswa, Peter; Wallin, Lars

    2015-08-15

    The gap between what is known and what is practiced results in health service users not benefitting from advances in healthcare, and in unnecessary costs. A supportive context is considered a key element for successful implementation of evidence-based practices (EBP). There were no tools available for the systematic mapping of aspects of organizational context influencing the implementation of EBPs in low- and middle-income countries (LMICs). Thus, this project aimed to develop and psychometrically validate a tool for this purpose. The development of the Context Assessment for Community Health (COACH) tool was premised on the context dimension in the Promoting Action on Research Implementation in Health Services framework, and is a derivative product of the Alberta Context Tool. Its development was undertaken in Bangladesh, Vietnam, Uganda, South Africa and Nicaragua in six phases: (1) defining dimensions and draft tool development, (2) content validity amongst in-country expert panels, (3) content validity amongst international experts, (4) response process validity, (5) translation and (6) evaluation of psychometric properties amongst 690 health workers in the five countries. The tool was validated for use amongst physicians, nurse/midwives and community health workers. The six phases of development resulted in a good fit between the theoretical dimensions of the COACH tool and its psychometric properties. The tool has 49 items measuring eight aspects of context: Resources, Community engagement, Commitment to work, Informal payment, Leadership, Work culture, Monitoring services for action and Sources of knowledge. Aspects of organizational context that were identified as influencing the implementation of EBPs in high-income settings were also found to be relevant in LMICs. However, there were additional aspects of context of relevance in LMICs specifically Resources, Community engagement, Commitment to work and Informal payment. Use of the COACH tool will allow

  4. PeptideNavigator: An interactive tool for exploring large and complex data sets generated during peptide-based drug design projects.

    Science.gov (United States)

    Diller, Kyle I; Bayden, Alexander S; Audie, Joseph; Diller, David J

    2018-01-01

    There is growing interest in peptide-based drug design and discovery. Due to their relatively large size, polymeric nature, and chemical complexity, the design of peptide-based drugs presents an interesting "big data" challenge. Here, we describe an interactive computational environment, PeptideNavigator, for naturally exploring the tremendous amount of information generated during a peptide drug design project. The purpose of PeptideNavigator is the presentation of large and complex experimental and computational data sets, particularly 3D data, so as to enable multidisciplinary scientists to make optimal decisions during a peptide drug discovery project. PeptideNavigator provides users with numerous viewing options, such as scatter plots, sequence views, and sequence frequency diagrams. These views allow for the collective visualization and exploration of many peptides and their properties, ultimately enabling the user to focus on a small number of peptides of interest. To drill down into the details of individual peptides, PeptideNavigator provides users with a Ramachandran plot viewer and a fully featured 3D visualization tool. Each view is linked, allowing the user to seamlessly navigate from collective views of large peptide data sets to the details of individual peptides with promising property profiles. Two case studies, based on MHC-1A activating peptides and MDM2 scaffold design, are presented to demonstrate the utility of PeptideNavigator in the context of disparate peptide-design projects. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  6. Simultaneous Scheduling of Jobs, AGVs and Tools Considering Tool Transfer Times in Multi Machine FMS By SOS Algorithm

    Science.gov (United States)

    Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.

    2017-08-01

    This article addresses the simultaneous scheduling of machines, AGVs and tools in a multi-machine Flexible Manufacturing System (FMS), where machines are allowed to share the tools, considering transfer times of jobs and tools between machines, in order to generate optimal sequences that minimize the makespan. The performance of an FMS is expected to improve through effective utilization of its resources, by proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent tool for solving optimization problems like scheduling and has proven itself as a good alternative. The proposed SOS algorithm is first tested on 22 job sets with makespan as the objective for scheduling of machines and tools, where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperformed the existing methods. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools considering transfer times of jobs and tools, to determine the best optimal sequences that minimize the makespan.
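
    SOS itself is a population-based metaheuristic that updates candidate solutions through three phases inspired by symbiosis: mutualism, commensalism and parasitism. A minimal continuous-domain sketch follows; applying it to job/AGV/tool scheduling would additionally require encoding sequences as vectors, which is not shown here:

```python
# Minimal sketch of Symbiotic Organisms Search (SOS) for continuous
# minimization. Scheduling applications would additionally encode job/
# AGV/tool sequences as real vectors; this sketch is illustrative only.
import random

def sos(objective, dim, lo, hi, pop_size=30, iters=200):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]

    def clamp(x):
        return [min(max(v, lo), hi) for v in x]

    def try_replace(i, cand):
        f = objective(cand)
        if f < fit[i]:          # greedy acceptance, as in standard SOS
            pop[i], fit[i] = cand, f

    for _ in range(iters):
        best = pop[fit.index(min(fit))]
        for i in range(pop_size):
            j = random.choice([k for k in range(pop_size) if k != i])
            # Mutualism: i and j both move toward the best via a mutual vector.
            bf1, bf2 = random.randint(1, 2), random.randint(1, 2)
            mutual = [(a + b) / 2 for a, b in zip(pop[i], pop[j])]
            try_replace(i, clamp([a + random.random() * (g - m * bf1)
                                  for a, g, m in zip(pop[i], best, mutual)]))
            try_replace(j, clamp([b + random.random() * (g - m * bf2)
                                  for b, g, m in zip(pop[j], best, mutual)]))
            # Commensalism: i benefits from j; j is unaffected.
            try_replace(i, clamp([a + random.uniform(-1, 1) * (g - b)
                                  for a, g, b in zip(pop[i], best, pop[j])]))
            # Parasitism: a mutated copy of i tries to displace j.
            parasite = [random.uniform(lo, hi) if random.random() < 0.5 else a
                        for a in pop[i]]
            try_replace(j, parasite)
    return min(zip(fit, pop))

print(sos(lambda x: sum(v * v for v in x), dim=5, lo=-10, hi=10))
```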

  7. Poster Abstract: Towards NILM for Industrial Settings

    DEFF Research Database (Denmark)

    Holmegaard, Emil; Kjærgaard, Mikkel Baun

    2015-01-01

    Industry consumes a large share of the worldwide electricity consumption. Disaggregated information about electricity consumption enables better decision-making and feedback tools to optimize electricity consumption. In industrial settings electricity loads consist of a variety of equipment, which... consumption for six months, at an industrial site. In this poster abstract we provide initial results for how industrial equipment challenges NILM algorithms. These results thereby open up for evaluating the use of NILM in industrial settings...

  8. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  9. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    Energy Technology Data Exchange (ETDEWEB)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
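
    One common route for such wrapping is a foreign-function layer such as ctypes (the abstract does not name the four tools evaluated; ctypes is used here purely for illustration). The sketch below wraps the platform's C math library rather than Sapphire's actual libraries, whose symbols are not given in the abstract; C-linkage entry points of a C++ library would be wrapped analogously:

```python
# Illustrative ctypes wrapping of a C-callable library. The C math
# library (libm) stands in for Sapphire's C++ libraries, whose actual
# symbols and signatures are not given in the abstract.
import ctypes
import ctypes.util

# Locate the platform's math library; on Linux this resolves to libm.so.6.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the foreign signature: double cos(double)
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

def cos(x):
    """Pythonic wrapper hiding the C calling convention."""
    return libm.cos(ctypes.c_double(x))

print(cos(0.0))  # 1.0
```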

  10. Developing Expert Tools for the LHC

    CERN Document Server

    AUTHOR|(CDS)2160780; Timkó, Helga

    2017-10-12

    This Thesis describes software tools developed for automated, precision setting-up of low-power level radio frequency (LLRF) loops, which will help expert users to have better control and faster setting-up of the radio-frequency (RF) system in the Large Hadron Collider (LHC) experiment. The aim was to completely redesign the software architecture, to add new features, to improve certain algorithms, and to increase the automation.

  11. Hope in severe disease: a review of the literature on the construct and the tools for assessing hope in the psycho-oncologic setting.

    Science.gov (United States)

    Piccinelli, Claudia; Clerici, Carlo Alfredo; Veneroni, Laura; Ferrari, Andrea; Proserpio, Tullio

    2015-01-01

    Research on the topic of hope began a long time ago but, more recently, interest in this construct has focused mainly on the development of psychometric tools for its assessment. The 2 steps of the present article are defining the construct of hope by completing a preliminary review of the literature and analyzing the tools used to assess hope in the setting of oncologic medicine, conducting a systematic review of the existing scientific literature. Our study was conducted in 2 stages. The first stage involved a nonsystematic preliminary review of the literature, the second a systematic search in all the medical journals contained in the Medline database as of 2012. The literature identified at the first stage was divided according to several topical categories, i.e., theoretical, empirical, and clinical works on the construct of hope. In the second systematic search, we identified the main psychometric tools used to measure hope in the field of clinical oncology and assessed their validity. A total of 22 articles were identified. What emerged when we pooled the findings of our 2 lines of research was that, despite its broad theoretical definitions, the construct of hope can be broken down to a few constituent elements when hope is studied using currently available psychometric tools. In particular, these identified constituent elements were coping, spiritual well-being, quality of life, distress, and depression. The factors contained in the construct of hope include temporality, future, expectancy, motivation, and interconnectedness. The review of the scientific literature does not reveal a clear definition of hope. Multidisciplinary studies are needed to communicate different perspectives (medical, psychological, spiritual, theological) among each other for better definition of the constituent elements of hope in order to support the hope with specific interventions.

  12. Identification of facilitators and barriers to residents' use of a clinical reasoning tool.

    Science.gov (United States)

    DiNardo, Deborah; Tilstra, Sarah; McNeil, Melissa; Follansbee, William; Zimmer, Shanta; Farris, Coreen; Barnato, Amber E

    2018-03-28

    While there is some experimental evidence to support the use of cognitive forcing strategies to reduce diagnostic error in residents, the potential usability of such strategies in the clinical setting has not been explored. We sought to test the effect of a clinical reasoning tool on diagnostic accuracy and to obtain feedback on its usability and acceptability. We conducted a randomized behavioral experiment testing the effect of this tool on diagnostic accuracy on written cases among postgraduate year 3 (PGY-3) residents at a single internal medicine residency program in 2014. Residents completed written clinical cases in a proctored setting with and without prompts to use the tool. The tool encouraged reflection on concordant and discordant aspects of each case. We used random effects regression to assess the effect of the tool on diagnostic accuracy of the independent case sets, controlling for case complexity. We then conducted audiotaped, structured focus group debriefing sessions and reviewed the tapes for facilitators of and barriers to use of the tool. Of 51 eligible PGY-3 residents, 34 (67%) participated in the study. Average diagnostic accuracy increased from 52% to 60% with the tool, a difference that just met the test for statistical significance in adjusted analyses (p=0.05). Residents reported that the tool was generally acceptable and understandable but did not recognize its utility for use with simple cases, suggesting the presence of overconfidence bias. A clinical reasoning tool improved residents' diagnostic accuracy on written cases. Overconfidence bias is a potential barrier to its use in the clinical setting.

  13. Gaussian process regression for tool wear prediction

    Science.gov (United States)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

    To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique, based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR), for accurate, real-time monitoring of the in-process tool wear parameter (flank wear width). KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by the GPR model. Moreover, GPR performs better than artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively within the GPR model. The presence of noise, however, seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, so that the confidence interval is greatly compressed and smoothed, which is conducive to monitoring the tool wear accurately. Moreover, the kernel parameter in KPCA_IRBF can be selected from a much larger region than in the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests were conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
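
    For readers unfamiliar with the pipeline, the sketch below reproduces its general shape with scikit-learn on synthetic data: a kernel-PCA feature mapping that increases the dimensionality, followed by Gaussian process regression returning predictions with confidence bounds. It is not the authors' KPCA_IRBF implementation, whose kernel construction differs.

      # Kernel-PCA feature mapping plus GP regression with confidence bounds,
      # on synthetic "cutting features -> flank wear" data (not KPCA_IRBF).
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, size=(80, 3))    # stand-in force/vibration features
      y = 0.3 * X[:, 0] + 0.2 * X[:, 1] ** 2 + rng.normal(0, 0.02, 80)  # wear, mm

      # Nonlinear dimension increment: 3 input features -> 6 kernel components.
      kpca = KernelPCA(n_components=6, kernel="rbf", gamma=2.0)
      Z = kpca.fit_transform(X)

      # WhiteKernel lets the GP model the measurement noise explicitly.
      gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
      gpr.fit(Z, y)

      mean, std = gpr.predict(kpca.transform(rng.uniform(0, 1, (5, 3))),
                              return_std=True)
      for m, s in zip(mean, std):
          print(f"predicted wear {m:.3f} mm, 95% interval +/- {1.96 * s:.3f} mm")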

  14. Tools and data services registry

    DEFF Research Database (Denmark)

    Ison, Jon; Rapacki, Kristoffer; Ménager, Hervé

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a...

  15. Early wound infection identification using the WIRE tool in community health care settings: An audit report.

    Science.gov (United States)

    Siaw-Sakyi, Vincent

    2017-12-01

    Wound infection is proving to be a challenge for health care professionals. The complications associated with wound infection, and its cost, are immense and can lead to death in extreme cases. Current management of wound infection is largely subjective and relies on the knowledge of the health care professional to identify and initiate treatment. In response, we have developed an infection prediction and assessment tool. The Wound Infection Risk-Assessment and Evaluation tool (WIRE) and its management strategy aim to bring objectivity to infection prediction, assessment and management. A local audit indicated a high infection prediction rate. More work is being done to improve its effectiveness.

  16. FACET CLASSIFICATIONS OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2013-12-01

    The article deals with the classification of e-learning tools based on the facet method, which separates a parallel set of objects into independent classification groups; it assumes no rigid classification structure and no pre-built finite groups: classification groups are formed by combining values taken from the relevant facets. An attempt to systematize the existing classifications of e-learning tools from the standpoint of classification theory is made for the first time. Modern Ukrainian and foreign facet classifications of e-learning tools are described, and their positive and negative features compared to classifications based on a hierarchical method are analyzed. The author's original facet classification of e-learning tools is proposed.
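
    A toy sketch of the facet principle itself: classification groups are not pre-built, they are generated by combining one value from each independent facet. The facet names and values below are invented for the example.

      # Groups are generated, not pre-built: one value from each facet.
      # Facet names and values are invented for the example.
      from itertools import product

      facets = {
          "purpose":  ["drill", "assessment", "simulation"],
          "delivery": ["web", "desktop", "mobile"],
          "license":  ["free", "commercial"],
      }

      groups = [dict(zip(facets, combo)) for combo in product(*facets.values())]
      print(len(groups), "possible groups; e.g.", groups[0])
      # -> 18 possible groups; e.g. {'purpose': 'drill', 'delivery': 'web',
      #    'license': 'free'}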

  17. Dereplication, Aggregation and Scoring Tool (DAS Tool) v1.0

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-01

    Communities of uncultivated microbes are critical to ecosystem function and microorganism health, and a key objective of metagenomic studies is to analyze organism-specific metabolic pathways and reconstruct community interaction networks. This requires accurate assignment of genes to genomes, yet existing binning methods often fail to predict a reasonable number of genomes and report many bins of low quality and completeness. Furthermore, the performance of existing algorithms varies between samples and biotypes. Here, we present a dereplication, aggregation and scoring strategy, DAS Tool, that combines the strengths of a flexible set of established binning algorithms. DAS Tool, applied to a constructed community, generated more accurate bins than any automated method. Further, when applied to samples of different complexity, including soil, natural oil seeps, and the human gut, DAS Tool recovered substantially more near-complete genomes than any single binning method alone. Included were three genomes from a novel lineage. The ability to reconstruct many near-complete genomes from metagenomics data will greatly advance genome-centric analyses of ecosystems.

  18. To select the best tool for generating 3D maintenance data and to set the detailed process for obtaining the 3D maintenance data

    Science.gov (United States)

    Prashanth, B. N.; Roy, Kingshuk

    2017-07-01

    Three-dimensional (3D) maintenance data provides a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to page through huge paper manuals, or to keep returning to a computer, while maintaining a machine, which makes the maintenance work fatiguing. A 3D animation, by contrast, makes maintenance work very simple, since there is no language barrier. The research deals with the generation of 3D maintenance data for any given machine. The best tool for producing such data is selected and analyzed, and, using that tool, a detailed process for extracting the 3D maintenance data for any machine is set out. 3D maintenance reduces the use of large volumes of manuals, which invite human error and make the operator's work fatiguing; it would therefore help in training and maintenance and would increase productivity. When compared with Cortona 3D and Deep Exploration, 3Dvia proves to be the better tool: it is strong in data translation, has the best renderings of the three, is very user friendly, offers various options for creating 3D animations, and its Interactive Electronic Technical Publication (IETP) integration is also superior. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.

  19. Development of Application Programming Tool for Safety Grade PLC (POSAFE-Q)

    International Nuclear Information System (INIS)

    Koo, Kyungmo; You, Byungyong; Kim, Tae-Wook; Cho, Sengjae; Lee, Jin S.

    2006-01-01

    The pSET (POSAFE-Q Software Engineering Tool) is an application programming tool for the POSAFE-Q, a safety-grade programmable logic controller (PLC) developed for the reactor protection system of nuclear power plants. The pSET provides an integrated development environment (IDE) that includes editors, a compiler, a simulator, a downloader, a debugger, and a monitor. The pSET supports the IEC 61131-3 standard software model and languages such as LD (ladder diagram) and FBD (function block diagram), two of the most widely used PLC programming languages in industry. The pSET will also support the SFC (sequential function chart) language. The pSET was developed as part of the Korea Nuclear Instrumentation and Control System (KNICS) project.

  20. The Electronic Patient Reported Outcome Tool: Testing Usability and Feasibility of a Mobile App and Portal to Support Care for Patients With Complex Chronic Disease and Disability in Primary Care Settings

    Science.gov (United States)

    Gill, Ashlinder; Khan, Anum Irfan; Hans, Parminder Kaur; Kuluski, Kerry; Cott, Cheryl

    2016-01-01

    Background People experiencing complex chronic disease and disability (CCDD) face some of the greatest challenges of any patient population. Primary care providers find it difficult to manage multiple discordant conditions and symptoms and often complex social challenges experienced by these patients. The electronic Patient Reported Outcome (ePRO) tool is designed to overcome some of these challenges by supporting goal-oriented primary care delivery. Using the tool, patients and providers collaboratively develop health care goals on a portal linked to a mobile device to help patients and providers track progress between visits. Objectives This study tested the usability and feasibility of adopting the ePRO tool into a single interdisciplinary primary health care practice in Toronto, Canada. The Fit between Individuals, Task, and Technology (FITT) framework was used to guide our assessment and explore whether the ePRO tool is: (1) feasible for adoption in interdisciplinary primary health care practices and (2) usable from both the patient and provider perspectives. This usability pilot is part of a broader user-centered design development strategy. Methods A 4-week pilot study was conducted in which patients and providers used the ePRO tool to develop health-related goals, which patients then monitored using a mobile device. Patients and providers collaboratively set goals using the system during an initial visit and had at least 1 follow-up visit at the end of the pilot to discuss progress. Focus groups and interviews were conducted with patients and providers to capture usability and feasibility measures. Data from the ePRO system were extracted to provide information regarding tool usage. Results Six providers and 11 patients participated in the study; 3 patients dropped out mainly owing to health issues. The remaining 8 patients completed 210 monitoring protocols, equal to over 1300 questions, with patients often answering questions daily. Providers and patients

  1. Lung ultrasound as a diagnostic tool for radiographically-confirmed pneumonia in low resource settings.

    Science.gov (United States)

    Ellington, Laura E; Gilman, Robert H; Chavez, Miguel A; Pervaiz, Farhan; Marin-Concha, Julio; Compen-Chang, Patricia; Riedel, Stefan; Rodriguez, Shalim J; Gaydos, Charlotte; Hardick, Justin; Tielsch, James M; Steinhoff, Mark; Benson, Jane; May, Evelyn A; Figueroa-Quintanilla, Dante; Checkley, William

    2017-07-01

    Pneumonia is a leading cause of morbidity and mortality in children worldwide; however, its diagnosis can be challenging, especially in settings where skilled clinicians or standard imaging are unavailable. We sought to determine the diagnostic accuracy of lung ultrasound when compared to radiographically-confirmed clinical pediatric pneumonia. Between January 2012 and September 2013, we consecutively enrolled children aged 2-59 months with primary respiratory complaints at the outpatient clinics, emergency department, and inpatient wards of the Instituto Nacional de Salud del Niño in Lima, Peru. All participants underwent clinical evaluation by a pediatrician and lung ultrasonography by one of three general practitioners. We also consecutively enrolled children without respiratory symptoms. Children with respiratory symptoms had a chest radiograph. We obtained ancillary laboratory testing in a subset. Final clinical diagnoses included 453 children with pneumonia, 133 with asthma, 103 with bronchiolitis, and 143 with upper respiratory infections. In total, CXR confirmed the diagnosis in 191 (42%) of 453 children with clinical pneumonia. A consolidation on lung ultrasound, which was our primary endpoint for pneumonia, had a sensitivity of 88.5%, a specificity of 100%, and an area under the curve of 0.94 (95% CI 0.92-0.97) when compared to radiographically-confirmed clinical pneumonia. When any abnormality on lung ultrasound was compared to radiographically-confirmed clinical pneumonia, the sensitivity increased to 92.2% and the specificity decreased to 95.2%, with an area under the curve of 0.94 (95% CI 0.91-0.96). Lung ultrasound had high diagnostic accuracy for the diagnosis of radiographically-confirmed pneumonia. Added benefits of lung ultrasound include rapid testing and high inter-rater agreement. Lung ultrasound may serve as an alternative tool for the diagnosis of pediatric pneumonia. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  2. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  3. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    Science.gov (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite T4M also includes a script that generates Metatool input, CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  4. Email-Set Visualization: Facilitating Re-Finding in Email Archives

    OpenAIRE

    Gorton, Douglas; Murthy, Uma; Vemuri, Naga Srinivas; Pérez-Quiñones, Manuel A.

    2007-01-01

    In this paper we describe ESVT – EmailSet Visualization Tool, an email archive tool that provides users a visualization to re-find and discover information in their email archive. ESVT is an end-to-end email archive tool that can be used from archiving a user’s email messages to visualizing queries on the email archive. We address email archiving by allowing import of email messages from an email server or from a standard existing email client. The central idea in ESVT’s visualization, an “em...

  5. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

    Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessment to be made without the expenses associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in the more than 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once out of the reactor. This system retrieves samples from the tool, then dries, weighs and places them in labelled vials, which are then directed into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  6. HDF-EOS Dump Tools

    Science.gov (United States)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed for dumping HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input.
    - HDF-EOS Metadata Dumper (metadmp): extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to standard output, without processing the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata).
    - HDF-EOS Contents Dumper (heosls): displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects.
    - HDF-EOS ASCII Dumper (asciidmp): extracts fields from EOS data granules into plain ASCII text. The output should be easily human readable; with minor editing, it can be made ingestible by any application with ASCII import capabilities.
    - HDF-EOS Binary Dumper (bindmp): dumps HDF-EOS objects in binary format. This is useful for feeding the output into an existing program that does not understand HDF, for example custom software and COTS products.
    - HDF-EOS User Friendly Metadata (UFM): useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display using a web browser.
    - HDF-EOS METCHECK (METCHECK): can be invoked from either a Unix or DOS environment with a set of command line options that a user might use to direct the tool's inputs and output. METCHECK validates the inventory metadata in (.met file) using The

  7. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cătălin NAE

    2010-03-01

    This paper is dedicated to a very specific type of analysis tool (VLM - Vortex Lattice Method) to be integrated in an IDS - Integrated Design System, tailored for use by the small aircraft industry. The major interest is the possibility to simulate, at very low computational cost, a preliminary set of basic global aerodynamic characteristics (lift, drag, pitching moment) and the aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This work enables fast investigation of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM method, the reliability of the analysis may be increased, so that a first (iteration zero) aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is the basic-state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility to integrate it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (IP, EU FP6).

  8. Teleconferencing in medical education: a useful tool.

    Science.gov (United States)

    Lamba, Pankaj

    2011-01-01

    Education and healthcare are basic needs for human development. Technological innovation has broadened the access to higher quality healthcare and education without regard to time, distance or geopolitical boundaries. Distance learning has gained popularity as a means of learning in recent years due to widely distributed learners, busy schedules and rising travel costs. Teleconferencing is also a very useful tool as a distance learning method. Teleconferencing is a real-time and live interactive programme in which one set of participants are at one or more locations and the other set of participants are at another. The teleconference allows for interaction, including audio and/or video, and possibly other modalities, between at least two sites. Various methods are available for setting up a teleconferencing unit. A detailed review of the trend in the use of teleconferencing in medical education was conducted using Medline and a literature search. Teleconferencing was found to be a very useful tool in continuing medical education (CME), postgraduate medical education, undergraduate medical education, telementoring and many other situations. The use of teleconferencing in medical education has many advantages including savings in terms of travel costs and time. It gives access to the best educational resources and experience without any limitations of boundaries of distance and time. It encourages two-way interactions and facilitates learning in adults. Despite having some pitfalls in its implementation it is now being seen as an important tool in facilitating learning in medicine and many medical schools and institutions are adapting this novel tool.

  9. Tools for the functional interpretation of metabolomic experiments.

    Science.gov (United States)

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family and deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow characterizing larger sets of metabolites, automatic methods for analyzing these sets in order to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers performing metabolomic experiments in the near future.
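
    As a concrete instance of the annotation enrichment analysis mentioned above, the sketch below computes the standard hypergeometric over-representation p-value for a hypothetical metabolite set; all counts are invented.

      # Over-representation of a hypothetical pathway among changed
      # metabolites, via the hypergeometric tail. All counts are invented.
      from scipy.stats import hypergeom

      M = 2000   # metabolites in the background universe
      K = 60     # background metabolites annotated to the pathway
      n = 50     # metabolites found changed in the experiment
      k = 9      # changed metabolites carrying the pathway annotation

      # P(X >= k): chance of at least k annotated hits under random draws.
      p_value = hypergeom.sf(k - 1, M, K, n)
      print(f"enrichment p-value: {p_value:.2e}")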

  10. Developing e-marketing tools : Case company: CASTA Ltd.

    OpenAIRE

    Nguyen, Chi

    2014-01-01

    This Bachelor’s thesis develops e-marketing tools for the B2C sector of CASTA Ltd. The final outcome is a set of online marketing tool guidelines that can improve business activities, especially marketing effectiveness. Given the company’s status as a novice in the online marketing field, the thesis focuses on the basic level of three specific online marketing tools, instead of covering the whole e-marketing subject. The theoretical framework first describes the concept of e...

  11. X-ray EM simulation tool for ptychography dataset construction

    NARCIS (Netherlands)

    Stoevelaar, L.P.; Gerini, Giampiero

    2018-01-01

    In this paper, we present an electromagnetic full-wave modeling framework as a supporting EM tool that provides data sets for X-ray ptychographic imaging. Modeling the entire scattering problem with Finite Element Method (FEM) tools is, in fact, a prohibitive task, because of the large area illuminated by

  12. More on neutrosophic soft rough sets and its modification

    Directory of Open Access Journals (Sweden)

    Emad Marei

    2015-12-01

    This paper aims to introduce and discuss a new mathematical tool for dealing with uncertainties, which is a combination of neutrosophic sets, soft sets and rough sets, namely the neutrosophic soft rough set model. Its modification is also introduced. Some of their properties are studied and supported with proven propositions and many counterexamples. Some rough relations are redefined as neutrosophic soft rough relations. Comparisons among the traditional rough set model, the suggested neutrosophic soft rough model, and its modification, using their properties and accuracy measures, are introduced. Finally, we illustrate that the classical rough set model can be viewed as a special case of the models suggested in this paper.

  13. PhysarumSoft: An update based on rough set theory

    Science.gov (United States)

    Schumann, Andrew; Pancerz, Krzysztof

    2017-07-01

    PhysarumSoft is a software tool consisting of two modules developed for programming Physarum machines and simulating Physarum games, respectively. The paper briefly discusses what has been added since the last version released in 2015. New elements in both modules are based on rough set theory. Rough sets are used to model behaviour of Physarum machines and to describe strategy games.

  14. Investigating the effect of paralogs on microarray gene-set analysis

    LENUS (Irish Health Repository)

    Faure, Andre J

    2011-01-24

    Background In order to interpret the results obtained from a microarray experiment, researchers often shift focus from analysis of individual differentially expressed genes to analyses of sets of genes. These gene-set analysis (GSA) methods use previously accumulated biological knowledge to group genes into sets and then aim to rank these gene sets in a way that reflects their relative importance in the experimental situation in question. We suspect that the presence of paralogs affects the ability of GSA methods to accurately identify the most important sets of genes for subsequent research. Results We show that paralogs, which typically have high sequence identity and similar molecular functions, also exhibit high correlation in their expression patterns. We investigate this correlation as a potential confounding factor common to current GSA methods using Indygene (http://www.cbio.uct.ac.za/indygene), a web tool that reduces a supplied list of genes so that it includes no pairwise paralogy relationships above a specified sequence similarity threshold. We use the tool to reanalyse previously published microarray datasets and determine the potential utility of accounting for the presence of paralogs. Conclusions The Indygene tool efficiently removes paralogy relationships from a given dataset and we found that such a reduction, performed prior to GSA, has the ability to generate significantly different results that often represent novel and plausible biological hypotheses. This was demonstrated for three different GSA approaches when applied to the reanalysis of previously published microarray datasets and suggests that the redundancy and non-independence of paralogs is an important consideration when dealing with GSA methodologies.
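
    The reduction described above can be pictured with a small greedy sketch: repeatedly drop the gene with the most above-threshold paralogy links until no offending pair remains. This illustrates the idea only, not Indygene's actual algorithm, and the similarity values are invented.

      # Greedy sketch of threshold-based paralog reduction (illustrative only).
      def reduce_paralogs(genes, similarity, threshold=0.7):
          """similarity: dict mapping frozenset({a, b}) -> pairwise identity."""
          kept = list(genes)
          while True:
              # For each gene, count retained partners above the threshold.
              counts = {g: sum(1 for h in kept if g != h and
                               similarity.get(frozenset((g, h)), 0.0) > threshold)
                        for g in kept}
              worst = max(kept, key=lambda g: counts[g])
              if counts[worst] == 0:
                  return kept            # no offending pair remains
              kept.remove(worst)         # drop the most connected gene first

      genes = ["g1", "g2", "g3", "g4"]
      sim = {frozenset(("g1", "g2")): 0.92,   # g1/g2 are close paralogs
             frozenset(("g1", "g3")): 0.75,
             frozenset(("g2", "g4")): 0.40}
      print(reduce_paralogs(genes, sim))      # -> ['g2', 'g3', 'g4']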

  15. Comparative Investigation on Tool Wear during End Milling of AISI H13 Steel with Different Tool Path Strategies

    Science.gov (United States)

    Adesta, Erry Yulian T.; Riza, Muhammad; Avicena

    2018-03-01

    Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and for optimization of cutting conditions. This paper aims to investigate the effect of two tool path strategies, contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were set to vary. In an experiment with three factors at three levels, a Response Surface Method (RSM) design of experiment with a standard called Central Composite Design (CCD) was employed. Results obtained indicate that tool wear increases significantly at the higher range of feed per tooth compared to cutting speed and depth of cut. This experimental result is then supported statistically by developing an empirical model. The prediction model for the response variable of tool wear for the contour-in strategy developed in this research shows good agreement with the experimental work.
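
    To make the RSM analysis concrete, the sketch below fits a full second-order response surface for tool wear in cutting speed, feed per tooth and depth of cut by ordinary least squares; the data are synthetic stand-ins, not the paper's measurements.

      # Hedged sketch of the response-surface idea: fit a second-order
      # polynomial for tool wear in cutting speed, feed per tooth and depth
      # of cut. All data values are synthetic, not the paper's measurements.
      import numpy as np

      rng = np.random.default_rng(1)
      v  = rng.uniform(100, 200, 20)    # cutting speed, m/min
      fz = rng.uniform(0.05, 0.15, 20)  # feed per tooth, mm
      ap = rng.uniform(0.5, 1.5, 20)    # depth of cut, mm
      wear = 0.02 + 0.9 * fz + 0.0004 * v + 0.01 * ap + rng.normal(0, 0.005, 20)

      # Full quadratic design matrix: intercept, linear, interaction, squared.
      X = np.column_stack([np.ones_like(v), v, fz, ap,
                           v * fz, v * ap, fz * ap,
                           v ** 2, fz ** 2, ap ** 2])
      coef, *_ = np.linalg.lstsq(X, wear, rcond=None)
      pred = X @ coef
      r2 = 1 - np.sum((wear - pred) ** 2) / np.sum((wear - wear.mean()) ** 2)
      print(f"R^2 of the fitted response surface: {r2:.3f}")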

  16. Clinical code set engineering for reusing EHR data for research: A review.

    Science.gov (United States)

    Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels

    2017-06-01

    The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets which needs to be addressed to ensure the reliability and credibility of studies which use code sets. To review methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research. This was supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management was reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  17. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that use model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
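
    The model-checking idea can be shown in miniature: encode each norm as a predicate over finite behaviour traces and search for a trace satisfying all of them, reporting an incoherence when none exists. The norms below are invented, and the brute-force search stands in for FormaLex's LTL machinery.

      # Toy coherence check (not FormaLex itself): each norm is a predicate
      # over finite traces of actions; if no trace satisfies every norm,
      # the set of norms is incoherent. Actions and norms are invented.
      from itertools import product

      ACTIONS = ["report", "idle"]

      def always_eventually_report(trace):   # "one must always eventually report"
          return all("report" in trace[i:] for i in range(len(trace)))

      def never_report(trace):               # "reporting is forbidden"
          return "report" not in trace

      norms = [always_eventually_report, never_report]
      witnesses = [t for t in product(ACTIONS, repeat=4)
                   if all(norm(t) for norm in norms)]
      print("coherent" if witnesses else "incoherent set of norms")
      # -> incoherent set of norms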

  18. Priority setting for health in emerging markets.

    Science.gov (United States)

    Glassman, Amanda; Giedion, Ursula; McQueston, Kate

    2013-05-01

    The use of health technology assessment research in emerging economies is becoming an increasingly important tool to determine the uses of health spending. As low- and middle-income countries' gross domestic product grows, the funding available for health has increased in tandem. There is growing evidence that comparative effectiveness research and cost-effectiveness can be used to improve health outcomes within a predefined financial space. The use of these evaluation tools, combined with a systematized process of priority setting, can help inform national and global health payers. This review of country institutions for health technology assessment illustrates two points: the efforts underway to use research to inform priorities are widespread and not confined to wealthier countries; and many countries' efforts to create evidence-based policy are incomplete and more country-specific research will be needed. Further evidence shows that there is scope to reduce these gaps and opportunity to support better incorporation of data through better-defined priority-setting processes.

  19. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and inter-institutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
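
    A compressed sketch of that filtering pipeline on synthetic dose/outcome data, using SciPy in place of the paper's C#.Net/R combination: a ROC-derived threshold (Youden index) followed by Fisher exact, Welch t-, and Kolmogorov-Smirnov tests.

      # Sketch of the filtering pipeline on synthetic dose/outcome data,
      # using SciPy rather than the paper's C#.Net/R implementation.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      dose = np.concatenate([rng.normal(20, 5, 60), rng.normal(32, 5, 40)])
      toxic = np.concatenate([np.zeros(60, dtype=bool), np.ones(40, dtype=bool)])

      # ROC scan: pick the threshold maximizing Youden's J = sens + spec - 1.
      youden = [((dose[toxic] >= t).mean() + (dose[~toxic] < t).mean() - 1, t)
                for t in np.unique(dose)]
      _, threshold = max(youden)

      high = dose >= threshold
      table = [[np.sum(high & toxic), np.sum(high & ~toxic)],
               [np.sum(~high & toxic), np.sum(~high & ~toxic)]]
      _, p_fisher = stats.fisher_exact(table)          # contingency table test
      _, p_welch = stats.ttest_ind(dose[toxic], dose[~toxic], equal_var=False)
      _, p_ks = stats.ks_2samp(dose[toxic], dose[~toxic])
      print(f"threshold={threshold:.1f}  Fisher p={p_fisher:.1e}  "
            f"Welch p={p_welch:.1e}  KS p={p_ks:.1e}")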

  20. An Independent Filter for Gene Set Testing Based on Spectral Enrichment

    NARCIS (Netherlands)

    Frost, H Robert; Li, Zhigang; Asselbergs, Folkert W; Moore, Jason H

    2015-01-01

    Gene set testing has become an indispensable tool for the analysis of high-dimensional genomic data. An important motivation for testing gene sets, rather than individual genomic variables, is to improve statistical power by reducing the number of tested hypotheses. Given the dramatic growth in

  1. Multidimensional Ranking: A New Transparency Tool for Higher Education and Research

    Science.gov (United States)

    van Vught, Frans; Westerheijden, Don F.

    2010-01-01

    This paper sets out to analyse the need for better "transparency tools" which inform university stakeholders about the quality of universities. First, we give an overview of what we understand by the concept of transparency tools and those that are currently available. We then critique current transparency tools' methodologies, looking in detail…

  2. Rough Standard Neutrosophic Sets: An Application on Standard Neutrosophic Information Systems

    Directory of Open Access Journals (Sweden)

    Nguyen Xuan Thao

    2016-12-01

    A rough fuzzy set is the result of approximating a fuzzy set with respect to a crisp approximation space. It is a mathematical tool for knowledge discovery in fuzzy information systems. In this paper, we introduce the concepts of rough standard neutrosophic sets and the standard neutrosophic information system, and give some results on knowledge discovery in standard neutrosophic information systems based on rough standard neutrosophic sets.
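
    The crisp rough-set machinery underlying these models is compact enough to show directly: lower and upper approximations of a target set with respect to the partition induced by an indiscernibility relation. The universe and classes below are invented.

      # Lower/upper approximations of a target set X with respect to the
      # equivalence classes of an indiscernibility relation (invented data).
      def approximations(classes, target):
          lower = {x for c in classes if c <= target for x in c}
          upper = {x for c in classes if c & target for x in c}
          return lower, upper

      classes = [{1, 2}, {3}, {4, 5}, {6}]   # partition of the universe
      X = {1, 2, 3, 4}                       # set to approximate

      lower, upper = approximations(classes, X)
      print("lower:", lower)                 # {1, 2, 3}  -- certainly in X
      print("upper:", upper)                 # {1, 2, 3, 4, 5}  -- possibly in X
      print("accuracy:", len(lower) / len(upper))   # rough accuracy, 0.6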

  3. A Web-Based Validation Tool for GEWEX

    Science.gov (United States)

    Smith, R. A.; Gibson, S.; Heckert, E.; Minnis, P.; Sun-Mack, S.; Chen, Y.; Stubenrauch, C.; Kinne, S. A.; Ackerman, S. A.; Baum, B. A.; Chepfer, H.; Di Girolamo, L.; Heidinger, A. K.; Getzewich, B. J.; Guignard, A.; Maddux, B. C.; Menzel, W. P.; Platnick, S. E.; Poulsen, C.; Raschke, E. A.; Riedi, J.; Rossow, W. B.; Sayer, A. M.; Walther, A.; Winker, D. M.

    2011-12-01

    The Global Energy and Water Cycle Experiment (GEWEX) Cloud assessment was initiated by the GEWEX Radiation Panel (GRP) in 2005 to evaluate the variability of available, global, long-term cloud data products. Since then, eleven cloud data records have been established from various instruments, mostly onboard polar orbiting satellites. Cloud properties under study include cloud amount, cloud pressure, cloud temperature, cloud infrared (IR) emissivity and visible (VIS) optical thickness, cloud thermodynamic phase, as well as bulk microphysical properties. The volume of data and variations in parameters, spatial, and temporal resolution for the different datasets constitute a significant challenge for understanding the differences and the value of having more than one dataset. To address this issue, this paper presents a NASA Langley web-based tool to facilitate comparisons among the different cloud data sets. With this tool, the operator can choose to view numeric or graphic presentations to allow comparison between products. Multiple records are displayed in time series graphs, global maps, or zonal plots. The tool has been made flexible so that additional teams can easily add their data sets to the record selection list for use in their own analyses. This tool has possible applications to other climate and weather datasets.

  4. Framing quality improvement tools and techniques in healthcare: the case of improvement leaders' guides.

    Science.gov (United States)

    Millar, Ross

    2013-01-01

    The purpose of this paper is to present a study of how quality improvement tools and techniques are framed within healthcare settings. The paper employs an interpretive approach to understand how quality improvement tools and techniques are mobilised and legitimated. It does so using a case study of the NHS Modernisation Agency Improvement Leaders' Guides in England. Improvement Leaders' Guides were framed within a service improvement approach encouraging the use of quality improvement tools and techniques within healthcare settings. Their use formed part of enacting tools and techniques across different contexts. Whilst this enactment was believed to support the mobilisation of tools and techniques, the experience also illustrated the challenges in distributing such approaches. The paper provides an important contribution in furthering our understanding of framing the "social act" of quality improvement. Given the ongoing emphasis on quality improvement in health systems and the persistent challenges involved, it also provides important information for healthcare leaders globally in seeking to develop, implement or modify similar tools and distribute leadership within health and social care settings.

  5. User manual for storage simulation construction set

    International Nuclear Information System (INIS)

    Sehgal, Anil; Volz, Richard A.

    1999-01-01

    The Storage Simulation Construction Set (SSCS) is a tool for composing storage system models using Telegrip. It is an application written in C++ and Motif. With this system, models of a storage system can be composed rapidly and accurately. The aspects of the SSCS are described within this report.

  6. A MORET tool to assist code bias estimation

    International Nuclear Information System (INIS)

    Fernex, F.; Richet, Y.; Letang, E.

    2003-01-01

    This new Graphical User Interface (GUI), developed in Java, is one of the post-processing tools for the MORET4 code. It aims to help users estimate the importance of the k-eff bias due to the code, in order to better define the upper safety limit. Moreover, it allows visualizing the distance between an actual configuration case and evaluated critical experiments. This tool depends on a database of validated experiments, on sets of physical parameters, and on various statistical tools allowing interpolation of the calculation bias over the database or display of the projections of experiments onto a reduced base of parameters. The development of this tool is still in progress. (author)

  7. LHCb online infrastructure monitoring tools

    International Nuclear Information System (INIS)

    Granado Cardoso, L.; Gaspar, C.; Haen, C.; Neufeld, N.; Varela, F.; Galli, D.

    2012-01-01

    The Online System of the LHCb experiment at CERN is composed of a very large number of PCs: around 1500 in a CPU farm for performing the High Level Trigger; around 170 for the control system, running the SCADA system - PVSS; and several others for performing data monitoring, reconstruction, storage, and infrastructure tasks, like databases, etc. Some PCs run Linux, some run Windows but all of them need to be remotely controlled and monitored to make sure they are correctly running and to be able, for example, to reboot them whenever necessary. A set of tools was developed in order to centrally monitor the status of all PCs and PVSS Projects needed to run the experiment: a Farm Monitoring and Control (FMC) tool, which provides the lower level access to the PCs, and a System Overview Tool (developed within the Joint Controls Project - JCOP), which provides a centralized interface to the FMC tool and adds PVSS project monitoring and control. The implementation of these tools has provided a reliable and efficient way to manage the system, both during normal operations as well as during shutdowns, upgrades or maintenance operations. This paper will present the particular implementation of this tool in the LHCb experiment and the benefits of its usage in a large scale heterogeneous system

  8. GIS learning tool for world's largest earthquakes and their causes

    Science.gov (United States)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and two most predictable earthquake locations in the world and their plate tectonic settings. This is a geography-based interactive tool that can be used for learning about the causes of great earthquakes in the past and the safest places on the earth at which to avoid their direct effects. This approach provides an effective way of learning for students, as it is very user friendly and more aligned to the interests of the younger generation. In this tool the user can click on various points located on the world map, each of which opens a picture and a link to the webpage for that point, showing detailed information on the earthquake history of that place, including the magnitudes and years of past quakes and the plate tectonic settings that made the place earthquake prone. Apart from learning earthquake-related information, students will also be able to customize the tool to suit their needs or interests. Students will be able to add/remove layers, measure the distance between any two points on the map, select any place on the map and learn more about it, create a layer from this set to do a detailed analysis, run a query, change display settings, etc. At the end, the user is taken through earthquake safety guidelines for staying safe during an earthquake. The tool uses Java as its programming language together with Map Objects Java Edition (MOJO) provided by ESRI. It was developed for educational purposes, and hence its interface has been kept simple and easy to use so that students can gain maximum knowledge through it instead of having a hard time installing it. There are many details to explore, which can show what a GIS-based tool is capable of. The only thing needed to run this tool is the latest Java edition installed on the user's machine. This approach makes study more fun and

  9. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate

  10. ASCI visualization tool evaluation, Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, P. [ed.] [Sandia National Labs., Livermore, CA (United States). Center for Computational Engineering

    1997-04-01

    The charter of the ASCI Visualization Common Tools subgroup was to investigate and evaluate 3D scientific visualization tools. As part of that effort, a Tri-Lab evaluation effort was launched in February of 1996. The first step was to agree on a thoroughly documented list of 32 features against which all tool candidates would be evaluated. These evaluation criteria were both gleaned from a user survey and determined from informed extrapolation into the future, particularly as concerns the 3D nature and extremely large size of ASCI data sets. The second step was to winnow a field of 41 candidate tools down to 11. The selection principle was to be as inclusive as practical, retaining every tool that seemed to hold any promise of fulfilling all of ASCI's visualization needs. These 11 tools were then closely investigated by volunteer evaluators distributed across LANL, LLNL, and SNL. This report contains the results of those evaluations, as well as a discussion of the evaluation philosophy and criteria.

  11. Mining Hierarchies and Similarity Clusters from Value Set Repositories.

    Science.gov (United States)

    Peterson, Kevin J; Jiang, Guoqian; Brue, Scott M; Shen, Feichen; Liu, Hongfang

    2017-01-01

    A value set is a collection of permissible values used to describe a specific conceptual domain for a given purpose. By helping to establish a shared semantic understanding across use cases, these artifacts are important enablers of interoperability and data standardization. As the size of repositories cataloging these value sets expand, knowledge management challenges become more pronounced. Specifically, discovering value sets applicable to a given use case may be challenging in a large repository. In this study, we describe methods to extract implicit relationships between value sets, and utilize these relationships to overlay organizational structure onto value set repositories. We successfully extract two different structurings, hierarchy and clustering, and show how tooling can leverage these structures to enable more effective value set discovery.
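
    The two structurings described above can be sketched on toy data: proper containment between value sets yields hierarchy edges, while Jaccard similarity supplies a clustering signal. The value set names and codes below are invented.

      # Containment gives hierarchy edges; Jaccard similarity drives
      # clustering. Value set names and codes are invented.
      value_sets = {
          "diabetes_all":    {"E10", "E11", "E13"},
          "diabetes_type2":  {"E11"},
          "hypertension":    {"I10", "I11"},
          "cardiometabolic": {"E10", "E11", "E13", "I10", "I11"},
      }

      def jaccard(a, b):
          return len(a & b) / len(a | b)

      # Hierarchy: an edge parent -> child whenever child is a proper subset.
      edges = [(p, c) for p in value_sets for c in value_sets
               if p != c and value_sets[c] < value_sets[p]]
      print("hierarchy edges:", edges)

      # Similarity clusters: pairs whose Jaccard index clears a threshold.
      pairs = [(a, b, round(jaccard(value_sets[a], value_sets[b]), 2))
               for a in value_sets for b in value_sets if a < b]
      print("similar pairs:", [p for p in pairs if p[2] >= 0.5])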

  12. Keeping you safe by making machine tools safe

    CERN Multimedia

    2012-01-01

    CERN’s third safety objective for 2012 concerns the safety of equipment - and machine tools in particular. There are three prerequisites for ensuring that a machine tool can be used safely:
    · the machine tool must comply with Directive 2009/104/EC,
    · the layout of the workshop must be compliant, and
    · everyone who uses the machine tool must be trained.
    Provided these conditions are met, the workshop head can grant authorisation to use the machine tool. To fulfil this objective, an inventory of the machine tools must be drawn up and the people responsible for them identified. The HSE Unit's Safety Inspection Service produces compliance reports for the machine tools. In order to meet the third objective set by the Director-General, the section has doubled its capacity to carry out inspections: ...

  13. Novel Semi-Direct OH Reactivity (kOH) Measurements by Chemical Ionization Mass Spectrometry during a Chamber Instrument Comparison Campaign and Continuous Ambient Air Sampling at a Central European GAW Station

    Science.gov (United States)

    Muller, J.; Kubistin, D.; Elste, T.; Plass-Duelmer, C.; Claude, A.; Englert, J.; Holla, R.; Fuchs, H.; Hofzumahaus, A.; Holland, F.; Novelli, A.; Tillmann, R.; Wegener, R.; Rohrer, F.; Yu, Z.; Bohn, B.; Williams, J.; Pfannerstill, E.; Edtbauer, A.; Kluepfel, T.

    2016-12-01

    Total OH reactivity (kOH) has been recognized as a useful measure to gauge the potential atmospheric oxidation capacity and a few different in-situ measurement techniques have been developed over the last 15 years. Here results are presented from a novel semi-direct method developed by the German Weather Service (DWD) utilizing a chemical ionization mass spectrometer (CIMS). Recently in April 2016, the CIMS system participated in a half-blind kOH instrument comparison campaign at the Forschungszentrum Jülich (FZJ) SAPHIR chamber. Experiments provided controlled conditions with a range of different VOC mixtures and varying NOx levels, representing environments dominated by biogenic or urban emissions. Alongside CIMS, kOH was also measured by systems using the comparative reactivity method (CRM) and the pump-probe technique with OH detection. The intercomparison revealed a good performance of CIMS at lower OH reactivities (0-15 s-1), a range for which the instrumental set up was optimized. Limitations of the CIMS system consist of an upper limit for kOH detection and the need for applying a chemical correction function as a result of instrument-internal HOx recycling. Findings and instrument parameters obtained from the FZJ SAPHIR campaign and flow tube experiments are then applied to ambient air kOH measurements at the Meteorological Observatory Hohenpeissenberg (MOHp), Germany. The CIMS instrument is used there for long-term measurements of OH, H2SO4, ROx and kOH. Here, we show ambient air kOH measurements, interpreted in conjunction with volatile organic compounds (VOC) and inorganic trace gases also measured at the GAW station Hohenpeissenberg. These observations provide a unique dataset to investigate turnover rates and seasonal cycles of reactive trace gases, i.e. sources that make up total OH reactivity in this central European, rural setting.

  14. A browser-based 3D Visualization Tool designed for comparing CERES/CALIOP/CloudSAT level-2 data sets.

    Science.gov (United States)

    Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Doelling, D. R.

    2017-12-01

    At NASA Langley, Clouds and the Earth's Radiant Energy System (CERES) and Moderate Resolution Imaging Spectroradiometer (MODIS) data are merged with data from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and from the CloudSat Cloud Profiling Radar (CPR). The CERES merged product (C3M) matches up to three CALIPSO footprints with each MODIS pixel along its ground track. It then assigns the nearest CloudSat footprint to each of those MODIS pixels. The cloud properties from MODIS, retrieved using the CERES algorithms, are included in C3M with the matched CALIPSO and CloudSat products, along with radiances from 18 MODIS channels. The dataset is used to validate the CERES-retrieved MODIS cloud properties and the computed TOA and surface flux difference obtained using MODIS or CALIOP/CloudSat retrieved clouds. This information is then used to tune the computed fluxes to match the CERES observed TOA flux. A visualization tool will be invaluable for determining the cause of these large cloud and flux differences in order to improve the methodology. This effort is part of a larger effort to allow users to order the CERES C3M product subsetted by time and parameter, as well as the previously mentioned visualization capabilities. This presentation will show a new graphical 3D interface, 3D-CERESVis, that allows users to view both passive remote sensing satellites (MODIS and CERES) and active satellites (CALIPSO and CloudSat), such that the detailed vertical structures of cloud properties from CALIPSO and CloudSat are displayed side by side with horizontally retrieved cloud properties from MODIS and CERES. Similarly, the CERES computed profile fluxes, whether using MODIS or CALIPSO and CloudSat clouds, can also be compared. 3D-CERESVis is a browser-based visualization tool that makes use of techniques such as multiple synchronized cursors, COLLADA-format data and Cesium.

  15. PIIKA 2: an expanded, web-based platform for analysis of kinome microarray data.

    Directory of Open Access Journals (Sweden)

    Brett Trost

    Kinome microarrays are comprised of peptides that act as phosphorylation targets for protein kinases. This platform is growing in popularity due to its ability to measure phosphorylation-mediated cellular signaling in a high-throughput manner. While software for analyzing data from DNA microarrays has also been used for kinome arrays, differences between the two technologies and associated biologies previously led us to develop Platform for Intelligent, Integrated Kinome Analysis (PIIKA), a software tool customized for the analysis of data from kinome arrays. Here, we report the development of PIIKA 2, a significantly improved version with new features and improvements in the areas of clustering, statistical analysis, and data visualization. Among other additions to the original PIIKA, PIIKA 2 now allows the user to: evaluate statistically how well groups of samples cluster together; identify sets of peptides that have consistent phosphorylation patterns among groups of samples; perform hierarchical clustering analysis with bootstrapping; view false negative probabilities and positive and negative predictive values for t-tests between pairs of samples; easily assess experimental reproducibility; and visualize the data using volcano plots, scatterplots, and interactive three-dimensional principal component analyses. Also new in PIIKA 2 is a web-based interface, which allows users unfamiliar with command-line tools to easily provide input and download the results. Collectively, the additions and improvements described here enhance both the breadth and depth of analyses available, simplify the user interface, and make the software an even more valuable tool for the analysis of kinome microarray data. Both the web-based and stand-alone versions of PIIKA 2 can be accessed via http://saphire.usask.ca.

  16. PIIKA 2: an expanded, web-based platform for analysis of kinome microarray data.

    Science.gov (United States)

    Trost, Brett; Kindrachuk, Jason; Määttänen, Pekka; Napper, Scott; Kusalik, Anthony

    2013-01-01

    Kinome microarrays are comprised of peptides that act as phosphorylation targets for protein kinases. This platform is growing in popularity due to its ability to measure phosphorylation-mediated cellular signaling in a high-throughput manner. While software for analyzing data from DNA microarrays has also been used for kinome arrays, differences between the two technologies and associated biologies previously led us to develop Platform for Intelligent, Integrated Kinome Analysis (PIIKA), a software tool customized for the analysis of data from kinome arrays. Here, we report the development of PIIKA 2, a significantly improved version with new features and improvements in the areas of clustering, statistical analysis, and data visualization. Among other additions to the original PIIKA, PIIKA 2 now allows the user to: evaluate statistically how well groups of samples cluster together; identify sets of peptides that have consistent phosphorylation patterns among groups of samples; perform hierarchical clustering analysis with bootstrapping; view false negative probabilities and positive and negative predictive values for t-tests between pairs of samples; easily assess experimental reproducibility; and visualize the data using volcano plots, scatterplots, and interactive three-dimensional principal component analyses. Also new in PIIKA 2 is a web-based interface, which allows users unfamiliar with command-line tools to easily provide input and download the results. Collectively, the additions and improvements described here enhance both the breadth and depth of analyses available, simplify the user interface, and make the software an even more valuable tool for the analysis of kinome microarray data. Both the web-based and stand-alone versions of PIIKA 2 can be accessed via http://saphire.usask.ca.

  17. FACT: taking a spiritual history in a clinical setting.

    Science.gov (United States)

    Larocca-Pitts, Mark A

    2008-01-01

    Healthcare clinicians need a good tool for taking spiritual histories in a clinical setting. A spiritual history provides important clinical information, and any properly trained clinician can take one. Professionally trained chaplains follow up with more in-depth spiritual assessments if indicated. A spiritual history tool's effectiveness depends on five criteria: brevity, memorability, appropriateness, patient-centeredness, and credibility (Koenig, 2007). The chaplain-developed FACT stands for: F-Faith (and/or Belief); A-Active (and/or Available, Accessible, Applicable); C-Coping (and/or Comfort)/Conflict (and/or Concern); and T-Treatment. FACT compares favorably with, and in some categories improves upon, three physician-developed spiritual history tools: Koenig's (2007) CSI-MEMO, the American College of Physicians' tool (Lo, Quill, & Tulsky, 1999), and Puchalski's and Romer's (2000) FICA.

  18. Polymorphism and Module-Reuse Mechanisms for Algebraic Petri Nets in CoopnTools

    OpenAIRE

    Buffo, Mathieu; Buchs, Didier; Donatelli, S.; Kleijn, J.

    1999-01-01

    This paper introduces CoopnTools, a tool set allowing the support of object-oriented specifications written by means of the language CO-OPN/2, based on synchronised algebraic Petri nets. In particular, this paper shows how concrete mechanisms dealing with polymorphism and module-reuse are implemented in CoopnTools.

  19. Evaluation of Modern Tools of Controlling at the Enterprise

    Directory of Open Access Journals (Sweden)

    Prokopenko Olga V.

    2016-11-01

    Full Text Available Theoretical and practical aspects of the use of tools for operational and strategic controlling at the enterprise are studied. A detailed analysis of modern controlling tools is carried out, with subsequent identification of the operational and strategic controlling tools that are best adapted for application at domestic enterprises. Furthermore, the study highlights the advantages and disadvantages of the proposed tools, so that managers can choose the most effective ones for the implementation of operational and strategic controlling. Managers of each enterprise should form their own set of controlling tools, considering the enterprise's individual characteristics and existing problems. Prospects for further research include the practical application of the recommended tools and further analysis of the results in terms of the effect of introducing specific controlling tools.

  20. Wilmar Planning Tool, VBA documentation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

    This is a documentation of the VBA (Visual Basic for Applications) in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell various control parameters are set, and then a macro in the Input Database is run that writes input files for the Joint Market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  1. Wilmar Planning Tool, VBA documentation

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

    This is a documentation of the VBA (Visual Basic for Applications) in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell various control parameters are set, and then a macro in the Input Database is run that writes input files for the Joint Market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  2. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as its host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language gives the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal-independently and is even to a great extent multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  3. GameTeen: new tools for evaluating and training emotional regulation strategies

    OpenAIRE

    Rodriguez Ortega, Alejandro; Rey, Beatriz; Alcañiz Raya, Mariano Luis; BAÑOS, R.; Guixeres Provinciale, Jaime; Wrzesien, Maja; Gómez Martínez, Mario; Pérez Lopez, David Clemente; Rasal, Paloma; Parra Vargas, Elena

    2012-01-01

    The aim of this paper is to describe GameTeen, a novel instrument for the assessment and training of Emotional Regulation (ER) strategies in adolescent population. These new tools are based on the use of 3D serious games that can be played under different settings. The evolution of ER strategies will be monitored in two ways depending on the setting where the tool is presented. Firstly, in the laboratory, physiological signals and facial expressions of participants will be recorded. Secondly,...

  4. Contingency diagrams as teaching tools

    OpenAIRE

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  5. Development of a transportation planning tool

    International Nuclear Information System (INIS)

    Funkhouser, B.R.; Moyer, J.W.; Ballweg, E.L.

    1994-01-01

    This paper describes the application of simulation modeling and logistics techniques to the development of a planning tool for the Department of Energy (DOE). The focus of the Transportation Planning Model (TPM) tool is to aid DOE and Sandia analysts in the planning of future fleet sizes, driver and support personnel sizes, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which will allow analysts to easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM is being developed on personal computers using commercial off-the-shelf (COTS) software tools under the Windows® operating environment. Prototype development of the TPM has been completed

  6. Precision tool holder with flexure-adjustable, three degrees of freedom for a four-axis lathe

    Science.gov (United States)

    Bono, Matthew J [Pleasanton, CA; Hibbard, Robin L [Livermore, CA

    2008-03-04

    A precision tool holder for precisely positioning a single point cutting tool on a 4-axis lathe, such that the center of the radius of the tool nose is aligned with the B-axis of the machine tool, so as to facilitate the machining of precision meso-scale components with complex three-dimensional shapes with sub-µm accuracy on a four-axis lathe. The device is designed to fit on a commercial diamond turning machine and can adjust the cutting tool position in three orthogonal directions with sub-micrometer resolution. In particular, the tool holder adjusts the tool position using three flexure-based mechanisms, with two flexure mechanisms adjusting the lateral position of the tool to align the tool with the B-axis, and a third flexure mechanism adjusting the height of the tool. Preferably, the flexures are driven by manual micrometer adjusters. In this manner, this tool holder simplifies the process of setting a tool with sub-µm accuracy, substantially reducing the time required to set the tool.

  7. Automatic Generation of Validated Specific Epitope Sets

    Directory of Open Access Journals (Sweden)

    Sebastian Carrasco Pro

    2015-01-01

    Full Text Available Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and to assist in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to elucidate a method to select validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to give users the capacity to generate customized epitope sets.

  8. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London.

    Science.gov (United States)

    Lennox, Laura; Doyle, Cathal; Reed, Julie E; Bell, Derek

    2017-09-24

    Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). CLAHRC NWL improvement initiative teams and staff. The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors to identify risks to increase chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with

  9. Analytic webs support the synthesis of ecological data sets.

    Science.gov (United States)

    Ellison, Aaron M; Osterweil, Leon J; Clarke, Lori; Hadley, Julian L; Wise, Alexander; Boose, Emery; Foster, David R; Hanson, Allen; Jensen, David; Kuzeja, Paul; Riseman, Edward; Schultz, Howard

    2006-06-01

    A wide variety of data sets produced by individual investigators are now synthesized to address ecological questions that span a range of spatial and temporal scales. It is important to facilitate such syntheses so that "consumers" of data sets can be confident that both input data sets and synthetic products are reliable. Necessary documentation to ensure the reliability and validation of data sets includes both familiar descriptive metadata and formal documentation of the scientific processes used (i.e., process metadata) to produce usable data sets from collections of raw data. Such documentation is complex and difficult to construct, so it is important to help "producers" create reliable data sets and to facilitate their creation of required metadata. We describe a formal representation, an "analytic web," that aids both producers and consumers of data sets by providing complete and precise definitions of scientific processes used to process raw and derived data sets. The formalisms used to define analytic webs are adaptations of those used in software engineering, and they provide a novel and effective support system for both the synthesis and the validation of ecological data sets. We illustrate the utility of an analytic web as an aid to producing synthetic data sets through a worked example: the synthesis of long-term measurements of whole-ecosystem carbon exchange. Analytic webs are also useful validation aids for consumers because they support the concurrent construction of a complete, Internet-accessible audit trail of the analytic processes used in the synthesis of the data sets. Finally we describe our early efforts to evaluate these ideas through the use of a prototype software tool, SciWalker. We indicate how this tool has been used to create analytic webs tailored to specific data-set synthesis and validation activities, and suggest extensions to it that will support additional forms of validation. The process metadata created by SciWalker is

  10. The tools of mathematical reasoning

    CERN Document Server

    Lakins, Tamara J

    2016-01-01

    This accessible textbook gives beginning undergraduate mathematics students a first exposure to introductory logic, proofs, sets, functions, number theory, relations, finite and infinite sets, and the foundations of analysis. The book provides students with a quick path to writing proofs and a practical collection of tools that they can use in later mathematics courses such as abstract algebra and analysis. The importance of the logical structure of a mathematical statement as a framework for finding a proof of that statement, and the proper use of variables, is an early and consistent theme used throughout the book.

  11. Tools for Local and Distributed Climate Data Access

    Science.gov (United States)

    Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.

    2017-12-01

    Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before it can be published and studied. Like previous efforts, the next CMIP6 will produce an unprecedented volume of data. For an institution, modelling group or modeller, the volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process to give users "one-click" access to a rich suite of Web-based analysis and comparison tools. On the surface, it appears that this collection of tools is well suited to the task, but our experience of the last year taught us that the data volume and distributed storage add a number of challenges in adapting the tools for this task. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots and by leveraging the configurability of the tools to add specific user-defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers to wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. Finally, we will offer some suggestions for added features

  12. Setting UP a decontamination and dismantling (D and D) scenario - methodology and tools developed leopard

    International Nuclear Information System (INIS)

    Pradoura, F.

    2009-01-01

    At the AREVA NC La Hague site, the former nuclear spent fuel reprocessing plant UP2-400 was shut down on December 30, 2003. Since then, the cleaning up and dismantling activities have been carried out by the DV/PRO project, the program management organization set up by AREVA NC for valorization projects. SGN, part of the AREVA NC Engineering Business Unit, operates as the main contractor of the DV/PRO project and provides project management services related to decommissioning and waste management. Hence, SGN is in charge of building D and D scenarios for all the facilities of the UP2-400 plant, in compliance with safety, technical and financial requirements. Main outputs are logic diagrams, block flow diagrams, and waste and effluent throughputs. In order to meet AREVA NC's requirements and expectations, SGN developed specific processes, methods and tools adapted to the scale and complexity of decommissioning a plant with several facilities and different kinds of processes (chemical, mechanical), some of which are in operation while others are being dismantled. Considering the number of technical data and inputs to be managed, this methodology leads to complex outputs such as schedules, throughputs, work packages... The development, maintenance and modification of these outputs become more and more difficult with the complexity and the size of the plant considered. To cope with these issues, the SGN CDE/DEM UP2-400 project team has developed a dedicated tool to assist in elaborating and optimizing D and D scenarios. This tool is named LEOPARD (Logiciel d'Elaboration et d'Optimisation des Programmes d'Assainissement Radiologique et de Demantelement; Software for the Development and Optimization of Radiological Clean-up and Dismantling Programs). The availability of this tool allowed the rapid construction of a test case (demonstrator) that has convinced DV/PRO of its numerous advantages and of its potential for further development. Presentations of LEOPARD

  13. Nursing Minimum Data Set Based on EHR Archetypes Approach.

    Science.gov (United States)

    Spigolon, Dandara N; Moro, Cláudia M C

    2012-01-01

    The establishment of a Nursing Minimum Data Set (NMDS) can facilitate the use of health information systems. Adopting these sets and representing them with archetypes is a way of developing and supporting health systems. The objective of this paper is to describe the definition of a minimum data set for nursing in endometriosis, represented with archetypes. The study was divided into two steps: defining the Nursing Minimum Data Set for endometriosis, and developing the archetypes related to the NMDS. The nursing data set for endometriosis was represented in the form of an archetype, using the whole perception evaluation item, organs and senses. This form of representation is an important tool for semantic interoperability and knowledge representation in health information systems.

  14. Goal setting: an integral component of effective diabetes care.

    Science.gov (United States)

    Miller, Carla K; Bauman, Jennifer

    2014-08-01

    Goal setting is a widely used behavior change tool in diabetes education and training. Prior research found that specific, relatively difficult but attainable goals set within a specific timeframe improved performance in sports and at the workplace. However, the impact of goal setting in diabetes self-care has not received extensive attention. This review examined the mechanisms underlying behavioral change according to goal setting theory and evaluated the impact of goal setting in diabetes intervention studies. Eight studies were identified, which incorporated goal setting as the primary strategy to promote behavioral change in individual, group-based, and primary care settings among patients with type 2 diabetes. Improvements in diabetes-related self-efficacy, dietary intake, physical activity, and A1c were observed in some but not all studies. More systematic research is needed to determine the conditions and behaviors for which goal setting is most effective. Initial recommendations for using goal setting in diabetes patient encounters are offered.

  15. A Text Matching Method to Facilitate the Validation of Frequent Order Sets Obtained Through Data Mining

    OpenAIRE

    Che, Chengjian; Rocha, Roberto A.

    2006-01-01

    In order to compare order sets discovered using a data mining algorithm with existing order sets, we developed an order matching tool based on Oracle Text. The tool includes both automated searching and manual review processes. The comparison between the automated process and the manual review process indicates that the sensitivity of the automated matching is 81% and the specificity is 84%.
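
    Sensitivity and specificity here follow their usual definitions, TP/(TP+FN) and TN/(TN+FP). A trivial sketch with hypothetical counts chosen so the rates reproduce the reported percentages:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen so the rates match the reported 81% / 84%
sens, spec = sensitivity_specificity(tp=81, fn=19, tn=84, fp=16)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```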

  16. Ocular Fundus Photography as an Educational Tool.

    Science.gov (United States)

    Mackay, Devin D; Garza, Philip S

    2015-10-01

    The proficiency of nonophthalmologists with direct ophthalmoscopy is poor, which has prompted a search for alternative technologies to examine the ocular fundus. Although ocular fundus photography has existed for decades, its use has been traditionally restricted to ophthalmology clinical care settings and textbooks. Recent research has shown a role for nonmydriatic fundus photography in nonophthalmic settings, encouraging more widespread adoption of fundus photography technology. Recent studies have also affirmed the role of fundus photography as an adjunct or alternative to direct ophthalmoscopy in undergraduate medical education. In this review, the authors examine the use of ocular fundus photography as an educational tool and suggest future applications for this important technology. Novel applications of fundus photography as an educational tool have the potential to resurrect the dying art of funduscopy.

  17. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
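
    The record's solver derives FDTD update equations in spherical coordinates; the leapfrog structure of any FDTD code is the same, however. A minimal one-dimensional Cartesian sketch in Python (normalized field units, invented grid parameters, and a soft Gaussian source as a crude stand-in for the CW/UWB excitations; not the paper's spherical-coordinate solver):

```python
import numpy as np

# Minimal 1-D FDTD (Yee leapfrog) in free space with normalized fields.
nz, nt = 200, 400          # grid cells, time steps
courant = 0.5              # c0*dt/dz, comfortably below the stability limit

Ex = np.zeros(nz)          # electric field at cell edges
Hy = np.zeros(nz - 1)      # magnetic field at cell centers

for n in range(nt):
    # Update H from the spatial difference (curl) of E
    Hy += courant * (Ex[1:] - Ex[:-1])
    # Update interior E from the spatial difference (curl) of H
    Ex[1:-1] += courant * (Hy[1:] - Hy[:-1])
    # Soft Gaussian pulse source -- a crude stand-in for UWB excitation
    Ex[nz // 2] += np.exp(-((n - 60) / 15.0) ** 2)

print(f"peak |Ex| after {nt} steps: {np.abs(Ex).max():.3f}")
```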

  18. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  19. Tools for signal compression applications to speech and audio coding

    CERN Document Server

    Moreau, Nicolas

    2013-01-01

    This book presents tools and algorithms required to compress/uncompress signals such as speech and music. These algorithms are largely used in mobile phones, DVD players, HDTV sets, etc. In a first rather theoretical part, this book presents the standard tools used in compression systems: scalar and vector quantization, predictive quantization, transform quantization, entropy coding. In particular we show the consistency between these different tools. The second part explains how these tools are used in the latest speech and audio coders. The third part gives Matlab programs simulating t
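
    Among the tools the first part covers, scalar quantization is the easiest to make concrete. A minimal sketch of a uniform scalar quantizer follows (Python rather than the book's Matlab; parameters invented for the example). For a uniformly distributed input, the measured SNR grows by roughly 6 dB per bit, matching the classical rule of thumb:

```python
import numpy as np

def uniform_quantize(x, n_bits, x_max=1.0):
    """Uniform scalar quantizer: snap x to the nearest of 2**n_bits levels."""
    levels = 2 ** n_bits
    step = 2 * x_max / levels
    idx = np.clip(np.floor(x / step) + levels // 2, 0, levels - 1)
    return (idx - levels // 2 + 0.5) * step   # mid-cell reconstruction

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 10_000)            # test signal
xq = uniform_quantize(x, n_bits=4)
snr_db = 10 * np.log10(np.mean(x**2) / np.mean((x - xq) ** 2))
print(f"4-bit SNR = {snr_db:.1f} dB")         # ~6 dB per bit of resolution
```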

  20. Setting clear expectations for safety basis development

    International Nuclear Information System (INIS)

    MORENO, M.R.

    2003-01-01

    DOE-RL has set clear expectations for a cost-effective approach to achieving compliance with the Nuclear Safety Management requirements (10 CFR 830, Nuclear Safety Rule) which will ensure long-term benefit to Hanford. To facilitate implementation of these expectations, tools were developed to streamline and standardize safety analysis and safety document development, resulting in a shorter and more predictable DOE approval cycle. A Hanford Safety Analysis and Risk Assessment Handbook (SARAH) was issued to standardize methodologies for the development of safety analyses. A Microsoft Excel spreadsheet (RADIDOSE) was issued for the evaluation of radiological consequences for accident scenarios often postulated for Hanford. A standard Site Documented Safety Analysis (DSA) detailing the safety management programs was issued for use as a means of compliance with a majority of 3009 Standard chapters. An in-process review was developed between DOE and the Contractor to facilitate DOE approval and provide early course correction. As a result of setting expectations and providing safety analysis tools, the four Hanford Site waste management nuclear facilities were able to integrate into one Master Waste Management Documented Safety Analysis (WM-DSA)

  1. Developing a free and easy to use digital goal setting tool for busy mums

    Directory of Open Access Journals (Sweden)

    Babs Evans

    2015-09-01

    Using data, research and the expertise of commercial and charity partners was an effective way to design a digital product to support behavioural change. By understanding the target audience from the beginning and involving them in the planning stages, the organisations were able to develop a tool the users want with a strong focus on user experience.

  2. Selecting a risk-based tool to aid in decision making

    Energy Technology Data Exchange (ETDEWEB)

    Bendure, A.O.

    1995-03-01

    Selecting a risk-based tool to aid in decision making is as much of a challenge as properly using the tool once it has been selected. Failure to consider customer and stakeholder requirements and the technical bases and differences in risk-based decision making tools will produce confounding and/or politically unacceptable results when the tool is used. Selecting a risk-based decisionmaking tool must therefore be undertaken with the same, if not greater, rigor than the use of the tool once it is selected. This paper presents a process for selecting a risk-based tool appropriate to a set of prioritization or resource allocation tasks, discusses the results of applying the process to four risk-based decision-making tools, and identifies the "musts" for successful selection and implementation of a risk-based tool to aid in decision making.

  3. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the industry of electric connectivity, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. Given the increasing demand for production flexibility, this time-consuming process has to be carried out more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution and exemplarily applied to a progressive tool. First, progressive tools and, more specifically, their setup process are described, and the challenges are pointed out on that basis. As a result, a systematic process for setting up the machines is introduced. Next, the process is investigated with an FE analysis regarding the effects of the disturbances. Design of experiments is then used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary adjustment of the progressive tool under the given disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
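
    The closing steps of the workflow, fitting a regression model to designed experiments and then optimizing the machine parameters against it, can be sketched compactly. The hypothetical fragment below (invented stroke/deviation data; the paper's model spans many punch-bending operations and disturbances) fits a quadratic response model and picks the setting that minimizes the predicted deviation:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical DOE data: stroke adjustment (mm) vs. measured bend-angle
# deviation (deg) at fixed values of the other disturbances.
stroke    = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
deviation = np.array([0.80, 0.35, 0.10, 0.22, 0.61])

# Quadratic regression model of the press behaviour, as DOE would yield
model = np.poly1d(np.polyfit(stroke, deviation, deg=2))

# Pick the stroke setting that minimizes the predicted squared deviation
res = minimize(lambda s: float(model(s[0])) ** 2, x0=[1.0],
               bounds=[(0.0, 2.0)])
print(f"suggested stroke adjustment: {res.x[0]:.2f} mm")
```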

  4. Forest Landscape Assessment Tool (FLAT): rapid assessment for land management

    Science.gov (United States)

    Lisa Ciecko; David Kimmett; Jesse Saunders; Rachael Katz; Kathleen L. Wolf; Oliver Bazinet; Jeffrey Richardson; Weston Brinkley; Dale J. Blahna

    2016-01-01

    The Forest Landscape Assessment Tool (FLAT) is a set of procedures and tools used to rapidly determine forest ecological conditions and potential threats. FLAT enables planners and managers to understand baseline conditions, determine and prioritize restoration needs across a landscape system, and conduct ongoing monitoring to achieve land management goals. The rapid...

  5. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details of the methodology and the main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
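
    At its core, a frequency response measure expresses megawatts of primary response per 0.1 Hz of frequency deviation. A simplified sketch with invented event numbers, reporting only the magnitude (BAL-003-1 prescribes precise pre- and post-event measurement points and sign conventions that are not reproduced here):

```python
def frequency_response_measure(delta_p_mw, delta_f_hz):
    """Simplified frequency response: MW of response per 0.1 Hz deviation.

    delta_p_mw: change in output/interchange across the event;
    delta_f_hz: the corresponding frequency change (negative for an
    under-frequency event). Only the magnitude is reported here.
    """
    return abs(delta_p_mw / (delta_f_hz / 0.1))

# Hypothetical event: frequency settles 0.06 Hz low while the balancing
# authority picks up 90 MW -> 150 MW per 0.1 Hz.
frm = frequency_response_measure(90.0, -0.06)
print(f"FRM = {frm:.0f} MW/0.1 Hz")
```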

  6. Occupational health management: an audit tool.

    Science.gov (United States)

    Shelmerdine, L; Williams, N

    2003-03-01

    Organizations must manage occupational health risks in the workplace and the UK Health & Safety Executive (HSE) has published guidance on successful health and safety management. This paper describes a method of using the published guidance to audit the management of occupational health and safety, first at an organizational level and, secondly, to audit an occupational health service provider's role in the management of health risks. The paper outlines the legal framework in the UK for health risk management and describes the development and use of a tool for qualitative auditing of the efficiency, effectiveness and reliability of occupational health service provision within an organization. The audit tool is presented as a question set and the paper concludes with discussion of the strengths and weaknesses of using this tool, and recommendations on its use.

  7. Patient-Centered Tools for Medication Information Search.

    Science.gov (United States)

    Wilcox, Lauren; Feiner, Steven; Elhadad, Noémie; Vawdrey, David; Tran, Tran H

    2014-05-20

    Recent research focused on online health information seeking highlights a heavy reliance on general-purpose search engines. However, current general-purpose search interfaces do not necessarily provide adequate support for non-experts in identifying suitable sources of health information. Popular search engines have recently introduced search tools in their user interfaces for a range of topics. In this work, we explore how such tools can support non-expert, patient-centered health information search. Scoping the current work to medication-related search, we report on findings from a formative study focused on the design of patient-centered, medication-information search tools. Our study included qualitative interviews with patients, family members, and domain experts, as well as observations of their use of Remedy, a technology probe embodying a set of search tools. Post-operative cardiothoracic surgery patients and their visiting family members used the tools to find information about their hospital medications and were interviewed before and after their use. Domain experts conducted similar search tasks and provided qualitative feedback on their preferences and recommendations for designing these tools. Findings from our study suggest the importance of four valuation principles underlying our tools: credibility, readability, consumer perspective, and topical relevance.

  8. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    Science.gov (United States)

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  9. Magic, science and masculinity: marketing toy chemistry sets.

    Science.gov (United States)

    Al-Gailani, Salim

    2009-12-01

    At least since the late nineteenth century, toy chemistry sets have featured in standard scripts of the achievement of eminence in science, and they remain important in constructions of scientific identity. Using a selection of these toys manufactured in Britain and the United States, and with particular reference to the two dominant American brands, Gilbert and Chemcraft, this paper suggests that early twentieth-century chemistry sets were rooted in overlapping Victorian traditions of entertainment magic and scientific recreations. As chemistry set marketing copy gradually reoriented towards emphasising scientific modernity, citizenship, discipline and educational value, pre-twentieth-century traditions were subsumed within domestic, and specifically masculine, tropes. These developments in branding strategies point to transformations in both users' engagement with their chemistry sets and the role of scientific toys in domestic play. The chemistry set serves here as a useful tool for measuring cultural change and lay engagement with chemistry.

  10. Wilmar Planning Tool, user guide

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

    This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell various scenario variables and control parameters are set, and export of model data from the input database, activation of the models, as well as import of model results to the output database are triggered from the shell. (au)

  11. Wilmar Planning Tool, user guide

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

    This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell various scenario variables and control parameters are set, and export of model data from the input database, activation of the models, as well as import of model results to the output database are triggered from the shell. (au)

  12. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis.

    Science.gov (United States)

    Bergeron, Mathieu; Lortie, Catherine L; Guitton, Matthieu J

    2015-01-01

    Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have been repeatedly suggested as possible tools to help the rehabilitation process, no systematic study had been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality settings for the treatment of this pathology. Evaluation of the symptoms is often not standardized. However, our results unveil a clear effect of virtual reality settings-based rehabilitation on patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), changing the perceived symptom handicap from moderate to mild impact on life. Furthermore, we detected a relationship between the duration of the exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should last at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, document putative side effects further, compare virtual reality to conventional physical therapy, and evaluate the economic costs/benefits of such strategies.

  13. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis

    Directory of Open Access Journals (Sweden)

    Mathieu Bergeron

    2015-01-01

    Full Text Available Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have been repeatedly suggested as possible tools to help the rehabilitation process, no systematic study had been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality settings for the treatment of this pathology. Evaluation of the symptoms is often not standardized. However, our results unveil a clear effect of virtual reality settings-based rehabilitation on patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), changing the perceived symptom handicap from moderate to mild impact on life. Furthermore, we detected a relationship between the duration of the exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should last at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, document putative side effects further, compare virtual reality to conventional physical therapy, and evaluate the economic costs/benefits of such strategies.

  14. Calculation of Coupled Vibroacoustics Response Estimates from a Library of Available Uncoupled Transfer Function Sets

    Science.gov (United States)

    Smith, Andrew; LaVerde, Bruce; Hunt, Ron; Fulcher, Clay; Towner, Robert; McDonald, Emmett

    2012-01-01

    The design and theoretical basis of a new database tool that quickly generates vibroacoustic response estimates using a library of transfer functions (TFs) is discussed. During the early stages of a launch vehicle development program, these response estimates can be used to provide vibration environment specifications to hardware vendors. The tool accesses TFs from a database, combines the TFs, and multiplies these by input excitations to estimate vibration responses. The database is populated with two sets of uncoupled TFs; the first set representing the vibration response of a bare panel, designated H^s, and the second set representing the response of the free-free component equipment by itself, designated H^c. For a particular configuration undergoing analysis, the appropriate H^s and H^c are selected and coupled to generate an integrated TF, designated H^(s+c). This integrated TF is then used with the appropriate input excitations to estimate vibration responses. This simple yet powerful tool enables a user to estimate vibration responses without directly using finite element models, so long as suitable H^s and H^c sets are defined in the database libraries. The paper discusses the preparation of the database tool and provides the assumptions and methodologies necessary to combine H^s and H^c sets into an integrated H^(s+c). An experimental validation of the approach is also presented.
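
    The abstract does not give the combination rule, but one standard frequency-based substructuring result for rigidly coupling two substructures through collocated interface FRFs is H^(s+c) = H^s - H^s (H^s + H^c)^(-1) H^s, applied at each frequency line. The sketch below applies that rule to hypothetical single-DOF panel and component FRFs (all parameters invented; the paper's actual combination method may differ):

```python
import numpy as np

def couple_frfs(Hs, Hc):
    """Frequency-based substructuring coupling, per frequency line:
    H = Hs - Hs @ inv(Hs + Hc) @ Hs (rigid, collocated interface)."""
    H = np.empty_like(Hs)
    for k in range(Hs.shape[0]):
        H[k] = Hs[k] - Hs[k] @ np.linalg.solve(Hs[k] + Hc[k], Hs[k])
    return H

f = np.arange(1.0, 201.0)                 # frequency lines (Hz)
w = 2.0 * np.pi * f

def sdof_frf(fn, zeta, m):
    """Receptance of a single-DOF oscillator, shaped (freq, 1, 1)."""
    wn = 2.0 * np.pi * fn
    return (1.0 / (m * (wn**2 - w**2 + 2j * zeta * wn * w)))[:, None, None]

Hs  = sdof_frf(60.0, 0.05, 5.0)           # bare-panel interface FRF set
Hc  = sdof_frf(120.0, 0.03, 1.0)          # free-free component FRF set
Hsc = couple_frfs(Hs, Hc)                 # integrated transfer function

# Response estimate: |H|^2 times a (flat, unit) input excitation PSD
response = np.abs(Hsc[:, 0, 0]) ** 2 * 1.0
print(f"peak response near {f[response.argmax()]:.0f} Hz")
```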

  15. Targets as a tool in health policy. Part I: Lessons learned

    NARCIS (Netherlands)

    van Herten, L. M.; Gunning-Schepers, L. J.

    2000-01-01

    This article reviews the start of the use of targets as a tool in health policy, summarises the fruitful uses and frequently heard objections, and proposes some conditions for successful health target setting. Targets as a tool in health policy are based on the 'management by objectives' approach.

  16. Effects of dividing attention on memory for declarative and procedural aspects of tool use.

    Science.gov (United States)

    Roy, Shumita; Park, Norman W

    2016-07-01

    Tool-related knowledge and skills are supported by a complex set of memory processes that are not well understood. Some aspects of tools are mediated by either declarative or procedural memory, while other aspects may rely on an interaction of both systems. Although motor skill learning is believed to be primarily supported by procedural memory, there is debate in the current literature regarding the role of declarative memory. Growing evidence suggests that declarative memory may be involved during early stages of motor skill learning, although findings have been mixed. In the current experiment, healthy, younger adults were trained to use a set of novel complex tools and were tested on their memory for various aspects of the tools. Declarative memory encoding was interrupted by dividing attention during training. Findings showed that dividing attention during training was detrimental for subsequent memory for tool attributes as well as accurate demonstration of tool use and tool grasping. However, dividing attention did not interfere with motor skill learning, suggesting that declarative memory is not essential for skill learning associated with tools.

  17. Automated innovative diagnostic, data management and communication tool, for improving malaria vector control in endemic settings.

    Science.gov (United States)

    Vontas, John; Mitsakakis, Konstantinos; Zengerle, Roland; Yewhalaw, Delenasaw; Sikaala, Chadwick Haadezu; Etang, Josiane; Fallani, Matteo; Carman, Bill; Müller, Pie; Chouaïbou, Mouhamadou; Coleman, Marlize; Coleman, Michael

    2016-01-01

    Malaria is a life-threatening disease that caused more than 400,000 deaths in sub-Saharan Africa in 2015. Mass prevention of the disease is best achieved by vector control, which heavily relies on the use of insecticides. Monitoring mosquito vector populations is an integral component of control programs and a prerequisite for effective interventions. Several individual methods are used for this task; however, there are obstacles to their uptake, as well as challenges in organizing, interpreting and communicating vector population data. The Horizon 2020 project "DMC-MALVEC" consortium will develop a fully integrated and automated multiplex vector-diagnostic platform (LabDisk) for characterizing mosquito populations in terms of species composition, Plasmodium infections and biochemical insecticide resistance markers. The LabDisk will be interfaced with a Disease Data Management System (DDMS), custom-made data management software that will collate and manage data from routine entomological monitoring activities, providing information in a timely fashion based on user needs and in a standardized way. Another key element will be ResistanceSim, a serious game on a modern ICT platform that communicates guidelines interactively and exemplifies good practices for the optimal use of interventions in the health sector. Use of this tool will teach operational end users the value of quality data (relevant, timely and accurate) for making informed decisions. The integrated system (LabDisk, DDMS & ResistanceSim) will be evaluated in four malaria-endemic countries (Cameroon, Ivory Coast, Ethiopia and Zambia), representative of settings in sub-Saharan Africa with different levels of endemicity and vector control challenges, to support informed decision-making in vector control and disease management.

  18. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  19. A course on Borel sets

    CERN Document Server

    Srivastava, S M

    1998-01-01

    The roots of Borel sets go back to the work of Baire [8]. He was trying to come to grips with the abstract notion of a function introduced by Dirichlet and Riemann. According to them, a function was to be an arbitrary correspondence between objects without giving any method or procedure by which the correspondence could be established. Since all the specific functions that one studied were determined by simple analytic expressions, Baire delineated those functions that can be constructed starting from continuous functions and iterating the operation of pointwise limit on a sequence of functions. These functions are now known as Baire functions. Lebesgue [65] and Borel [19] continued this work. In [19], Borel sets were defined for the first time. In his paper, Lebesgue made a systematic study of Baire functions and introduced many tools and techniques that are used even today. Among other results, he showed that Borel functions coincide with Baire functions. The study of Borel sets got an impetus from...

  20. Motif enrichment tool.

    Science.gov (United States)

    Blatti, Charles; Sinha, Saurabh

    2014-07-01

    The Motif Enrichment Tool (MET) provides an online interface that enables users to find major transcriptional regulators of their gene sets of interest. MET searches the appropriate regulatory region around each gene and identifies which transcription factor DNA-binding specificities (motifs) are statistically overrepresented. Motif enrichment analysis is currently available for many metazoan species including human, mouse, fruit fly, planaria and flowering plants. MET also leverages high-throughput experimental data such as ChIP-seq and DNase-seq from ENCODE and ModENCODE to identify the regulatory targets of a transcription factor with greater precision. The results from MET are produced in real time and are linked to a genome browser for easy follow-up analysis. Use of the web tool is free and open to all, and there is no login requirement. ADDRESS: http://veda.cs.uiuc.edu/MET/.
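
    The statistical core of motif overrepresentation testing is commonly a hypergeometric test: given how many genes genome-wide carry a motif, how surprising is the overlap with the query set? A minimal sketch with invented counts (MET's actual statistics and regulatory-region handling are more elaborate):

```python
from scipy.stats import hypergeom

def motif_enrichment_p(n_genome, n_with_motif, n_query, n_overlap):
    """P(overlap >= n_overlap) when n_query genes are drawn at random
    from a genome in which n_with_motif genes carry the motif."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_with_motif, n_query)

# Invented counts: 20,000 genes, 1,200 with the motif in their
# regulatory regions, 300 query genes, 45 of which carry the motif.
p = motif_enrichment_p(20_000, 1_200, 300, 45)
print(f"enrichment p-value: {p:.2e}")     # expected overlap is only ~18
```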

  1. Severe accident management guidelines tool

    International Nuclear Information System (INIS)

    Gutierrez Varela, Javier; Tanarro Onrubia, Augustin; Martinez Fanegas, Rafael

    2014-01-01

    Severe accident response is addressed by means of a great number of documents such as guidelines, calculation aids and diagnostic trees. The response methodology often requires the use of several documents at the same time, while Technical Support Centre members need to assess the appropriate set of equipment within the adequate mitigation strategies. In order to facilitate the response, TECNATOM has developed SAMG TOOL, initially named GGAS TOOL, an easy-to-use computer program that clearly improves and accelerates severe accident management. The software is designed with powerful features that allow the users to focus on the decision-making process. Consequently, SAMG TOOL significantly improves severe accident training, ensuring a better response in a real situation. The software is already installed in several Spanish Nuclear Power Plants, and trainees report that the methodology can be followed more easily with it, especially because guidelines, calculation aids, equipment information and strategy availability can be accessed immediately (authors)

  2. Validity and Reliability of Persian Version of Johns Hopkins Fall Risk Assessment Tool among Aged People

    Directory of Open Access Journals (Sweden)

    hadi hojati

    2018-04-01

    Full Text Available Background & Aim: It is crucial to identify aged patients at risk of falls in clinical settings. The Johns Hopkins Fall Risk Assessment Tool (JHFRAT) is one of the most widely applied international instruments for assessing elderly patients' risk of falls. The aim of this study was to evaluate the reliability and internal consistency of the JHFRAT. Methods & Materials: In this cross-sectional study, the WHO's standard translation/back-translation protocol was applied for validity assessment of the tool. Face and content validity of the tool, and its applicability in clinical settings, were confirmed by ten expert faculty members. In this pilot study, the inclusion criteria were being 60 or more years old, having been hospitalized within the 8 hours prior to assessment, and being in proper cognitive condition as assessed by the MMSE. The subjects of the study were 70 elderly patients (n=70) newly hospitalized in Shahroud Emam Hossein Hospital. Data were analyzed using SPSS software (version 16). Internal consistency of the tool was calculated by Cronbach's alpha. Results: According to the results of the study, the Persian version of the JHFRAT is a valid tool for application in clinical settings. The Persian version of the tool had a Cronbach's alpha of 0.733. Conclusion: Based on the findings of the current study, it can be concluded that the Persian version of the JHFRAT is a valid and reliable tool for assessing elderly patients on admission in any clinical setting.
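
    For reference, the internal-consistency statistic reported here, Cronbach's alpha, is computed as alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal Python illustration follows; the data are made up for the example and are not from the study:

        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = scale items."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]                         # number of items
            item_vars = items.var(axis=0, ddof=1)      # per-item variances
            total_var = items.sum(axis=1).var(ddof=1)  # variance of total score
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Hypothetical responses: 4 respondents, 3 items.
        scores = np.array([[3, 4, 3], [2, 2, 3], [4, 5, 4], [3, 3, 2]])
        print(round(cronbach_alpha(scores), 3))  # 0.857 for this toy data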

  3. Tools and Techniques for Basin-Scale Climate Change Assessment

    Science.gov (United States)

    Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.

    2012-12-01

    The Department of Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies to explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies could be most beneficial with application of recent scientific advances in climate projections, stochastic simulation, operational modeling and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components: Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov chain techniques. Resampling can also be conditioned on climate change projections, e.g., downscaled GCM projections, to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows modifying future demands by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or changes ramped over specified time periods. The resulting data are imported directly into the decision model. Different model files can represent infrastructure alternatives, and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executing changes in operations or other
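
    A rough sketch of the kind of non-parametric K-nearest-neighbor (KNN) bootstrap the Hydrology Simulator is described as using, simplified to a single annual-flow series; the function and its parameters are illustrative assumptions, not the actual tool:

        import numpy as np

        def knn_resample(flows, n_years, k=5, seed=None):
            """Generate one synthetic trace of annual flows by KNN bootstrap.
            flows: 1-D array of historical annual flows (chronological)."""
            rng = np.random.default_rng(seed)
            trace = [rng.choice(flows[:-1])]
            # Discrete kernel favoring closer neighbors (1/j weights).
            w = 1.0 / np.arange(1, k + 1)
            w /= w.sum()
            for _ in range(n_years - 1):
                # Find the k historical years whose flow is closest to the
                # current simulated value, then sample one of their successors.
                order = np.argsort(np.abs(flows[:-1] - trace[-1]))[:k]
                trace.append(flows[order[rng.choice(k, p=w)] + 1])
            return np.array(trace)

        synthetic = knn_resample(np.array([8.1, 5.4, 9.7, 6.3, 7.2, 10.5]), 20, k=3)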

  4. Accuracy in tangential breast treatment set-up

    International Nuclear Information System (INIS)

    Tienhoven, G. van; Lanson, J.H.; Crabeels, D.; Heukelom, S.; Mijnheer, B.J.

    1991-01-01

    To test the accuracy and reproducibility of the tangential breast treatment set-up used in The Netherlands Cancer Institute, a portal imaging study was performed in 12 patients treated for early stage breast cancer. With an on-line electronic portal imaging device (EPID), images were obtained of each patient in several fractions and compared with simulator films and with each other. In 5 patients multiple images (on average 7) per fraction were obtained to evaluate set-up variations due to respiratory movement. The central lung distance (CLD) and other set-up parameters varied within one fraction by about 1 mm (1 SD). The average variation of these parameters between fractions was about 2 mm (1 SD). The differences between simulator and treatment set-up over all patients and all fractions were on average 2-3 mm for the central beam edge to skin distance and the CLD. It can be concluded that the tangential breast treatment set-up is very stable and reproducible and that respiration does not have a significant influence on the treatment volume. The EPID appears to be an adequate tool for studies of treatment set-up accuracy such as this. (author). 35 refs.; 2 figs.; 3 tabs

  5. Electron beam weld parameter set development and cavity cost

    International Nuclear Information System (INIS)

    John Brawley; John Mammossor; Larry Philips

    1997-01-01

    Various methods have recently been considered for use in the cost-effective manufacturing of large numbers of niobium cavities. A method commonly assumed to be too expensive is the joining of half cells by electron beam welding (EBW), as has been done with multipurpose EBW equipment for producing small numbers of cavities at accelerator laboratories. The authors have begun to investigate the advantages that would be available if a single-purpose, task-specific EBW processing tool were used to produce cavities in a high-volume commercial-industrial context. For such a tool and context they have sought to define an EBW parameter set that is cost-effective not only in terms of per-cavity production cost, but also in terms of the minimization of quench-producing weld defects. That is, they define cavity cost-effectiveness to include both production and performance costs. For such an EBW parameter set, they have developed a set of ideal characteristics, produced and tested samples and a complete cavity, studied the weld-defect question, and obtained industrial estimates of cavity high-volume production costs. The investigation is ongoing. This paper reports preliminary findings

  6. Using Multiattribute Utility Theory as a Priority-Setting Tool in Human Services Planning.

    Science.gov (United States)

    Camasso, Michael J.; Dick, Janet

    1993-01-01

    The feasibility of applying multiattribute utility theory to the needs assessment and priority-setting activities of human services planning councils was studied in Essex County (New Jersey). Decision-making and information filtering processes are explored in the context of community planning. (SLD)

  7. Hypergraphs combinatorics of finite sets

    CERN Document Server

    Berge, C

    1989-01-01

    Graph Theory has proved to be an extremely useful tool for solving combinatorial problems in such diverse areas as Geometry, Algebra, Number Theory, Topology, Operations Research and Optimization. It is natural to attempt to generalise the concept of a graph, in order to attack additional combinatorial problems. The idea of looking at a family of sets from this standpoint took shape around 1960. In regarding each set as a "generalised edge" and in calling the family itself a "hypergraph", the initial idea was to try to extend certain classical results of Graph Theory such as the theorems of Turán and König. It was noticed that this generalisation often led to simplification; moreover, one single statement, sometimes remarkably simple, could unify several theorems on graphs. This book presents what seems to be the most significant work on hypergraphs.

  8. Use of a computerized medication shared decision making tool in community mental health settings: impact on psychotropic medication adherence.

    Science.gov (United States)

    Stein, Bradley D; Kogan, Jane N; Mihalyo, Mark J; Schuster, James; Deegan, Patricia E; Sorbero, Mark J; Drake, Robert E

    2013-04-01

    Healthcare reform emphasizes patient-centered care and shared decision-making. This study examined the impact on psychotropic adherence of a decision support center and computerized tool designed to empower and activate consumers prior to an outpatient medication management visit. Administrative data were used to identify 1,122 Medicaid-enrolled adults receiving psychotropic medication from community mental health centers over a two-year period from community mental health centers. Multivariate linear regression models were used to examine if tool users had higher rates of 180-day medication adherence than non-users. Older clients, Caucasian clients, those without recent hospitalizations, and those who were Medicaid-eligible due to disability had higher rates of 180-day medication adherence. After controlling for sociodemographics, clinical characteristics, baseline adherence, and secular changes over time, using the computerized tool did not affect adherence to psychotropic medications. The computerized decision tool did not affect medication adherence among clients in outpatient mental health clinics. Additional research should clarify the impact of decision-making tools on other important outcomes such as engagement, patient-prescriber communication, quality of care, self-management, and long-term clinical and functional outcomes.

  9. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  10. Decision-support tools for climate change mitigation planning

    DEFF Research Database (Denmark)

    Puig, Daniel; Aparcana Robles, Sandra Roxana

    This document describes three decision-support tools that can aid the process of planning climate change mitigation actions. The phrase 'decision-support tools' refers to science-based analytical procedures that facilitate the evaluation of planning options (individually or compared to alternative options) against a particular evaluation criterion or set of criteria. Most often decision-support tools are applied with the help of purpose-designed software packages and drawing on specialised databases. The evaluation criteria alluded to above define and characterise each decision-support tool. For example, in the case of life-cycle analysis, the evaluation criterion entails that the impacts of interest are examined across the entire life-cycle of the product under study, from extraction of raw materials to product disposal. Effectively, then, the choice of decision-support tool directs...

  11. Peak Wind Tool for General Forecasting

    Science.gov (United States)

    Barrett, Joe H., III

    2010-01-01

    The expected peak wind speed of the day is an important forecast element in the 45th Weather Squadron's (45 WS) daily 24-Hour and Weekly Planning Forecasts. The forecasts are used for ground and space launch operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45 WS also issues wind advisories for KSC/CCAFS when they expect wind gusts to meet or exceed 25 kt, 35 kt and 50 kt thresholds at any level from the surface to 300 ft. The 45 WS forecasters have indicated peak wind speeds are challenging to forecast, particularly in the cool season months of October - April. In Phase I of this task, the Applied Meteorology Unit (AMU) developed a tool to help the 45 WS forecast non-convective winds at KSC/CCAFS for the 24-hour period of 0800 to 0800 local time. The tool was delivered as a Microsoft Excel graphical user interface (GUI). The GUI displayed the forecast of peak wind speed, 5-minute average wind speed at the time of the peak wind, timing of the peak wind, and probability the peak speed would meet or exceed 25 kt, 35 kt and 50 kt. For the current task (Phase II), the 45 WS requested additional observations be used for the creation of the forecast equations by expanding the period of record (POR). Additional parameters were evaluated as predictors, including wind speeds between 500 ft and 3000 ft, static stability classification, Bulk Richardson Number, mixing depth, vertical wind shear, temperature inversion strength and depth, and wind direction. Using a verification data set, the AMU compared the performance of the Phase I and II prediction methods. Just as in Phase I, the tool was delivered as a Microsoft Excel GUI. The 45 WS requested the tool also be available in the Meteorological Interactive Data Display System (MIDDS). The AMU first expanded the POR by two years by adding tower observations, surface observations and CCAFS (XMR) soundings for the cool season months of March 2007 to April 2009. The POR was expanded

  12. ANALYSIS OF FORMING TREAD WHEEL SETS

    Directory of Open Access Journals (Sweden)

    Igor IVANOV

    2017-09-01

    Full Text Available This paper presents the results of a theoretical study of profile high-speed grinding (PHSG) for forming the tread of wheel sets during repair, instead of turning and mold-milling. Significant disadvantages of those methods are their low capacity to adapt the tool to the inhomogeneous structure of the wheel material, which leads to understated treatment regimes and difficulties in recovering wheel sets with thermal and mechanical defects. This study carried out modeling and analysis of the emerging cutting forces. The proposed algorithms describe the random occurrence of the components of the cutting forces when restoring the profile of wheel sets with an inhomogeneous material structure. To identify the statistical features of the randomly generated structures, the fractal dimension and the method of random additions were used. The multifractal spectrum formed is decomposed into monofractals by wavelet transform. The proposed method creates the preconditions for controlling the parameters of the machining process.

  13. The use and impact of quality of life assessment tools in clinical care settings for cancer patients, with a particular emphasis on brain cancer: insights from a systematic review and stakeholder consultations.

    Science.gov (United States)

    King, Sarah; Exley, Josephine; Parks, Sarah; Ball, Sarah; Bienkowska-Gibbs, Teresa; MacLure, Calum; Harte, Emma; Stewart, Katherine; Larkin, Jody; Bottomley, Andrew; Marjanovic, Sonja

    2016-09-01

    Patient-reported data are playing an increasing role in health care. In oncology, data from quality of life (QoL) assessment tools may be particularly important for those with limited survival prospects, where treatments aim to prolong survival while maintaining or improving QoL. This paper examines the use and impact of QoL measures in the health care of cancer patients within a clinical setting, particularly those with brain cancer. It also examines facilitators and challenges, and provides implications for policy and practice. We conducted a systematic literature review, 15 expert interviews and a consultation at an international summit. The systematic review found no relevant intervention studies specifically in brain cancer patients, and after expanding our search to include other cancers, 15 relevant studies were identified. The evidence on the effectiveness of using QoL tools was inconsistent for patient management, but somewhat more consistent in favour of improving patient-physician communication. Interviews identified unharnessed potential and growing interest in QoL tool use and associated challenges to address. Our findings suggest that the use of QoL tools in cancer patients may improve patient-physician communication and have the potential to improve care, but the tools are not currently widely used in clinical practice (in brain cancer or in some other cancer contexts), although they are in clinical trials. There is a need for further research and stakeholder engagement on how QoL tools can achieve most impact across cancer and patient contexts. There is also a need for policy, health professional, research and patient communities to strengthen information exchange and debate, support awareness raising and provide training on tool design, use and interpretation.

  14. Predicting the Abrasion Resistance of Tool Steels by Means of Neurofuzzy Model

    Directory of Open Access Journals (Sweden)

    Dragutin Lisjak

    2013-07-01

    Full Text Available This work considers the use of neurofuzzy set theory to estimate the abrasion wear resistance of steels based on chemical composition, heat treatment (austenitising temperature, quenchant and tempering temperature), hardness after hardening and at different tempering temperatures, and volume loss of materials according to ASTM G 65-94. Volume loss was tested, as a fuzzy data set, for the following groups of materials: carbon tool steels, cold work tool steels, hot work tool steels, and high-speed steels. The modelled adaptive neuro-fuzzy inference system (ANFIS) is compared to a statistical model of multivariable non-linear regression (MNLR). From the results it can be concluded that abrasion wear resistance can be estimated well for a steel whose volume loss is unknown, thus eliminating unnecessary testing.

  15. Health impact assessment in planning: Development of the design for health HIA tools

    International Nuclear Information System (INIS)

    Forsyth, Ann; Slotterback, Carissa Schively; Krizek, Kevin J.

    2010-01-01

    How can planners more systematically incorporate health concerns into practical planning processes? This paper describes a suite of health impact assessment tools (HIAs) developed specifically for planning practice. Taking an evidence-based approach the tools are designed to fit into existing planning activities. The tools include: a short audit tool, the Preliminary Checklist; a structured participatory workshop, the Rapid HIA; an intermediate health impact assessment, the Threshold Analysis; and a set of Plan Review Checklists. This description provides a basis for future work including assessing tool validity, refining specific tools, and creating alternatives.

  16. Predictive validity of the identification of seniors at risk screening tool in a German emergency department setting.

    Science.gov (United States)

    Singler, Katrin; Heppner, Hans Jürgen; Skutetzky, Andreas; Sieber, Cornel; Christ, Michael; Thiem, Ulrich

    2014-01-01

    The identification of patients at high risk for adverse outcomes [death, unplanned readmission to the emergency department (ED)/hospital, functional decline] plays an important role in emergency medicine. The Identification of Seniors at Risk (ISAR) instrument is one of the most commonly used and best-validated screening tools. To the authors' knowledge, there are so far no data on any screening tool for the identification of older patients at risk of a negative outcome in Germany. The objective was to evaluate the validity of the ISAR screening tool in a German ED. This was a prospective single-center observational cohort study in an ED of an urban university-affiliated hospital. Participants were 520 patients aged ≥75 years consecutively admitted to the ED. The German version of the ISAR screening tool was administered directly after triage of the patients. Follow-up telephone interviews to assess outcome variables were conducted 28 and 180 days after the index visit to the ED. The primary end point was death from any cause, hospitalization, recurrent ED visit, or change of residency into a long-term care facility by day 28 after the index ED visit. The mean age ± SD was 82.8 ± 5.0 years. According to ISAR, 425 patients (81.7%) scored ≥2 points, and 315 patients (60.5%) scored ≥3 points. The combined primary end point was observed in 250 of 520 patients (48.1%) on day 28 and in 260 patients (50.0%) on day 180. Using the continuous ISAR score, the area under the curve was 0.621 (95% confidence interval, CI 0.573-0.669) on day 28 and 0.661 (95% CI 0.615-0.708) on day 180. The German version of the ISAR screening tool acceptably identified elderly patients in the ED with an increased risk of a negative outcome. Using the cutoff ≥3 points instead of ≥2 points yielded better overall results.

  17. Sine-Bar Attachment For Machine Tools

    Science.gov (United States)

    Mann, Franklin D.

    1988-01-01

    Sine-bar attachment for collets, spindles, and chucks helps machinists set up quickly for precise angular cuts that require greater precision than provided by graduations of machine tools. Machinist uses attachment to index head, carriage of milling machine or lathe relative to table or turning axis of tool. Attachment accurate to 1 minute of arc, depending on length of sine bar and precision of gauge blocks in setup. Attachment installs quickly and easily on almost any type of lathe or mill. Requires no special clamps or fixtures, and eliminates many trial-and-error measurements. More stable than improvised setups and not jarred out of position readily.
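
    The geometry behind a sine-bar setup is a one-line calculation: for a bar of length L between roller centers, a gauge-block stack of height h sets the angle theta = arcsin(h/L). A small illustrative computation (generic machinist arithmetic, not taken from the article):

        import math

        def gauge_block_height(angle_deg, bar_length_in=5.0):
            """Gauge-block stack height needed to set a sine bar to a given angle."""
            return bar_length_in * math.sin(math.radians(angle_deg))

        # A 30-degree setup on a 5-inch sine bar needs a 2.5000-inch stack.
        print(f"{gauge_block_height(30.0):.4f}")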

  18. Designing Tools for Supporting User Decision-Making in e-Commerce

    Science.gov (United States)

    Sutcliffe, Alistair; Al-Qaed, Faisal

    The paper describes a set of tools designed to support a variety of user decision-making strategies. The tools are complemented by an online advisor so they can be adapted to different domains and users can be guided to adopt appropriate tools for different choices in e-commerce, e.g. purchasing high-value products, exploring product fit to users’ needs, or selecting products which satisfy requirements. The tools range from simple recommenders to decision support by interactive querying and comparison matrices. They were evaluated in a scenario-based experiment which varied the users’ task and motivation, with and without an advisor agent. The results show the tools and advisor were effective in supporting users and agreed with the predictions of ADM (adaptive decision making) theory, on which the design of the tools was based.

  19. Hubble Space Telescope EVA Power Ratchet Tool redesign

    Science.gov (United States)

    Richards, Paul W.; Park, Chan; Brown, Lee

    The Power Ratchet Tool (PRT) is a self-contained, power-driven, 3/8 inch drive ratchet wrench which will be used by astronauts during Extravehicular Activities (EVA). This battery-powered tool is controlled by a dedicated electronic controller. The PRT was flown during the Hubble Space Telescope (HST) Deployment Mission STS-31 to deploy the solar arrays if the automatic mechanisms failed. The PRT is currently intended for use during the first HST Servicing Mission STS-61 as a general purpose power tool. The PRT consists of three major components: the wrench, the controller, and the battery module. Fourteen discrete combinations of torque, turns, and speed may be programmed into the controller before the EVA. The crewmember selects the desired parameter profile by a switch mounted on the controller. The tool may also be used in the manual mode as a non-powered ratchet wrench. The power is provided by a silver-zinc battery module, which fits into the controller and is replaceable during an EVA. The original PRT did not meet the design specification of torque output and hours of operation. To increase efficiency and reliability the PRT underwent a redesign effort. The majority of this effort focused on the wrench. The original PRT drive train consisted of a low torque, high speed brushless DC motor, a face gear set, and a planocentric gear assembly. The total gear reduction was 300:1. The new PRT wrench consists of a low speed, high torque brushless DC motor, two planetary gear sets and a bevel gear set. The total gear reduction is now 75:1. A spline clutch has also been added to disengage the drive train in the manual mode. The design changes to the controller will consist of only those modifications necessary to accommodate the redesigned wrench.

  20. Hesitant fuzzy soft sets with application in multicriteria group decision making problems.

    Science.gov (United States)

    Wang, Jian-qiang; Li, Xin-E; Chen, Xiao-hong

    2015-01-01

    Soft sets have been regarded as a useful mathematical tool to deal with uncertainty. In recent years, many scholars have shown an intense interest in soft sets and have extended standard soft sets to intuitionistic fuzzy soft sets, interval-valued fuzzy soft sets, and generalized fuzzy soft sets. In this paper, hesitant fuzzy soft sets are defined by combining fuzzy soft sets with hesitant fuzzy sets, and some operations on hesitant fuzzy soft sets based on the Archimedean t-norm and Archimedean t-conorm are defined. Four aggregation operators, the HFSWA, HFSWG, GHFSWA, and GHFSWG operators, are also given. Based on these operators, a multicriteria group decision making approach with hesitant fuzzy soft sets is proposed. To demonstrate its accuracy and applicability, this approach is finally employed to calculate a numerical example.
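
    A minimal sketch of the hesitant fuzzy weighted averaging that operators such as HFSWA build on, assuming the standard algebraic t-norm/t-conorm pair; this is a simplification for illustration, not the paper's full Archimedean-family construction:

        from itertools import product

        def hf_weighted_average(hfes, weights):
            """Algebraic weighted average of hesitant fuzzy elements.
            hfes: list of hesitant fuzzy elements, each a set of memberships in [0, 1].
            weights: non-negative weights summing to 1."""
            result = set()
            # Every combination of one membership value per element contributes
            # the value 1 - prod((1 - gamma_i) ** w_i).
            for combo in product(*hfes):
                value = 1.0
                for gamma, w in zip(combo, weights):
                    value *= (1.0 - gamma) ** w
                result.add(round(1.0 - value, 6))
            return sorted(result)

        # Example: two criteria with hesitant memberships, equal weights.
        print(hf_weighted_average([{0.2, 0.4}, {0.6}], [0.5, 0.5]))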

  1. Autocatalytic sets in a partitioned biochemical network.

    Science.gov (United States)

    Smith, Joshua I; Steel, Mike; Hordijk, Wim

    2014-01-01

    In previous work, RAF theory has been developed as a tool for making theoretical progress on the origin of life question, providing insight into the structure and occurrence of self-sustaining and collectively autocatalytic sets within catalytic polymer networks. We present here an extension in which there are two "independent" polymer sets, where catalysis occurs within and between the sets, but there are no reactions combining polymers from both sets. Such an extension reflects the interaction between nucleic acids and peptides observed in modern cells and proposed forms of early life. We present theoretical work and simulations which suggest that the occurrence of autocatalytic sets is robust to the partitioned structure of the network. We also show that autocatalytic sets remain likely even when the molecules in the system are not polymers, and a low level of inhibition is present. Finally, we present a kinetic extension which assigns a rate to each reaction in the system, and show that identifying autocatalytic sets within such a system is an NP-complete problem. Recent experimental work has challenged the necessity of an RNA world by suggesting that peptide-nucleic acid interactions occurred early in chemical evolution. The present work indicates that such a peptide-RNA world could support the spontaneous development of autocatalytic sets and is thus a feasible alternative worthy of investigation.
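
    For orientation, the core RAF computation (for a single, unpartitioned network) can be sketched as an iterated reduction: repeatedly discard reactions that are not catalyzed by, or whose reactants are not reachable from, the food set, until a fixed point remains. A simplified sketch with assumed data structures, not the authors' code:

        def max_raf(reactions, food):
            """reactions: dict name -> (reactants, products, catalysts), each a set.
            food: set of freely available molecules.
            Returns the maximal RAF sub-network (possibly empty)."""
            active = dict(reactions)
            while True:
                # Closure: everything producible from the food set by active reactions.
                available = set(food)
                changed = True
                while changed:
                    changed = False
                    for ins, outs, _ in active.values():
                        if ins <= available and not outs <= available:
                            available |= outs
                            changed = True
                # Keep reactions whose reactants are reachable and that are catalyzed.
                kept = {name: r for name, r in active.items()
                        if r[0] <= available and r[2] & available}
                if kept == active:
                    return active
                active = kept

        # Toy example: two food-generated reactions that mutually catalyze each other.
        toy = {"r1": ({"a", "b"}, {"x"}, {"y"}), "r2": ({"c"}, {"y"}, {"x"})}
        print(sorted(max_raf(toy, food={"a", "b", "c"})))  # ['r1', 'r2']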

  2. PLOT3D Export Tool for Tecplot

    Science.gov (United States)

    Alter, Stephen

    2010-01-01

    The PLOT3D export tool for Tecplot solves the problem of modified data being impossible to output for use by another computational science solver. The PLOT3D Exporter add-on enables the use of the most commonly available visualization tools to engineers for output of a standard format. The exportation of PLOT3D data from Tecplot has far reaching effects because it allows for grid and solution manipulation within a graphical user interface (GUI) that is easily customized with macro language-based and user-developed GUIs. The add-on also enables the use of Tecplot as an interpolation tool for solution conversion between different grids of different types. This one add-on enhances the functionality of Tecplot so significantly, it offers the ability to incorporate Tecplot into a general suite of tools for computational science applications as a 3D graphics engine for visualization of all data. Within the PLOT3D Export Add-on are several functions that enhance the operations and effectiveness of the add-on. Unlike Tecplot output functions, the PLOT3D Export Add-on enables the use of the zone selection dialog in Tecplot to choose which zones are to be written by offering three distinct options - output of active, inactive, or all zones (grid blocks). As the user modifies the zones to output with the zone selection dialog, the zones to be written are similarly updated. This enables the use of Tecplot to create multiple configurations of a geometry being analyzed. For example, if an aircraft is loaded with multiple deflections of flaps, by activating and deactivating different zones for a specific flap setting, new specific configurations of that aircraft can be easily generated by only writing out specific zones. Thus, if ten flap settings are loaded into Tecplot, the PLOT3D Export software can output ten different configurations, one for each flap setting.

  3. Tool for Insider Threat Detection in Corporative Information Systems

    Directory of Open Access Journals (Sweden)

    Victor Sergeevich Vedeneev

    2014-02-01

    Full Text Available Systems and tools for insider threat detection are described. Different meanings of the term “insider”, types of insiders, examples of insider motivation, and typical insider actions are set out.

  4. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Science.gov (United States)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
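
    The central operation, a per-pixel ratio of two spectral bands thresholded into a small number of classes, is easy to illustrate with NumPy; this is a generic sketch, not the original C/XView implementation:

        import numpy as np

        def band_ratio_classes(cube, num, den, thresholds):
            """cube: 3-D array (bands, rows, cols); num/den: band indices.
            thresholds: ascending cut points splitting the ratio into classes."""
            denom = cube[den].astype(float)
            ratio = np.divide(cube[num], denom, out=np.zeros_like(denom),
                              where=denom != 0)
            # np.digitize maps each pixel to one of len(thresholds)+1 classes.
            return np.digitize(ratio, thresholds)

        # Example: 4-band synthetic image, ratio of band 3 to band 1, 3 classes.
        rng = np.random.default_rng(0)
        cube = rng.uniform(0.0, 1.0, size=(4, 64, 64))
        classes = band_ratio_classes(cube, 3, 1, thresholds=[0.8, 1.2])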

  5. Informed consent comprehension in African research settings.

    Science.gov (United States)

    Afolabi, Muhammed O; Okebe, Joseph U; McGrath, Nuala; Larson, Heidi J; Bojang, Kalifa; Chandramohan, Daniel

    2014-06-01

    Previous reviews on participants' comprehension of informed consent information have focused on developed countries. Experience has shown that ethical standards developed on Western values may not be appropriate for African settings where research concepts are unfamiliar. We undertook this review to describe how informed consent comprehension is defined and measured in African research settings. We conducted a comprehensive search involving five electronic databases: Medline, Embase, Global Health, EthxWeb and Bioethics Literature Database (BELIT). We also examined African Index Medicus and Google Scholar for relevant publications on informed consent comprehension in clinical studies conducted in sub-Saharan Africa. 29 studies satisfied the inclusion criteria; meta-analysis was possible in 21 studies. We further conducted a direct comparison of participants' comprehension on domains of informed consent in all eligible studies. Comprehension of key concepts of informed consent varies considerably from country to country and depends on the nature and complexity of the study. Meta-analysis showed that 47% of a total of 1633 participants across four studies demonstrated comprehension about randomisation (95% CI 13.9-80.9%). Similarly, 48% of 3946 participants in six studies had understanding about placebo (95% CI 19.0-77.5%), while only 30% of 753 participants in five studies understood the concept of therapeutic misconception (95% CI 4.6-66.7%). Measurement tools for informed consent comprehension were developed with little or no validation. Assessment of comprehension was carried out at variable times after disclosure of study information. No uniform definition of informed consent comprehension exists to form the basis for development of an appropriate tool to measure comprehension in African participants. Comprehension of key concepts of informed consent is poor among study participants across Africa. There is a vital need to develop a uniform definition for

  6. FREPA - A Set of Instruments for the Development of Plurilingual and Inter-/Transcultural Competences

    DEFF Research Database (Denmark)

    Daryai-Hansen, Petra Gilliyard; Schröder-Sura, Anna

    2012-01-01

    The article presents a description and instructions for use of a set of tools that seeks to facilitate learners’ continuous development and strengthen plurilingual and inter- /transcultural competences. These tools have been developed within the research project Framework of Reference for Plurali...... and defines the concept of pluralistic approaches. The FREPA tools will be presented by answering the question as to how the FREPA tools can been used to describe and develop transcultural competences. Finally, the recent perspectives of the FREPA project will be outlined....

  7. The development of an online decision support tool for organizational readiness for change.

    Science.gov (United States)

    Khan, Sobia; Timmings, Caitlyn; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Gheihman, Galina; Straus, Sharon E

    2014-05-10

    Much importance has been placed on assessing readiness for change as one of the earliest steps of implementation, but measuring it can be a complex and daunting task. Organizations and individuals struggle with how to reliably and accurately measure readiness for change. Several measures have been developed to help organizations assess readiness, but these are often underused due to the difficulty of selecting the right measure. In response to this challenge, we will develop and test a prototype of a decision support tool that is designed to guide individuals interested in implementation in the selection of an appropriate readiness assessment measure for their setting. A multi-phase approach will be used to develop the decision support tool. First, we will identify key measures for assessing organizational readiness for change from a recently completed systematic review. Included measures will be those developed for healthcare settings (e.g., acute care, public health, mental health) and that have been deemed valid and reliable. Second, study investigators and field experts will engage in a mapping exercise to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a stakeholder panel will be recruited and consulted to determine the feasibility and relevance of the selected measures using a modified Delphi process. Fourth, findings from the mapping exercise and stakeholder consultation will inform the development of a decision support tool that will guide users in appropriately selecting change readiness measures. Fifth, the tool will undergo usability testing. Our proposed decision support tool will address current challenges in the field of organizational change readiness by aiding individuals in selecting a valid and reliable assessment measure that is relevant to user needs and practice settings. We anticipate that implementers and researchers who use our tool will be more likely to conduct

  8. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Science.gov (United States)

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for

  10. Proposal of a trigger tool to assess adverse events in dental care.

    Science.gov (United States)

    Corrêa, Claudia Dolores Trierweiler Sampaio de Oliveira; Mendes, Walter

    2017-11-21

    The aim of this study was to propose a trigger tool for research of adverse events in outpatient dentistry in Brazil. The tool was elaborated in two stages: (i) to build a preliminary set of triggers, a literature review was conducted to identify the composition of trigger tools used in other areas of health and the principal adverse events found in dentistry; (ii) to validate the preliminarily constructed triggers a panel of experts was organized using the modified Delphi method. Fourteen triggers were elaborated in a tool with explicit criteria to identify potential adverse events in dental care, essential for retrospective patient chart reviews. Studies on patient safety in dental care are still incipient when compared to other areas of health care. This study intended to contribute to the research in this field. The contribution by the literature and guidance from the expert panel allowed elaborating a set of triggers to detect adverse events in dental care, but additional studies are needed to test the instrument's validity.

  11. RdTools: An Open Source Python Library for PV Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Deceglie, Michael G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nag, Ambarish [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Shinn, Adam [kWh Analytics

    2018-05-04

    RdTools is a set of Python tools for analysis of photovoltaic data. In particular, PV production data is evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher frequency data.
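
    The year-on-year degradation approach can be sketched generically: pair each normalized-energy value with the value one calendar year later and take the median of the resulting annual rates. The pandas sketch below illustrates the idea only; it does not use the RdTools library's actual API:

        import numpy as np
        import pandas as pd

        def year_on_year_rd(energy):
            """energy: pandas Series of normalized energy, daily DatetimeIndex.
            Returns the median year-on-year rate of change in %/year."""
            shifted = energy.copy()
            shifted.index = shifted.index - pd.DateOffset(years=1)
            # Align each day with the same calendar day one year later.
            paired = pd.concat([energy, shifted], axis=1,
                               keys=["now", "next"]).dropna()
            rates = (paired["next"] / paired["now"] - 1.0) * 100.0
            return np.median(rates)

        # Example: 2 years of synthetic daily data degrading at -0.5 %/year.
        idx = pd.date_range("2017-01-01", periods=730, freq="D")
        energy = pd.Series(1.0 - 0.005 * np.arange(len(idx)) / 365.0, index=idx)
        print(round(year_on_year_rd(energy), 2))  # approximately -0.5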

  12. RBT—A Tool for Building Refined Buneman Trees

    DEFF Research Database (Denmark)

    Besenbacher, Søren; Mailund; Westh-Nielsen, Lasse

    2005-01-01

    We have developed a tool implementing an efficient algorithm for refined Buneman tree reconstruction. The algorithm, which has the same complexity as the neighbour-joining method and the (plain) Buneman tree construction, enables refined Buneman tree reconstruction on large taxa sets.

  13. A stability tool for use within TELEGRIP

    International Nuclear Information System (INIS)

    Son, W.H.; Trinkle, J.C.

    1998-12-01

    During the assembly of a product, it is vital that the partially-completed assembly be stable. To guarantee this, one must ensure that contacts among the parts and the fixtures are sufficient to stabilize the assembly. Thus, it would be desirable to have an efficient method for testing the stability of an assembly and, if it is not stable, for generating a set of additional fixture contact points, known as fixels, that will stabilize it. One can apply this method to the safe handling of special nuclear material (SNM). Having these functionalities should help improve the safety and enhance the performance of SNM handling and storage operations, since methods are needed for gripping objects in a stable manner. Also, one may need a way to find a pit-holding fixture inserted into containers. In this paper, the authors present a stability tool, which they call Stab Tool, developed to test the stability of objects grasped by robotic hands, objects placed in fixtures, or sets of objects piled randomly on top of one another. Stab Tool runs on top of a commercial software package, TELEGRIP, which is used for geometry modeling and motion creation. The successful development of the stability tool depends strongly on TELEGRIP's ability to compute the distances between pairs of three-dimensional bodies in the simulated environment. The interbody distance computation tool takes advantage of the polyhedral representations of bodies used by TELEGRIP and of linear programming techniques to yield an efficient algorithm
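
    In the frictionless rigid-body setting, the stability question reduces to a linear-programming feasibility test: do non-negative contact force magnitudes exist whose net wrench balances the external load? A generic scipy sketch of that test follows; the wrench matrix G is an assumed input, and this is not the TELEGRIP/Stab Tool implementation:

        import numpy as np
        from scipy.optimize import linprog

        def is_stable(G, w_ext):
            """G: 6 x m wrench matrix, one column per contact normal.
            w_ext: external wrench (e.g., gravity). Stable if G f + w_ext = 0
            admits a solution with all contact force magnitudes f >= 0."""
            m = G.shape[1]
            res = linprog(c=np.zeros(m),          # pure feasibility problem
                          A_eq=G, b_eq=-w_ext,
                          bounds=[(0, None)] * m,
                          method="highs")
            return res.success

        # Toy case: a block on two supports under gravity; columns are the
        # (planar, padded to 6-D) unit normal wrenches at the two contacts.
        G = np.array([[0, 0], [0, 0], [1, 1], [0, 0], [-0.1, 0.1], [0, 0]], float)
        print(is_stable(G, w_ext=np.array([0, 0, -9.81, 0, 0, 0])))  # True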

  14. Discovering highly informative feature set over high dimensions

    KAUST Repository

    Zhang, Chongsheng; Masseglia, Florent; Zhang, Xiangliang

    2012-01-01

    For many textual collections, the number of features is often overly large. These features can be very redundant; it is therefore desirable to have a small, succinct, yet highly informative collection of features that describes the key characteristics of a dataset. Information theory is one such tool for obtaining this feature collection. With this paper, we mainly contribute to the improvement of efficiency for the process of selecting the most informative feature set over high-dimensional unlabeled data. We propose a heuristic theory for informative feature set selection from high-dimensional data. Moreover, we design data structures that enable us to compute the entropies of the candidate feature sets efficiently. We also develop a simple pruning strategy that eliminates the hopeless candidates at each forward selection step. We test our method through experiments on real-world data sets, showing that our proposal is very efficient. © 2012 IEEE.
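
    The forward-selection step the paper accelerates can be sketched naively: greedily add whichever feature most increases the joint entropy of the selected set. The sketch below omits the paper's data structures and pruning strategy:

        import numpy as np
        from collections import Counter

        def joint_entropy(data, cols):
            """Empirical joint entropy (bits) of the given columns of a 2-D array."""
            counts = np.array(list(Counter(map(tuple, data[:, cols])).values()),
                              dtype=float)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def select_features(data, k):
            """Greedy forward selection of k features maximizing joint entropy."""
            selected = []
            remaining = list(range(data.shape[1]))
            for _ in range(k):
                best = max(remaining,
                           key=lambda f: joint_entropy(data, selected + [f]))
                selected.append(best)
                remaining.remove(best)
            return selected

        rng = np.random.default_rng(1)
        data = rng.integers(0, 3, size=(200, 6))
        print(select_features(data, 2))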

  16. Advances in type-2 fuzzy sets and systems theory and applications

    CERN Document Server

    Mendel, Jerry; Tahayori, Hooman

    2013-01-01

    This book explores recent developments in the theoretical foundations and novel applications of general and interval type-2 fuzzy sets and systems, including: algebraic properties of type-2 fuzzy sets, geometric-based definition of type-2 fuzzy set operators, generalizations of the continuous KM algorithm, adaptiveness and novelty of interval type-2 fuzzy logic controllers, relations between conceptual spaces and type-2 fuzzy sets, type-2 fuzzy logic systems versus perceptual computers; modeling human perception of real world concepts with type-2 fuzzy sets, different methods for generating membership functions of interval and general type-2 fuzzy sets, and applications of interval type-2 fuzzy sets to control, machine tooling, image processing and diet.  The applications demonstrate the appropriateness of using type-2 fuzzy sets and systems in real world problems that are characterized by different degrees of uncertainty.

  17. BEAM EMITTANCE MEASUREMENT TOOL FOR CEBAF OPERATIONS

    International Nuclear Information System (INIS)

    Chevtsov, Pavel; Tiefenback, Michael

    2008-01-01

    A new software tool was created at Jefferson Lab to measure the emittance of the CEBAF electron beams. The tool consists of device control and data analysis applications. The device control application handles the work of wire scanners and writes their measurement results as well as the information about accelerator settings during these measurements into wire scanner data files. The data analysis application reads these files and calculates the beam emittance on the basis of a wire scanner data processing model. Both applications are computer platform independent but are mostly used on LINUX PCs recently installed in the accelerator control room. The new tool significantly simplifies beam emittance measurement procedures for accelerator operations and contributes to a very high availability of the CEBAF machine for the nuclear physics program at Jefferson Lab.

  18. GOrilla: a tool for discovery and visualization of enriched GO terms in ranked gene lists

    Directory of Open Access Journals (Sweden)

    Steinfeld Israel

    2009-02-01

    Full Text Available Abstract Background Since the inception of the GO annotation project, a variety of tools have been developed that support exploring and searching the GO database. In particular, a variety of tools that perform GO enrichment analysis are currently available. Most of these tools require as input a target set of genes and a background set and seek enrichment in the target set compared to the background set. A few tools also exist that support analyzing ranked lists. The latter typically rely on simulations or on union-bound correction for assigning statistical significance to the results. Results GOrilla is a web-based application that identifies enriched GO terms in ranked lists of genes, without requiring the user to provide explicit target and background sets. This is particularly useful in many typical cases where genomic data may be naturally represented as a ranked list of genes (e.g., by level of expression or of differential expression). GOrilla employs a flexible threshold statistical approach to discover GO terms that are significantly enriched at the top of a ranked gene list. Building on a complete theoretical characterization of the underlying distribution, called mHG, GOrilla computes an exact p-value for the observed enrichment, taking threshold multiple testing into account without the need for simulations. This enables rigorous statistical analysis of thousands of genes and thousands of GO terms in a matter of seconds. The output of the enrichment analysis is visualized as a hierarchical structure, providing a clear view of the relations between enriched GO terms. Conclusion GOrilla is an efficient GO analysis tool with unique features that make a useful addition to the existing repertoire of GO enrichment tools. GOrilla's unique features and advantages over other threshold free enrichment tools include rigorous statistics, fast running time and an effective graphical representation. GOrilla is publicly available at: http://cbl-gorilla.cs.technion.ac.il
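
    The flexible-threshold idea behind mHG can be sketched directly: scan every prefix of the ranked list and keep the best hypergeometric tail probability. The sketch below computes this raw mHG score with scipy; GOrilla's exact multiple-testing correction of that minimum is omitted:

        from scipy.stats import hypergeom

        def min_hypergeometric(ranked_flags):
            """ranked_flags: list of 0/1, with 1 marking genes annotated with the
            GO term, ordered best first. Returns the minimal tail p-value over
            all prefixes (the raw mHG score, before exact correction)."""
            N = len(ranked_flags)
            K = sum(ranked_flags)
            best, b = 1.0, 0
            for n, flag in enumerate(ranked_flags, start=1):
                b += flag
                # P(X >= b) for X ~ Hypergeom(N, K, n): enrichment of the prefix.
                best = min(best, hypergeom.sf(b - 1, N, K, n))
            return best

        # Example: annotated genes concentrated at the top of the ranking.
        print(min_hypergeometric([1, 1, 1, 0, 1, 0, 0, 0, 0, 0]))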

  19. Adapting a Technology-Based Implementation Support Tool for Community Mental Health: Challenges and Lessons Learned.

    Science.gov (United States)

    Livet, Melanie; Fixsen, Amanda

    2018-01-01

    With mental health services shifting to community-based settings, community mental health (CMH) organizations are under increasing pressure to deliver effective services. Despite availability of evidence-based interventions, there is a gap between effective mental health practices and the care that is routinely delivered. Bridging this gap requires availability of easily tailorable implementation support tools to assist providers in implementing evidence-based intervention with quality, thereby increasing the likelihood of achieving the desired client outcomes. This study documents the process and lessons learned from exploring the feasibility of adapting such a technology-based tool, Centervention, as the example innovation, for use in CMH settings. Mixed-methods data on core features, innovation-provider fit, and organizational capacity were collected from 44 CMH providers. Lessons learned included the need to augment delivery through technology with more personal interactions, the importance of customizing and integrating the tool with existing technologies, and the need to incorporate a number of strategies to assist with adoption and use of Centervention-like tools in CMH contexts. This study adds to the current body of literature on the adaptation process for technology-based tools and provides information that can guide additional innovations for CMH settings.

  20. Calculation and construction of a beam-transport system for polarized electrons

    International Nuclear Information System (INIS)

    Marschke, G.

    1987-09-01

    In the framework of the ELSA-SAPHIR project, a transfer channel between ELSA and the large-space detector SAPHIR was calculated and constructed. Existing optical elements were modified according to their application, and the missing racks were constructed and ordered for fabrication. Furthermore, the vacuum system was designed as a whole as well as in its single components. Starting from the architectural conditions and the optics to be realized, the coordinates of the elements were calculated as preconditions for the geodetic measurements and calibrations. It was shown that both for a polarized and for an unpolarized electron beam an optic corresponding to the requirements was realized up to an energy of 3.5 GeV. Under the given conditions, with the applied method of rotating the polarization vector and the geometrical preconditions, an acceptable longitudinal polarization was also reached up to 3.0 GeV. (orig./HSI)

  1. Characterizing Aerosols over Southeast Asia using the AERONET Data Synergy Tool

    Science.gov (United States)

    Giles, David M.; Holben, Brent N.; Eck, Thomas F.; Slutsker, Ilya; Welton, Ellsworth J.; Chin, Mian; Kucsera, Thomas; Schmaltz, Jeffery E.; Diehl, Thomas

    2007-01-01

    Biomass burning, urban pollution and dust aerosols have significant impacts on the radiative forcing of the atmosphere over Asia. In order to better quantify these aerosol characteristics, the Aerosol Robotic Network (AERONET) has established over 200 sites worldwide with an emphasis in recent years on the Asian continent, specifically Southeast Asia. A total of approximately 15 AERONET sun photometer instruments have been deployed to China, India, Pakistan, Thailand, and Vietnam. Sun photometer spectral aerosol optical depth measurements as well as microphysical and optical aerosol retrievals over Southeast Asia will be analyzed and discussed with supporting ground-based instrument, satellite, and model data sets, which are freely available via the AERONET Data Synergy tool at the AERONET web site (http://aeronet.gsfc.nasa.gov). This web-based data tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  2. Modeling a Decision Support Tool for Buildable and Sustainable Building Envelope Designs

    Directory of Open Access Journals (Sweden)

    Natee Singhaputtangkul

    2015-05-01

    Full Text Available Sustainability and buildability requirements in building envelope design have gained significant importance, yet there is a lack of an appropriate decision support system (DSS) that can help a building design team incorporate these requirements and manage their tradeoffs at once. The main objective of this study is to build such a tool to help a building design team take sustainability and buildability criteria into account when assessing building envelopes of high-rise residential buildings in Singapore. Literature reviews were conducted to compile a comprehensive set of sustainability and buildability criteria, and the tool was developed using a Quality Function Deployment (QFD) approach combined with fuzzy set theory, as sketched below. A building design team was engaged to test the tool, with the aim of evaluating its usefulness in managing the tradeoffs among the sustainability and buildability criteria. The results of a qualitative data analysis suggested that the tool allowed the design team to effectively find a balance among the tradeoffs when assessing multiple building envelope design alternatives. The main contributions of the tool are a more efficient assessment of building envelopes and more sustainable and buildable building envelope designs.
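
    The fuzzy-QFD scoring idea referenced above can be illustrated with a minimal sketch: criteria ratings are modeled as triangular fuzzy numbers, scaled by crisp weights, summed, and defuzzified to a single design score. All criteria names, weights and ratings below are hypothetical, not taken from the study.

```python
# Minimal sketch of a fuzzy-QFD style scoring step (illustrative only).

def tfn_scale(tfn, w):
    """Scale a triangular fuzzy number (a, b, c) by a crisp weight."""
    a, b, c = tfn
    return (a * w, b * w, c * w)

def tfn_add(x, y):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Hypothetical linguistic ratings of one envelope design against two criteria.
ratings = {"buildability": (5, 7, 9), "thermal_performance": (3, 5, 7)}
weights = {"buildability": 0.6, "thermal_performance": 0.4}

total = (0.0, 0.0, 0.0)
for criterion, tfn in ratings.items():
    total = tfn_add(total, tfn_scale(tfn, weights[criterion]))

print(f"crisp design score: {defuzzify(total):.2f}")
```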

  3. ncRNA-class Web Tool: Non-coding RNA feature extraction and pre-miRNA classification web tool

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Theofilatos, Konstantinos A.; Papadimitriou, Stergios; Tsakalidis, Athanasios K.; Likothanassis, Spiridon D.; Mavroudi, Seferina P.

    2012-01-01

    Until recently, it was commonly accepted that most genetic information is transacted by proteins. Recent evidence suggests that the majority of the genomes of mammals and other complex organisms are in fact transcribed into non-coding RNAs (ncRNAs), many of which are alternatively spliced and/or processed into smaller products. The analysis of non-coding RNA genes requires the calculation of several sequence-based, thermodynamic and structural features. Many independent tools have already been developed for the efficient calculation of such features, but to the best of our knowledge no integrative approach exists for this task. The most significant amount of existing work relates to the miRNA class of non-coding RNAs. MicroRNAs (miRNAs) are small non-coding RNAs that play a significant role in gene regulation, and their prediction is a challenging bioinformatics problem. The non-coding RNA feature extraction and pre-miRNA classification Web Tool (ncRNA-class Web Tool) is a publicly available web tool ( http://150.140.142.24:82/Default.aspx ) which provides a user-friendly and efficient environment for the calculation of a set of 58 sequence-based, thermodynamic and structural features of non-coding RNAs, plus a tool for the accurate prediction of miRNAs.
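
    To make the notion of sequence-level feature extraction concrete, the following sketch computes a handful of simple features (length, GC content, dinucleotide frequencies). It is illustrative only; the actual 58-feature set of ncRNA-class is not reproduced here.

```python
# Illustrative sequence-feature extraction for an RNA string.
from collections import Counter
from itertools import product

def sequence_features(rna: str) -> dict:
    rna = rna.upper().replace("T", "U")
    n = len(rna)
    feats = {"length": n,
             "gc_content": (rna.count("G") + rna.count("C")) / n}
    # Dinucleotide frequencies over the 16 possible ACGU pairs.
    pairs = Counter(rna[i:i + 2] for i in range(n - 1))
    for a, b in product("ACGU", repeat=2):
        feats[f"freq_{a}{b}"] = pairs[a + b] / (n - 1)
    return feats

print(sequence_features("GCAUGGCUAGCUAGGCU"))
```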

  4. PREP: Production and Reprocessing management tool for CMS

    International Nuclear Information System (INIS)

    Cossutti, F; Lenzi, P; Naziridis, N; Samyn, D; Stöckli, F

    2012-01-01

    The production of simulated samples for physics analysis at the LHC represents a considerable organizational challenge, because it requires the management of several thousand different workflows. The submission of a workflow to the grid-based computing infrastructure starts with the definition of the general characteristics of a given set of coherent samples (called a 'campaign'), down to the definition of the physics settings to be used for each sample corresponding to a specific process to be simulated, both at the hard-event-generation and detector-simulation level. In order to have organized control over the definition of the large number of MC samples needed by CMS, a dedicated management tool, called PREP, has been built. Its basic component is a database storing all the relevant information about each sample and the actions implied by workflow definition, approval and production. A web-based interface allows the database to be used by experts involved in production to trigger all the different actions needed, as well as by physicists involved in analyses to retrieve the relevant information. The tool is integrated through a set of dedicated APIs with the production agent and information storage utilities of CMS.

  5. Development of a music therapy assessment tool for patients in low awareness states.

    Science.gov (United States)

    Magee, Wendy L

    2007-01-01

    People in low awareness states following profound brain injury typically demonstrate subtle changes in functional behaviors which challenge the sensitivity of measurement tools. Failure to identify and measure changes in functioning can lead to misdiagnosis and withdrawal of treatment with this population. Thus, the development of tools which are sensitive to responsiveness is of central concern. As the auditory modality has been found to be particularly sensitive in identifying responses indicating awareness, a convincing case can be made for music therapy as a treatment medium. However, little has been recommended about protocols for intervention or tools for measuring patient responses within the music therapy setting. This paper presents the rationale for an assessment tool specifically designed to measure responses in the music therapy setting with patients who are diagnosed as minimally conscious or in a vegetative state. Developed over fourteen years as part of interdisciplinary assessment and treatment, the music therapy assessment tool for low awareness states (MATLAS) contains fourteen items which rate behavioral responses across a number of domains. The tool can provide important information for interdisciplinary assessment and treatment particularly in the auditory and communication domains. Recommendations are made for testing its reliability and validity through research.

  6. Pharmaceutical interventions on prescription problems in a Danish pharmacy setting

    DEFF Research Database (Denmark)

    Pottegård, Anton; Hallas, Jesper; Søndergaard, Jens

    2011-01-01

    International studies regarding pharmacists' interventions towards prescription problems produce highly variable results. The only peer-reviewed study in a Danish setting estimated an intervention rate of 2.3 per 1,000 prescriptions. With the introduction of a new tool for registration, we...

  7. OpenDOAR Policy tools and applications

    CERN Document Server

    CERN. Geneva; Van de Sompel, Herbert

    2007-01-01

    OpenDOAR conducted a survey of the world's repositories that showed two-thirds as having unusable or missing policies for content re-use and other issues. Such policies are essential for service providers to be able to develop innovative services that use the full potential of open access. OpenDOAR has developed a set of policy generator tools for repository administrators and is contacting administrators to advocate policy development. It is hoped that one outcome of this work will be some standardisation of policies in vocabulary and intent. Other developments include an OpenDOAR API. This presentation looks at the way the tools and API have been developed and the implications for their use.

  8. OpenDOAR Policy tools and applications

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    OpenDOAR conducted a survey of the world's repositories that showed two-thirds as having unusable or missing policies for content re-use and other issues. Such policies are essential for service providers to be able to develop innovative services that use the full potential of open access. OpenDOAR has developed a set of policy generator tools for repository administrators and is contacting administrators to advocate policy development. It is hoped that one outcome of this work will be some standardisation of policies in vocabulary and intent. Other developments include an OpenDOAR API. This presentation looks at the way the tools and API have been developed and the implications for their use.

  9. Sandia Generated Matrix Tool (SGMT) v. 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems with properties similar to Raven's Progressive Matrices (RPMs)™. The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs)™ are a widely used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.
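
    A toy generator conveys the separation of underlying structure from surface features. The relation and feature choices below are invented for illustration and are not SGMT's actual problem model.

```python
# Toy matrix-problem generator: each row keeps one shape (surface feature)
# while numerosity increases by one across columns (underlying relation).
import random

def generate_matrix(seed=None):
    rng = random.Random(seed)
    shapes = rng.sample(["circle", "square", "triangle", "star"], 3)
    base = rng.randint(1, 3)
    # Cell = (shape, count); the +1-per-column rule is the shared structure.
    return [[(shapes[row], base + col) for col in range(3)]
            for row in range(3)]

for row in generate_matrix(seed=42):
    print(row)
```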

  10. Principles for generation of time-dependent collimator settings during the LHC cycle

    CERN Document Server

    Bruce, R; Redaelli, S

    2011-01-01

    The settings of the LHC collimators have to be changed during the cycle of injection, ramp and squeeze to account for variations in the orbit, the beam size and the normalized distance to the beam center. We discuss the principles by which the settings are calculated and show a software tool that computes them as time-dependent functions from beam-based data and theoretical optics models.
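
    The basic relation behind such settings is that a collimator half-gap is expressed as a multiple of the local beam size. A minimal sketch follows, assuming the standard sigma = sqrt(beta * emittance_geometric) relation with the geometric emittance shrinking as 1/gamma during the ramp; the beta function, emittance and sigma-setting values are illustrative, not actual LHC settings.

```python
# Half-gap = n_sigma * sigma, with sigma = sqrt(beta * eps_norm / gamma).
import math

def half_gap_mm(n_sigma, beta_m, eps_norm_um, gamma_rel):
    eps_geom = eps_norm_um * 1e-6 / gamma_rel   # geometric emittance [m]
    sigma_m = math.sqrt(beta_m * eps_geom)      # local beam size [m]
    return n_sigma * sigma_m * 1e3              # half-gap [mm]

# Injection vs. top energy for a hypothetical collimator location.
for energy_gev in (450, 3500):
    gamma = energy_gev / 0.938  # proton rest mass ~0.938 GeV
    gap = half_gap_mm(n_sigma=5.7, beta_m=150.0, eps_norm_um=3.5,
                      gamma_rel=gamma)
    print(f"{energy_gev} GeV -> half-gap {gap:.2f} mm")
```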

  11. A Set of Web-based Tools for Integrating Scientific Research and Decision-Making through Systems Thinking

    Science.gov (United States)

    Currently, many policy and management decisions are made without considering the goods and services humans derive from ecosystems and the costs associated with protecting them. This approach is unlikely to be sustainable. Conceptual frameworks provide a tool for capturing, visual...

  12. Efficient Data Generation and Publication as a Test Tool

    Science.gov (United States)

    Einstein, Craig Jakob

    2017-01-01

    A tool to facilitate the generation and publication of test data was created to test the individual components of a command and control system designed to launch spacecraft. Specifically, this tool was built to ensure messages are properly passed between system components. The tool can also be used to test whether the appropriate groups have access (read/write privileges) to the correct messages. The messages passed between system components take the form of unique identifiers with associated values. These identifiers are alphanumeric strings that identify the type of message and the additional parameters that are contained within the message. The values that are passed with the message depend on the identifier. The data generation tool allows for the efficient creation and publication of these messages. A configuration file can be used to set the parameters of the tool and also specify which messages to pass.
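
    The identifier/value scheme lends itself to config-driven generation. The sketch below shows the idea under stated assumptions: the identifier names, configuration format and value types are hypothetical, and the print call stands in for publishing to the messaging layer.

```python
# Config-driven generation of identifier/value test messages (illustrative).
import json
import random

CONFIG = json.loads("""
{
  "messages": [
    {"identifier": "GSE.TEMP.SENSOR01", "type": "float",
     "min": 10.0, "max": 40.0},
    {"identifier": "GSE.VALVE.STATE02", "type": "enum",
     "values": ["OPEN", "CLOSED"]}
  ],
  "count": 3
}
""")

def generate(spec, rng):
    if spec["type"] == "float":
        value = rng.uniform(spec["min"], spec["max"])
    else:
        value = rng.choice(spec["values"])
    return {"identifier": spec["identifier"], "value": value}

rng = random.Random(0)
for _ in range(CONFIG["count"]):
    for spec in CONFIG["messages"]:
        print(generate(spec, rng))  # stand-in for publishing the message
```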

  13. Design Tools for Reconfigurable Hardware in Orbit (RHinO)

    Science.gov (United States)

    French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian

    2004-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space-effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions to the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of reconfigurable hardware in orbit, via an integrated design tool-suite aiming to reduce risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.

  14. Laparohysteroscopy in female infertility: A diagnostic cum therapeutic tool in Indian setting.

    Science.gov (United States)

    Puri, Suman; Jain, Dinesh; Puri, Sandeep; Kaushal, Sandeep; Deol, Satjeet Kaur

    2015-01-01

    To evaluate the role of laparohysteroscopy in female infertility and to study the effect of therapeutic procedures in achieving fertility. Patients with female infertility presenting to the outpatient Department of Obstetrics and Gynecology were evaluated over a period of 18 months. Fifty consenting subjects, excluding male-factor infertility, with normal hormonal profiles and no contraindication to laparoscopy underwent diagnostic laparoscopy and hysteroscopy. Statistical analysis: t-test. We studied 50 patients, comprising 24 (48%) cases of primary infertility and 26 (52%) of secondary infertility. The average duration of active married life for the 50 patients was between 8 and 9 years. In our study, the most commonly found pathologies were PCOD, endometriosis and tubal blockage. Eleven (28.2%) patients conceived after laparohysteroscopy followed by artificial reproductive techniques. This study demonstrates the benefit of laparohysteroscopy as a diagnostic and therapeutic tool in patients with primary and secondary infertility. We were able to achieve a conception rate of 28.2%.

  15. Software tools for microprocessor based systems

    International Nuclear Information System (INIS)

    Halatsis, C.

    1981-01-01

    After a short review of the hardware and/or software tools for the development of single-chip, fixed-instruction-set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer and architecture levels. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also cover extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities, we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  16. On some classical problems of descriptive set theory

    International Nuclear Information System (INIS)

    Kanovei, Vladimir G; Lyubetskii, Vasilii A

    2003-01-01

    The centenary of P.S. Novikov's birth provides an inspiring motivation to present, with full proofs and from a modern standpoint, the presumably definitive solutions of some classical problems in descriptive set theory which were formulated by Luzin [Lusin] and, to some extent, even earlier by Hadamard, Borel, and Lebesgue and relate to regularity properties of point sets. The solutions of these problems began in the pioneering works of Aleksandrov [Alexandroff], Suslin [Souslin], and Luzin (1916-17) and evolved in the fundamental studies of Goedel, Novikov, Cohen, and their successors. Main features of this branch of mathematics are that, on the one hand, it is an ordinary mathematical theory studying natural properties of point sets and functions and rather distant from general set theory or intrinsic problems of mathematical logic like consistency or Goedel's theorems, and on the other hand, it has become a subject of applications of the most subtle tools of modern mathematical logic

  17. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the overall design approach.

  18. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  19. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence e.g. ATR for aiding them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase the mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information i.e. scenario conditions like class type and position of targets is necessary for the optimal adaptation of the ATR method. In Summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation by simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that the most promising tool was benchmarked according to several criteria e.g. thermal emission model, sensor model, targets model, non-radiometric image features etc., resulting in a

  20. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    Science.gov (United States)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 130159. No. of bytes in distributed program, including test data, etc.: 24687455. Distribution format: tar.gz. Programming language: C++. Computer: any computer capable of running an executable produced by a g++ compiler (version 4.6.3). Operating system: Linux, Unix OSes. RAM: 1073741824 bytes. Classification: 3. Nature of problem: evaluate the quality of a protein structural model, taking into account the possible "a priori" knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model. Solution method: the contact potential scores the occurrence of any given type of residue pair in 5 possible
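
    For readers unfamiliar with statistical potentials of this kind, the following generic sketch shows the common -log(observed/expected) construction for scoring residue-residue contacts. The residue classes, frequencies and contacts are invented for illustration and do not reproduce BACH++'s actual parameterization.

```python
# Generic knowledge-based contact potential sketch:
# pair score = -log(observed frequency / expected frequency).
import math

observed = {("HYD", "HYD"): 0.30, ("HYD", "POL"): 0.15, ("POL", "POL"): 0.10}
expected = {("HYD", "HYD"): 0.20, ("HYD", "POL"): 0.25, ("POL", "POL"): 0.12}

def pair_score(a, b):
    key = tuple(sorted((a, b)))
    return -math.log(observed[key] / expected[key])

# Score a model as the sum over its residue-residue contacts
# (lower = more native-like under this convention).
contacts = [("HYD", "HYD"), ("HYD", "POL"), ("POL", "POL")]
print(sum(pair_score(a, b) for a, b in contacts))
```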

  1. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  2. Counting SET-free sets

    OpenAIRE

    Harman, Nate

    2016-01-01

    We consider the following counting problem related to the card game SET: How many $k$-element SET-free sets are there in an $n$-dimensional SET deck? Through a series of algebraic reformulations and reinterpretations, we show the answer to this question satisfies two polynomiality conditions.

  3. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing testing thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
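
    The core of keyword-based set analysis is straightforward: invert the protein-to-keyword mapping and intersect the resulting protein sets. A minimal sketch with invented annotations follows.

```python
# Group a protein set by shared annotation keywords and inspect intersections.
annotations = {
    "P1": {"kinase", "membrane"},
    "P2": {"kinase", "nucleus"},
    "P3": {"kinase", "membrane"},
    "P4": {"transporter", "membrane"},
}

keyword_to_proteins = {}
for protein, keywords in annotations.items():
    for kw in keywords:
        keyword_to_proteins.setdefault(kw, set()).add(protein)

for kw, proteins in sorted(keyword_to_proteins.items()):
    print(kw, sorted(proteins))

# Intersection: proteins that are both kinases and membrane-associated.
print(keyword_to_proteins["kinase"] & keyword_to_proteins["membrane"])
```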

  4. Community-based field implementation scenarios of a short message service reporting tool for lymphatic filariasis case estimates in Africa and Asia.

    Science.gov (United States)

    Mableson, Hayley E; Martindale, Sarah; Stanton, Michelle C; Mackenzie, Charles; Kelly-Hope, Louise A

    2017-01-01

    Lymphatic filariasis (LF) is a neglected tropical disease (NTD) targeted for global elimination by 2020. Currently there is considerable international effort to scale up morbidity management activities in endemic countries; however, there remains a need for rapid, cost-effective methods and adaptable tools for obtaining estimates of people presenting with clinical manifestations of LF, namely lymphoedema and hydrocele. The mHealth tool 'MeasureSMS-Morbidity' allows health workers in endemic areas to use their own mobile phones to send clinical information in a simple format using short message service (SMS). The experience gained through programmatic use of the tool in five endemic countries across a diversity of settings in Africa and Asia is used here to present implementation scenarios that are suitable for adapting the tool for use in a range of different programmatic, endemic, demographic and health system settings. A checklist of five key factors and sub-questions was used to determine and define specific community-based field implementation scenarios for using the MeasureSMS-Morbidity tool in a range of settings. These factors included: (I) tool feasibility (acceptability; community access and ownership); (II) LF endemicity (high; low prevalence); (III) population demography (urban; rural); (IV) health system structure (human resources; community access); and (V) integration with other diseases (co-endemicity). Based on experiences in Bangladesh, Ethiopia, Malawi, Nepal and Tanzania, four implementation scenarios were identified as suitable for using the MeasureSMS-Morbidity tool for searching and reporting LF clinical case data across a range of programmatic, endemic, demographic and health system settings. These include: (I) urban, high endemic setting with two-tier reporting; (II) rural, high endemic setting with one-tier reporting; (III) rural, high endemic setting with two-tier reporting; and (IV) low-endemic, urban and rural setting with one
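
    Server-side parsing of such reports is the technical heart of an SMS tool. The record does not specify the actual MeasureSMS-Morbidity message layout, so the field names and format below are assumptions made purely for illustration.

```python
# Parse a hypothetical SMS clinical report of the form
# "<patient id> <age> <sex> <condition> <stage>".
import re

SMS_PATTERN = re.compile(
    r"(?P<pid>\w+)\s+(?P<age>\d+)\s+(?P<sex>[MF])\s+"
    r"(?P<condition>LYMPH|HYDRO)\s+(?P<stage>[1-7])"
)

def parse_report(text: str):
    match = SMS_PATTERN.fullmatch(text.strip().upper())
    return match.groupdict() if match else None

print(parse_report("pt042 35 F LYMPH 3"))
print(parse_report("bad message"))  # -> None, flagged for follow-up
```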

  5. Empirical comparison of web-based antimicrobial peptide prediction tools.

    Science.gov (United States)

    Gabere, Musa Nur; Noble, William Stafford

    2017-07-01

    Antimicrobial peptides (AMPs) are innate immune molecules that exhibit activities against a range of microbes, including bacteria, fungi, viruses and protozoa. Recent increases in microbial resistance against current drugs have led to a concomitant increase in the need for novel antimicrobial agents. Over the last decade, a number of AMP prediction tools have been designed and made freely available online. These AMP prediction tools show potential to discriminate AMPs from non-AMPs, but the relative quality of the predictions produced by the various tools is difficult to quantify. We compiled two sets of AMP and non-AMP peptides, separated into three categories: antimicrobial, antibacterial and bacteriocins. Using these benchmark data sets, we carried out a systematic evaluation of ten publicly available AMP prediction methods. Among the six general AMP prediction tools (ADAM, CAMPR3(RF), CAMPR3(SVM), MLAMP, DBAASP and AMPA), we find that CAMPR3(RF) provides a statistically significant improvement in performance, as measured by the area under the receiver operating characteristic (ROC) curve, relative to the other five methods. Surprisingly, for antibacterial prediction, the original AntiBP method significantly outperforms its successor, AntiBP2, on one benchmark dataset. The two bacteriocin prediction tools, BAGEL3 and BACTIBASE, both provide very good performance, and BAGEL3 outperforms its predecessor, BACTIBASE, on the larger of the two benchmarks. Supplementary data are available at Bioinformatics online.
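
    The benchmarking step reduces to computing ROC AUC per tool on a shared labeled set. The sketch below shows that computation on fabricated scores; the labels and score distributions are synthetic stand-ins, not data from the paper.

```python
# Compare two hypothetical prediction tools by ROC AUC on one labeled set.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)            # 1 = AMP, 0 = non-AMP
tool_a = labels * 0.6 + rng.normal(0, 0.4, 200)  # stronger separation
tool_b = labels * 0.3 + rng.normal(0, 0.4, 200)  # weaker separation

for name, scores in [("tool A", tool_a), ("tool B", tool_b)]:
    print(name, "AUC =", round(roc_auc_score(labels, scores), 3))
```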

  6. Automated Video Analysis of Non-verbal Communication in a Medical Setting.

    Science.gov (United States)

    Hart, Yuval; Czerniak, Efrat; Karnieli-Miller, Orit; Mayo, Avraham E; Ziv, Amitai; Biegon, Anat; Citron, Atay; Alon, Uri

    2016-01-01

    Non-verbal communication plays a significant role in establishing good rapport between physicians and patients and may influence aspects of patient health outcomes. It is therefore important to analyze non-verbal communication in medical settings. Current approaches to measuring non-verbal interactions in medicine employ coding by human raters. Such tools are labor intensive and hence limit the scale of possible studies. Here, we present an automated video analysis tool for non-verbal interactions in a medical setting. We test the tool using videos of subjects who interact with an actor portraying a doctor. The actor interviews the subjects following one of two scripted scenarios: in the first, the actor shows minimal engagement with the subject; the second includes active listening and attentiveness to the subject. We analyze the cross-correlation in the total kinetic energy of the two people in the dyad, and also characterize the frequency spectrum of their motion. We find large differences in interpersonal motion synchrony and entrainment between the two performance scenarios. The active listening scenario shows more synchrony and more symmetric followership than the other scenario. Moreover, the active listening scenario shows more high-frequency motion, termed jitter, which has recently been suggested to be a marker of followership. The present approach may be useful for analyzing physician-patient interactions in terms of synchrony and dominance in a range of medical settings.
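
    The synchrony measure described is, at its core, a lagged normalized cross-correlation between two kinetic-energy time series. A minimal sketch on synthetic data follows; the lag structure and noise levels are invented, and the real tool derives the energy signals from video.

```python
# Normalized cross-correlation between two motion-energy time series.
import numpy as np

rng = np.random.default_rng(1)
doctor = rng.normal(size=500)
# Synthetic patient signal that partially follows the doctor 10 frames later.
patient = 0.7 * np.roll(doctor, 10) + 0.3 * rng.normal(size=500)

def norm_xcorr(x, y, lag):
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    if lag > 0:
        x, y = x[lag:], y[:-lag]
    elif lag < 0:
        x, y = x[:lag], y[-lag:]
    return float(np.mean(x * y))

lags = range(-30, 31)
best = max(lags, key=lambda l: norm_xcorr(doctor, patient, l))
print("peak correlation at lag", best,
      "=", round(norm_xcorr(doctor, patient, best), 3))
```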

  7. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an engine condition monitoring system. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to a dataset which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA approach in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.

  8. Indico central - events organisation, ergonomics and collaboration tools integration

    International Nuclear Information System (INIS)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, which is the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  9. Indico central - events organisation, ergonomics and collaboration tools integration

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas, E-mail: jose.benito.gonzalez@cern.ch, E-mail: jose.pedro.ferreira@cern.ch, E-mail: thomas.baron@cern.ch [CERN IT-UDS-AVC, 1211 Geneve 23 (Switzerland)]

    2010-04-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, which is the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  10. Indico Central - Events Organisation, Ergonomics and Collaboration Tools Integration

    CERN Document Server

    Gonzalez Lopez, J B; Baron, T; CERN. Geneva. IT Department

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, which is the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  11. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LC-MS dataset computations. The library, written using modern C++ techniques and design principles and supporting a variety of platforms with native compilers, contains readers and writers for the mzML data format. The software has been released under the Apache v2 license specifically to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  12. Qualitative interviewing: methodological challenges in Arab settings.

    Science.gov (United States)

    Hawamdeh, Sana; Raigangar, Veena

    2014-01-01

    To explore some of the main methodological challenges faced by interviewers in Arab settings, particularly during interviews with psychiatric nurses. Interviews are a tool used commonly in qualitative research. However, the cultural norms and practices of interviewees must be considered to ensure that an appropriate interviewing style is used, a good interviewee-interviewer relationship formed and consent for participation obtained sensitively. A study to explore the nature of psychiatric nurses' practices that used unstructured interviews. This is a methodology paper that discusses a personal experience of addressing many challenges that are specific to qualitative interviewing in Arab settings, supported by literature on the topic. Suggestions for improving the interview process to make it more culturally sensitive are provided and recommendations for future research are made. Openness, flexibility and a reflexive approach by the researcher can help manage challenges in Arab settings. Researchers should allow themselves to understand the cultural elements of a population to adapt interviewing methods with the aim of generating high quality qualitative research.

  13. Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit

    Science.gov (United States)

    French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory

    2005-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event-upset (SEU) emulator, a persistence analysis tool, and a half-latch removal tool for Xilinx Virtex-II devices have been created. Research is underway on a persistence mitigation tool and on multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development, and preliminary test results are positive.

  14. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
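
    LFQbench itself is an R package; the Python sketch below only illustrates the two metric families it reports for hybrid proteome samples: precision as the coefficient of variation across replicates, and accuracy as the deviation of measured log-ratios from the known spike-in ratio. All intensities are fabricated.

```python
# Precision (CV) and accuracy (log-ratio bias) for a 2:1 hybrid sample.
import numpy as np

rng = np.random.default_rng(2)
expected_log2_ratio = 1.0  # assumed 2:1 composition between samples A and B

# Fabricated intensities: 100 proteins x 3 replicates per sample.
prot_a = rng.lognormal(mean=10.0, sigma=0.1, size=(100, 3))
prot_b = prot_a / 2 ** expected_log2_ratio * rng.lognormal(0, 0.1, (100, 3))

cv = prot_a.std(axis=1, ddof=1) / prot_a.mean(axis=1)
log_ratios = np.log2(prot_a.mean(axis=1) / prot_b.mean(axis=1))

print("median CV (precision):", round(float(np.median(cv)), 3))
print("median log2-ratio bias (accuracy):",
      round(float(np.median(log_ratios - expected_log2_ratio)), 3))
```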

  15. Planning Tools For Estimating Radiation Exposure At The National Ignition Facility

    International Nuclear Information System (INIS)

    Verbeke, J.; Young, M.; Brereton, S.; Dauffy, L.; Hall, J.; Hansen, L.; Khater, H.; Kim, S.; Pohl, B.; Sitaraman, S.

    2010-01-01

    A set of computational tools was developed to help estimate and minimize potential radiation exposure to workers from material activation in the National Ignition Facility (NIF). AAMI (Automated ALARA-MCNP Interface) provides an efficient, automated mechanism to perform the series of calculations required to create dose rate maps for the entire facility with minimal manual user input. NEET (NIF Exposure Estimation Tool) is a web application that combines the information computed by AAMI with a given shot schedule to compute and display the dose rate maps as a function of time. AAMI and NEET are currently used as work planning tools to determine stay-out times for workers following a given shot or set of shots, and to help in estimating integrated doses associated with performing various maintenance activities inside the target bay. Dose rate maps of the target bay were generated following a low-yield 10^16 D-T shot and will be presented in this paper.
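
    Combining per-shot activation with a schedule reduces, in the simplest case, to summing decaying contributions from each shot. The sketch below assumes a single dominant nuclide with a hypothetical 15-hour half-life and invented per-shot dose-rate contributions; the real tools use full MCNP-based activation maps rather than one decay constant.

```python
# Dose rate at one location as a sum of decaying per-shot contributions.
import math

HALF_LIFE_H = 15.0                  # hypothetical dominant nuclide
LAMBDA = math.log(2) / HALF_LIFE_H  # decay constant [1/h]

# (shot time [h], initial dose-rate contribution at this location [uSv/h])
shots = [(0.0, 12.0), (24.0, 12.0), (30.0, 25.0)]

def dose_rate(t_hours):
    return sum(d0 * math.exp(-LAMBDA * (t_hours - t0))
               for t0, d0 in shots if t_hours >= t0)

for t in (31, 48, 96):
    print(f"t = {t:3d} h: {dose_rate(t):6.2f} uSv/h")
```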

  16. Citizen's Charter in a primary health-care setting of Nepal: An accountability tool or a "mere wall poster"?

    Science.gov (United States)

    Gurung, Gagan; Gauld, Robin; Hill, Philip C; Derrett, Sarah

    2018-02-01

    Despite some empirical findings on the usefulness of citizen's charters for awareness of rights and services, there is a dearth of literature about charter implementation and impact on health service delivery in low-income settings. This study gauges the level of awareness of the Charter within Nepal's primary health-care (PHC) system and explores its perceived impact and the factors affecting its implementation. Using a case study design, a quantitative survey was administered to 400 participants from 22 of 39 PHC facilities in the Dang District to gauge awareness of the Charter. Additionally, qualitative interviews with 39 key informants were conducted to explore the perceived impact of the Charter and factors affecting its implementation. Few service users (15%) were aware of the existence of the Charter. Among these, a greater proportion were literate, and there were also differences according to ethnicity and occupational group. The Charter was usually not properly displayed and had been implemented with no prior public consultation. It contained information that raised awareness of health facility services, particularly among the more educated public, but had limited potential for increasing transparency and holding service providers accountable to citizens. Proper display, consultation with stakeholders, orientation or training, educational factors, follow-up and monitoring, and provision of sanctions were all lacking, negatively influencing the implementation of the Charter. Poor implementation and low public awareness of the Charter limit its usefulness. Provision of sanctions and consultation with citizens in Charter development are needed to expand the scope of charters from information brochures to tools for accountability.

  17. Setting priorities for ambient air quality objectives

    International Nuclear Information System (INIS)

    2004-10-01

    Alberta has ambient air quality objectives in place for several pollutants, toxic substances and other air quality parameters. A process is in place to determine whether additional air quality objectives are required or whether existing objectives should be changed. In order to identify the highest-priority substances that may require an ambient air quality objective to protect ecosystems and public health, a rigorous, transparent and cost-effective priority-setting methodology is required. This study reviewed, analyzed and assessed successful priority-setting techniques used by other jurisdictions. It proposed an approach for setting ambient air quality objective priorities that integrates the concerns of stakeholders with Alberta Environment requirements. A literature and expert review were used to examine existing priority-setting techniques used in other jurisdictions. An analysis process was developed to identify the strengths and weaknesses of the various techniques and their ability to take into account the complete pathway between chemical emissions and damage to human health or the environment. The key strengths and weaknesses of each technique were identified. Based on the analysis, the most promising technique was the Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI). Several considerations for using TRACI to help set priorities for ambient air quality objectives were also presented. 26 refs., 8 tabs., 4 appendices

  18. MatchingTools: A Python library for symbolic effective field theory calculations

    Science.gov (United States)

    Criado, Juan C.

    2018-06-01

    MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
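
    For orientation, the following is the standard textbook tree-level matching relation that such a procedure automates (generic EFT material, not MatchingTools syntax): a heavy scalar S of mass M coupled linearly to a light-field operator O is replaced by an effective contact interaction.

```latex
% Heavy scalar S integrated out at tree level via its equation of motion.
\mathcal{L} = \tfrac{1}{2}(\partial_\mu S)^2 - \tfrac{1}{2}M^2 S^2
              - g\, S\, \mathcal{O}
\;\;\longrightarrow\;\;
S \simeq -\frac{g\,\mathcal{O}}{M^2} + \mathcal{O}\!\left(\frac{\partial^2}{M^2}\right)
\;\;\Rightarrow\;\;
\mathcal{L}_{\text{eff}} = \frac{g^2}{2M^2}\,\mathcal{O}^2 + \dots
```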

  19. Challenges in Getting Building Performance Monitoring Tools for Everyday Use: User Experiences with A New Tool

    Directory of Open Access Journals (Sweden)

    Heikki Ihasalo

    2014-05-01

    Full Text Available There is a need for building performance monitoring because it is common for buildings not to perform as intended. A number of advanced tools for this purpose have been developed over the last few decades. However, these tools have not been widely adopted in real use. The new tool presented here utilizes building automation data, transforms the data into a set of performance metrics, and is capable of visualizing building performance from energy, indoor conditions, and HVAC (heating, ventilation and air conditioning) system perspectives. The purpose of this paper is to study users' perceptions of the use of the tool. The research method was semi-structured interviews. Although the users were satisfied with the solution in general, it was not taken into operative use. The main challenges with the use of the solution were related to accessibility, trust, and management practices. The interviewees were struggling to manage numerous information systems and therefore had problems finding the solution and authenticating to it. Not all interviewees fully trusted the solution, since they did not entirely understand what the performance metrics meant or because the solution had limitations in assessing building performance. Management practices are needed to support the performance measurement philosophy.

  20. Internet MEMS design tools based on component technology

    Science.gov (United States)

    Brueck, Rainer; Schumer, Christian

    1999-03-01

    The micro-electromechanical systems (MEMS) industry in Europe is characterized by small and medium-sized enterprises specialized in products that solve problems in specific domains like medicine, automotive sensor technology, etc. In this field of business, the technology-driven design approach known from microelectronics is not appropriate. Instead, each design problem aims at its own specific technology to be used for the solution. The variety of technologies at hand, like Si-surface, Si-bulk, LIGA, laser and precision engineering, requires a huge set of different design tools to be available. No single SME can afford to hold licenses for all these tools. This calls for a new and flexible way of designing, implementing and distributing design software. The Internet provides a flexible manner of offering software access, along with methodologies for flexible licensing, e.g. on a pay-per-use basis. New communication technologies like ADSL, TV cable or satellites as carriers promise to offer a bandwidth sufficient even for interactive tools with graphical interfaces in the near future. INTERLIDO is an experimental tool suite for process specification and layout verification for lithography-based MEMS technologies to be accessed via the Internet. The first version provides a Java implementation, even including a graphical editor for process specification. Currently, a new version is brought into operation that is based on JavaBeans component technology. JavaBeans offers the possibility to realize independent interactive design assistants, like a design-rule checking assistant, a process consistency checking assistant, a technology definition assistant, a graphical editor assistant, etc., that may reside distributed over the Internet, communicating via Internet protocols. Each potential user is thus able to configure his own dedicated version of a design tool set tailored to the requirements of the current problem to be solved.

  1. The use of machine learning and nonlinear statistical tools for ADME prediction.

    Science.gov (United States)

    Sakiyama, Yojiro

    2009-02-01

    Absorption, distribution, metabolism and excretion (ADME)-related failure of drug candidates is a major issue for the pharmaceutical industry today. Prediction of ADME by in silico tools has now become an inevitable paradigm for reducing cost and enhancing efficiency in pharmaceutical research. Recently, machine learning as well as nonlinear statistical tools have been widely applied to predict routine ADME end points. To achieve accurate and reliable predictions, it is a prerequisite to understand the concepts, mechanisms and limitations of these tools. Here, we have devised a small synthetic nonlinear data set to help understand the mechanism of machine learning by 2D visualisation. We applied six machine learning methods to four different data sets: the naive Bayes classifier, classification and regression trees, random forests, Gaussian processes, support vector machines and k-nearest neighbours. The results demonstrated that ensemble learning and kernel machines displayed greater prediction accuracy than classical methods, irrespective of data set size. The importance of interaction with the engineering field is also addressed. The results described here provide insights into the mechanism of machine learning, which will enable appropriate usage in the future.
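
    A small experiment in the spirit of the paper's synthetic nonlinear benchmark can be reproduced with standard tooling: several of the named classifiers compared on a 2-D "two moons" data set. The data set and accuracy numbers below are illustrative, not the paper's.

```python
# Compare several classifier families on a synthetic nonlinear data set.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "naive Bayes": GaussianNB(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
    "SVM (RBF kernel)": SVC(),
    "k-NN": KNeighborsClassifier(),
}
for name, model in models.items():
    print(name, round(model.fit(X_tr, y_tr).score(X_te, y_te), 3))
```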

  2. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 2 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  3. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 1 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  4. Correlation of microstructure and fatigue crack growth resistance in Ti-6Al-4V alloy

    CSIR Research Space (South Africa)

    Masete, Stephen

    2016-10-01

    Full Text Available Opal 450 mounting press. The specimens were ground and polished using an ATM Saphir 550 polisher and were thereafter cleaned in the ultrasonic bath using ethanol for about 10 minutes and then air dried. Etching was done using Kroll's reagent (1 ml HF...

  5. Sustainability assessment in the 21. century. Tools, trends and applications. Symposium abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    Focus on the sustainability of products and corporations has been increasing over the last decade. New market trends develop, engendering new tools and application areas with the purpose of increasing sustainability, thus setting new demands for industry and academia. The 2012 SETAC LCA Case Study Symposium focuses on the experiences gained in industry and academia with the application of LCA and of new tools for sustainability assessment. These tools may relate to environmental 'footprint' assessments, such as carbon, water or chemical footprints, as well as life-cycle-oriented tools for assessing other dimensions of sustainability. (LN)

  6. Terminology tools: state of the art and practical lessons.

    Science.gov (United States)

    Cimino, J J

    2001-01-01

    As controlled medical terminologies evolve from simple code-name-hierarchy arrangements into rich, knowledge-based ontologies of medical concepts, increased demands are placed on both the developers and users of the terminologies. In response, researchers have begun developing tools to address their needs. The aims of this article are to review previous work done to develop these tools and then to describe work done at Columbia University and New York Presbyterian Hospital (NYPH). Researchers working with the Systematized Nomenclature of Medicine (SNOMED), the Unified Medical Language System (UMLS), and NYPH's Medical Entities Dictionary (MED) have created a wide variety of terminology browsers, editors and servers to facilitate creation, maintenance and use of these terminologies. Although much work has been done, no generally available tools have yet emerged. Consensus on requirements for tool functions, especially terminology servers, is emerging. Tools at NYPH have been used successfully to support the integration of clinical applications and the merger of health care institutions. Significant advancement has occurred over the past fifteen years in the development of sophisticated controlled terminologies and the tools to support them. The tool set at NYPH provides a case study to demonstrate one feasible architecture.

  7. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  8. Family history tools in primary care: does one size fit all?

    Science.gov (United States)

    Wilson, B J; Carroll, J C; Allanson, J; Little, J; Etchegary, H; Avard, D; Potter, B K; Castle, D; Grimshaw, J M; Chakraborty, P

    2012-01-01

    Family health history (FHH) has potential value in many health care settings. This review discusses the potential uses of FHH information in primary care and the need for tools to be designed accordingly. We developed a framework in which the attributes of FHH tools are mapped against these different purposes. It contains 7 attributes mapped against 5 purposes. In considering different FHH tool purposes, it is apparent that different attributes become more or less important, and that tools for different purposes require different implementation and evaluation strategies. The context in which a tool is used is also relevant to its effectiveness. For FHH tools, it is unlikely that 'one size fits all', although appreciation of different purposes, users and contexts should facilitate the development of different applications from single FHH platforms. Copyright © 2012 S. Karger AG, Basel.

  9. Development of the Operational Events Groups Ranking Tool

    International Nuclear Information System (INIS)

    Simic, Zdenko; Banov, Reni

    2014-01-01

    Both because of complexity and ageing, facilities like nuclear power plants require feedback from operating experience in order to further improve safety and operation performance. That is the reason why significant effort is dedicated to operating experience feedback. This paper describes the specification and development of a software tool for ranking operating events. A robust and consistent way of selecting the most important events for detailed investigation is important because it is not feasible, or even useful, to investigate all of them. Development of the tool is based on comprehensive characterisation of events and methodical prioritization. This includes a rich set of event parameters which allows top-level preliminary analysis, different ways of grouping events, and even evaluation of how uncertainty propagates to the ranking results. One distinct feature of the implemented method is that the user (i.e., an expert) can determine how important a particular ranking parameter is through pairwise comparison of the parameters. For demonstration and usability of the tool it is crucial that a sample database is also created. For a useful analysis, the whole set of events for 5 years was selected and characterised. Based on the preliminary results, this tool seems valuable for a new preliminary perspective on the data as a whole, and especially for the identification of event groups which should have priority in more detailed assessment. The results consist of different informative views on the importance of event groups, together with related sensitivity and uncertainty results. This presents a valuable tool for improving the overall picture of specific operating experience and also for helping identify the most important event groups for further assessment. It is clear that completeness and consistency of the input data characterisation are very important to get a full and valuable importance ranking. The method and tool development described in this paper is part of a continuous effort of
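
    The abstract states that experts set parameter importance through pairwise comparison, which suggests an Analytic Hierarchy Process (AHP) style weighting; the tool's actual algorithm is not given, so the sketch below is only a plausible illustration, with an invented comparison matrix and invented parameter names.

```python
# Hypothetical AHP-style weighting from expert pairwise comparisons.
# A[i][j] expresses how much more important parameter i is than parameter j.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # e.g. safety significance vs. the other two
    [1/3, 1.0, 2.0],   # e.g. event frequency
    [1/5, 1/2, 1.0],   # e.g. economic impact
])

# The principal eigenvector of the comparison matrix yields the weights.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w /= w.sum()           # normalize (this also fixes the eigenvector's sign)
print("parameter weights:", np.round(w, 3))
```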

  10. Mindfulness: An Underused Tool for Deepening Music Understanding

    Science.gov (United States)

    Falter, H. Ellie

    2016-01-01

    Music teachers aim to deepen their students' music understanding. An underused tool for doing so is incorporating mindful practice into music teaching. Through discussing research, examples from the classroom, and steps for incorporating mindful practices in lesson planning, the author hopes to illustrate its potential benefits and set music…

  11. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Full Text Available Data collection and evaluation of that data is crucial for effective quality management, and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality, based on the local policy of the given health care facility and on the nurse’s actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e. categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes some educational tools to promote unified reporting of data by nurses. A description of this software and some educational and learning components of the tool is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion is focused on the usability of the online reporting tool and possible further development of the tool.

  12. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  13. Quality assurance tool for organ at risk delineation in radiation therapy using a parametric statistical approach.

    Science.gov (United States)

    Hui, Cheukkai B; Nourzadeh, Hamidreza; Watkins, William T; Trifiletti, Daniel M; Alonso, Clayton E; Dutta, Sunil W; Siebers, Jeffrey V

    2018-02-26

    To develop a quality assurance (QA) tool that identifies inaccurate organ at risk (OAR) delineations. The QA tool computed volumetric features from prior OAR delineation data from 73 thoracic patients to construct a reference database. All volumetric features of the OAR delineation are computed in three-dimensional space. Volumetric features of a new OAR are compared with respect to those in the reference database to discern delineation outliers. A multicriteria outlier detection system warns users of specific delineation outliers based on combinations of deviant features. Fifteen independent experimental sets, including automatic, propagated, and clinically approved manual delineation sets, were used for verification. The verification OARs included manipulations to mimic common errors. Three experts reviewed the experimental sets to identify and classify errors, first without the QA tool and then, 1 week later, with it. In the cohort of manual delineations with manual manipulations, the QA tool detected 94% of the mimicked errors. Overall, it detected 37% of the minor and 85% of the major errors. The QA tool improved reviewer error detection sensitivity from 61% to 68% for minor errors (P = 0.17), and from 78% to 87% for major errors (P = 0.02). The QA tool assists users to detect potential delineation errors. QA tool integration into clinical procedures may reduce the frequency of inaccurate OAR delineation, and potentially improve safety and quality of radiation treatment planning. © 2018 American Association of Physicists in Medicine.
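
    The exact volumetric features and decision thresholds are not spelled out in the abstract; the sketch below illustrates the reference-database idea with assumed features (volume, surface area, sphericity) and an assumed 3-sigma warning rule.

```python
# Illustrative sketch of a reference-database outlier check for one OAR.
# Feature names, values and the 3-sigma threshold are assumptions.
import numpy as np

# Rows = prior approved delineations; columns = (volume_cc, surface_mm2, sphericity).
reference = np.array([
    [35.1, 2900.0, 0.61],
    [40.2, 3100.0, 0.58],
    [33.8, 2750.0, 0.63],
])

mu = reference.mean(axis=0)
sigma = reference.std(axis=0, ddof=1)

def delineation_warnings(features, names=("volume", "surface", "sphericity")):
    """Warn on features deviating strongly from the reference database."""
    z = (np.asarray(features) - mu) / sigma
    return [f"{n}: z = {zi:+.1f}" for n, zi in zip(names, z) if abs(zi) > 3.0]

print(delineation_warnings([80.0, 5200.0, 0.31]))  # a likely outlier delineation
```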

  14. A CLASSIC FRAMEWORK OF ONLINE MARKETING TOOLS

    Directory of Open Access Journals (Sweden)

    Popa Adela Laura

    2015-07-01

    Full Text Available The present paper starts from the assumption that there is a tendency, especially among practitioners, to largely conflate the concepts of online marketing and online advertising, considering that most online marketing tools address value communication and promotion. This observation prompted us to try to delineate the categories of online marketing tools according to the traditional areas of marketing activity. The paper therefore presents online marketing tools based on a different vision than the literature identified so far: it groups the online marketing tools by the key components of marketing activity and presents, for each, software tools that support it. The way in which the analysis of online marketing tools was addressed is new and could be useful for defining a structured vision of the field. The paper aims both to analyze concepts specific to online marketing, and especially to delineate categories of online marketing tools based on the key areas of marketing, such as value creation, value delivery, value communication/promotion, customer relationship management and marketing research. To achieve the goal set for this paper we considered it useful to address the issue from a dual perspective: from the perspective of the academic literature - books, studies found in scientific databases - which deals with the topic of online marketing and online marketing tools; and from the perspective of practitioners - studies posted on the Internet by specialists in the field, and the analysis of websites of companies providing online marketing services. The intention was to combine the vision of theorists with that of practitioners in tackling the field of online marketing and online marketing tools. In order to synthesize the information presented in this paper, we also conducted a visual representation of the categories of online

  15. Setting-related influences on physical inactivity of older adults in residential care settings: a review.

    Science.gov (United States)

    Douma, Johanna G; Volkers, Karin M; Engels, Gwenda; Sonneveld, Marieke H; Goossens, Richard H M; Scherder, Erik J A

    2017-04-28

    Despite the detrimental effects of physical inactivity for older adults, aged residents of residential care settings especially may spend much time in inactive behavior. This may be partly due to their poorer physical condition; however, there may also be other, setting-related factors that influence the amount of inactivity. The aim of this review was to examine setting-related factors (including the social and physical environment) that may contribute to the amount of older adults' physical inactivity in a wide range of residential care settings (e.g., nursing homes, assisted care facilities). Five databases were systematically searched for eligible studies, using the key words 'inactivity', 'care facilities', and 'older adults', including their synonyms and MeSH terms. Additional studies were selected from references used in articles included from the search. Based on specific eligibility criteria, a total of 12 studies were included. Quality of the included studies was assessed using the Mixed Methods Appraisal Tool (MMAT). Based on studies using different methodologies (e.g., interviews and observations), and of different quality (assessed quality range: 25-100%), we report several aspects related to the physical environment and caregivers. Factors of the physical environment that may be related to physical inactivity included, among others, the environment's compatibility with the abilities of a resident, the presence of equipment, the accessibility, security, comfort, and aesthetics of the environment/corridors, and possibly the presence of some specific areas. Caregiver-related factors included staffing levels, the available time, and the amount and type of care being provided. Inactivity levels in residential care settings may be reduced by improving several features of the physical environment and with the help of caregivers. Intervention studies could be performed in order to gain more insight into causal effects of improving setting-related factors on

  16. DengueTools: innovative tools and strategies for the surveillance and control of dengue.

    Science.gov (United States)

    Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane

    2012-01-01

    Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.

  17. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis

    OpenAIRE

    Bergeron, Mathieu; Lortie, Catherine L.; Guitton, Matthieu J.

    2015-01-01

    Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have been repeatedly suggested as possible tools to help the rehabilitation process, no systematic study has been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and prot...

  18. Splendidly blended: a machine learning set up for CDU control

    Science.gov (United States)

    Utzny, Clemens

    2017-06-01

    As machine learning and artificial intelligence continue to grow in importance in the context of internet related applications, they are still in their infancy when it comes to process control within the semiconductor industry. Especially the branch of mask manufacturing presents a challenge to the concepts of machine learning, since the business process intrinsically induces pronounced product variability against the background of small plate numbers. In this paper we present the architectural set up of a machine learning algorithm which successfully deals with the demands and pitfalls of mask manufacturing. A detailed motivation of this basic set up is given, followed by an analysis of its statistical properties. The machine learning set up for mask manufacturing involves two learning steps: an initial step identifies and classifies the basic global CD patterns of a process; these results form the basis for the extraction of an optimized training set via balanced sampling. A second learning step uses this training set to obtain the local as well as global CD relationships induced by the manufacturing process. Using two production motivated examples, we show how this approach is flexible and powerful enough to deal with the exacting demands of mask manufacturing. In one example we show how dedicated covariates can be used in conjunction with increased spatial resolution of the CD map model in order to deal with pathological CD effects at the mask boundary. The other example shows how the model set up enables strategies for dealing with tool-specific CD signature differences. In this case the balanced sampling enables a process control scheme which allows usage of the full tool park within the specified tight tolerance budget. Overall, this paper shows that the current rapid development of machine learning algorithms can be successfully used within the context of semiconductor manufacturing.
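
    The paper does not disclose its algorithms, so the sketch below only mimics the two-step idea on toy data: cluster the global CD patterns, then draw an equal-sized sample from each pattern class so that rare patterns are not drowned out by high-volume products. The use of k-means and the cluster count are assumptions.

```python
# Toy illustration of "classify global CD patterns, then sample balanced".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cd_maps = rng.normal(size=(500, 64))       # flattened per-plate CD maps (toy)

# Step 1: identify basic global CD pattern classes.
labels = KMeans(n_clusters=4, random_state=0, n_init=10).fit_predict(cd_maps)

# Step 2: balanced sampling -- equal draws per pattern class.
per_class = 50
train_idx = np.concatenate([
    rng.choice(np.where(labels == k)[0],
               size=min(per_class, int(np.sum(labels == k))), replace=False)
    for k in range(4)
])
training_set = cd_maps[train_idx]
print(training_set.shape)
```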

  19. An approach and a tool for setting sustainable energy retrofitting strategies referring to the 2010 EP

    Directory of Open Access Journals (Sweden)

    Charlot-Valdieu, C.

    2011-10-01

    Full Text Available The 2010 EPBD asks for an economic and social analysis in order to preserve social equity and to promote innovation and building productivity. This is possible with a life cycle energy cost (LCEC) analysis, such as with the SEC (Sustainable Energy Cost) model, whose bottom-up approach begins with a building typology that includes inhabitants. The analysis of representative buildings then includes the identification of a technico-economical optimum and of energy retrofitting scenarios for each retrofitting programme. Extrapolation to the whole building stock makes it possible to set up the strategy and to identify the means needed for reaching the objectives. SEC is a decision aid tool for optimising sustainable energy retrofitting strategies for buildings at territorial and patrimonial scales inside a sustainable development approach towards the factor 4. Various versions of the SEC model are now available, for housing and for tertiary buildings.

    The 2010 European directive on the energy performance of buildings requires an economic and social analysis, with the aim of preserving social equity, promoting innovation and strengthening productivity in construction. This is possible with an extended whole-life cost analysis, and in particular with the SEC model. The bottom-up analysis carried out with SEC is based on a building/user typology and on the analysis of representative buildings: identification of the technico-economic optimum and elaboration of scenarios before extrapolating to the whole building stock. SEC is a decision aid tool for developing territorial or patrimonial energy retrofitting strategies. Several versions of the model exist: for residential buildings (single-family and multi-family, public and private) and for tertiary buildings.

  1. Clinical Guide to Music Therapy in Physical Rehabilitation Settings

    Science.gov (United States)

    Wong, Elizabeth

    2004-01-01

    Elizabeth Wong, MT-BC presents tools and information designed to arm the entry-level music therapist (or an experienced MT-BC new to rehabilitation settings) with basic knowledge and materials to develop or work in a music therapy program treating people with stroke, brain injury, and those who are ventilator dependent. Ms. Wong offers goals and…

  2. Fleet Tools; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-04-01

    From beverage distributors to shipping companies and federal agencies, industry leaders turn to the National Renewable Energy Laboratory (NREL) to help green their fleet operations. Cost, efficiency, and reliability are top priorities for fleets, and NREL partners know the lab’s portfolio of tools can pinpoint fuel efficiency and emissions-reduction strategies that also support the operational bottom line. NREL is one of the nation’s foremost leaders in medium- and heavy-duty vehicle research and development (R&D) and the go-to source for credible, validated transportation data. NREL developers have drawn on this expertise to create tools grounded in the real-world experiences of commercial and government fleets. Operators can use this comprehensive set of technology- and fuel-neutral tools to explore and analyze equipment and practices, energy-saving strategies, and other operational variables to ensure meaningful performance, financial, and environmental benefits.

  3. A new design of automatic vertical drilling tool

    Directory of Open Access Journals (Sweden)

    Yanfeng Ma

    2015-09-01

    Full Text Available In order to effectively improve penetration rates and enhance wellbore quality for vertical wells, a new Automatic Vertical Drilling Tool (AVDT) based on an Eccentric Braced Structure (EBS) is designed. Applying the operating principle of rotary steerable drilling, the AVDT adds an offset gravity block automatic inclination-sensing mechanism. When the hole deviates, the tool uses the eccentric moment produced by the gravity of the offset gravity block to control the bearing of the guide force, so that well straightening is achieved. The nominal size of the AVDT is designed as 215.9 mm; the sizes of the other major components, including the offset angle of the EBS, are worked out from the results of theoretical analysis. This paper aims to introduce the structure, operating principle and theoretical analysis of the AVDT, and to describe the parameter settings of its key components.

  4. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is realized in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied in the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems, and in the diagnosis of brain glioma.

  5. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for use within Google Maps, for earthquake research. We demonstrate this server based and online platform (developed with PHP, JavaScript, MySQL) with the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis on earthquake data using Google Maps and to plot various seismicity graphs. The tool box has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN) which allows studying earthquake clustering and earthquake cluster shift within the segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, as well as calculation of the Gutenberg-Richter 'b' value. What is novel for the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
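
    The 'b' referred to above is the Gutenberg-Richter b-value. A standard estimator is Aki's (1965) maximum-likelihood formula; the plain-Python sketch below illustrates that formula on an invented catalog (the platform itself is PHP-based, and its exact implementation is not shown here).

```python
# Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965), toy catalog.
import math

def b_value(magnitudes, completeness_mag):
    """b = log10(e) / (mean magnitude - completeness magnitude)."""
    m = [x for x in magnitudes if x >= completeness_mag]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - completeness_mag)

catalog = [2.1, 2.4, 3.0, 2.2, 2.8, 3.5, 2.3, 4.1, 2.6, 2.9]
print(f"b = {b_value(catalog, completeness_mag=2.0):.2f}")
```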

  6. Basic performance metrics of in-line inspection tools

    Energy Technology Data Exchange (ETDEWEB)

    Timashev, Sviatoslav A. [Russian Academy of Sciences (Russian Federation). Ural Branch. Science and Engineering Center

    2003-07-01

    The paper discusses current possibilities and drawbacks of in-line inspection (ILI) in detecting, identifying, locating and sizing of all types of defects in oil and gas pipelines. A full set of consistent and universal ILI tool performance metrics is constructed. A holistic methodology that extracts maximum value from the ILI measurements in defect detecting, locating, identifying, sizing and verifying the results of ILI is presented. The outlined approach is being implemented as a software component of a multi-purpose HR MFL ILI tool and is proposed for the new API 1163 ILI Qualification Standard. (author)
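
    As an illustration of what two common metrics of this kind look like in practice (not the paper's full metric set), the sketch below computes a probability of detection (POD) and the fraction of depth calls within an assumed plus/minus 10% wall-thickness sizing tolerance from ILI-versus-field verification pairs; all numbers are invented.

```python
# Invented verification data: two basic ILI performance metrics.
field_defects = 48            # defects confirmed by field digs
ili_detected = 42             # of those, reported by the ILI tool
pod = ili_detected / field_defects

# (ILI depth, field depth) pairs in % of wall thickness for detected defects.
pairs = [(22, 25), (40, 34), (15, 18), (55, 61), (30, 29), (12, 24)]
tolerance = 10.0              # assumed +/-10 %wt sizing band
within = sum(1 for ili, field in pairs if abs(ili - field) <= tolerance)

print(f"POD = {pod:.2f}; depth calls within +/-{tolerance:.0f} %wt: "
      f"{within / len(pairs):.0%}")
```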

  7. Questioning context: a set of interdisciplinary questions for investigating contextual factors affecting health decision making

    Science.gov (United States)

    Charise, Andrea; Witteman, Holly; Whyte, Sarah; Sutton, Erica J.; Bender, Jacqueline L.; Massimi, Michael; Stephens, Lindsay; Evans, Joshua; Logie, Carmen; Mirza, Raza M.; Elf, Marie

    2011-01-01

    Objective: To combine insights from multiple disciplines into a set of questions that can be used to investigate contextual factors affecting health decision making. Background: Decision-making processes and outcomes may be shaped by a range of non-medical or 'contextual' factors particular to an individual, including social, economic, political, geographical and institutional conditions. Research concerning contextual factors occurs across many disciplines and theoretical domains, but few conceptual tools have attempted to integrate and translate this wide-ranging research for health decision-making purposes. Methods: To formulate this tool we employed an iterative, collaborative process of scenario development and question generation. Five hypothetical health decision-making scenarios (preventative, screening, curative, supportive and palliative) were developed and used to generate a set of exploratory questions that aim to highlight potential contextual factors across a range of health decisions. Findings: We present an exploratory tool consisting of questions organized into four thematic domains - Bodies, Technologies, Place and Work (BTPW) - articulating wide-ranging contextual factors relevant to health decision making. The BTPW tool encompasses health-related scholarship and research from a range of disciplines pertinent to health decision making, and identifies concrete points of intersection between its four thematic domains. Examples of the practical application of the questions are also provided. Conclusions: These exploratory questions provide an interdisciplinary toolkit for identifying the complex contextual factors affecting decision making. The set of questions comprised by the BTPW tool may be applied wholly or partially in the context of clinical practice, policy development and health-related research. PMID:21029277

  8. The effectiveness of environment assessment tools to guide refurbishment of Australian residential aged care facilities: A systematic review.

    Science.gov (United States)

    Neylon, Samantha; Bulsara, Caroline; Hill, Anne-Marie

    2017-06-01

    To determine applicability of environment assessment tools in guiding minor refurbishments of Australian residential aged care facilities. Studies conducted in residential aged care settings using assessment tools which address the physical environment were eligible for inclusion in a systematic review. Given these studies are limited, tools which have not yet been utilised in research settings were also included. Tools were analysed using a critical appraisal screen. Forty-three publications met the inclusion criteria. Ten environment assessment tools were identified, of which four addressed all seven minor refurbishment domains of lighting, colour and contrast, sound, flooring, furniture, signage and way finding. Only one had undergone reliability and validity testing. There are four tools which may be suitable to use for minor refurbishment of Australian residential aged care facilities. Data on their reliability, validity and quality are limited. © 2017 AJA Inc.

  9. Tools for Understanding Identity

    Energy Technology Data Exchange (ETDEWEB)

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael; Hodges, Duncan; Kim, Dee DH; Love, Oriana J.; Nurse, Jason R.; Pike, William A.; Scholtz, Jean

    2013-12-28

    Identity attribution and enrichment is critical to many aspects of law enforcement and intelligence gathering; this identity typically spans a number of domains in the natural world, such as biographic information (factual information, e.g. names, addresses), biometric information (e.g. fingerprints) and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber-world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface exposed. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use-case would be the inference of real names from usernames; the 'path' created by inferring new elements of identity is highlighted in the 'critical information' panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively, the entire realizable 'capability' can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper: the first is a working prototype; the second is expected to reach prototype by July 2013. Understanding the Model: the tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take
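
    As a toy rendering of the directed-graph model described above (the element names and inference rules are invented for illustration), attribution paths can be enumerated with networkx:

```python
# Toy identity-inference graph: nodes are identity elements, edges are
# inferences that enrich the holistic identity.
import networkx as nx

g = nx.DiGraph()
g.add_edge("forum username", "real name", rule="username-to-name heuristic")
g.add_edge("real name", "address", rule="public-records lookup")
g.add_edge("address", "telephone number", rule="directory lookup")
g.add_edge("forum username", "email address", rule="profile scrape")

# An attribution exercise is a path through the elements; enumerating paths
# shows which inferences matter most to an investigator (or an attacker).
for path in nx.all_simple_paths(g, "forum username", "telephone number"):
    print(" -> ".join(path))
```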

  10. PENGEMBANGAN PERANGKAT PEMBELAJARAN TEMATIK BERVISI SETS BERKARAKTER PEDULI LINGKUNGAN

    Directory of Open Access Journals (Sweden)

    Dwi Nur Heni

    2015-08-01

    Full Text Available The aim of this research was to develop thematic learning tools with a SETS vision and an environmental care character that are valid, effective, and practical. The study is of the research and development (R&D) type. The developed products comprise (1) a syllabus, (2) lesson plans, (3) student activity sheets, (4) teaching materials, and (5) an evaluation instrument. Validation by experts showed that the developed tools are feasible for use, and the tools met the effectiveness criteria. Student learning results in the experimental class were better than in the control class: a one-sample t-test gave t_count 5.96 > t_table 1.729, with the experimental class reaching an average test score of 82 against 65 in the control class, and an N-gain of 0.48 (moderate criterion). Teachers responded positively on 14 of the 16 question indicators, and the total student response score of 329 falls into the very good category. It can be concluded that the developed learning tools meet the criteria of being valid, effective and practical.

  11. Approximating the Pareto set of multiobjective linear programs via robust optimization

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.

    2012-01-01

    We consider problems with multiple linear objectives and linear constraints and use adjustable robust optimization and polynomial optimization as tools to approximate the Pareto set with polynomials of arbitrarily large degree. The main difference with existing techniques is that we optimize a
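
    The polynomial approximation itself cannot be reconstructed from this short record. For context only, the sketch below shows the classical weighted-sum scalarization baseline on a toy bi-objective LP; for an LP this recovers only extreme points of the Pareto set, which is one motivation for approximating the whole set instead.

```python
# Weighted-sum scalarization of a toy bi-objective LP (context, not the
# paper's method): minimize (x1, x2) subject to x1 + x2 >= 1, x >= 0.
import numpy as np
from scipy.optimize import linprog

c1, c2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A_ub, b_ub = np.array([[-1.0, -1.0]]), np.array([-1.0])   # x1 + x2 >= 1

for lam in np.linspace(0.05, 0.95, 5):
    res = linprog(lam * c1 + (1 - lam) * c2, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None), (0, None)])
    print(f"lambda = {lam:.2f} -> x = {np.round(res.x, 2)}")
```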

  12. An intelligent tool for activity data collection.

    Science.gov (United States)

    Sarkar, A M Jehad

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.

  13. Goal setting as an outcome measure: A systematic review.

    Science.gov (United States)

    Hurn, Jane; Kneebone, Ian; Cropley, Mark

    2006-09-01

    Goal achievement has been considered to be an important measure of outcome by clinicians working with patients in physical and neurological rehabilitation settings. This systematic review was undertaken to examine the reliability, validity and sensitivity of goal setting and goal attainment scaling approaches when used with working age and older people. To review the reliability, validity and sensitivity of both goal setting and goal attainment scaling when employed as an outcome measure within a physical and neurological working age and older person rehabilitation environment, by examining the research literature covering the 36 years since goal-setting theory was proposed. Data sources included a computer-aided literature search of published studies examining the reliability, validity and sensitivity of goal setting/goal attainment scaling, with further references sourced from articles obtained through this process. There is strong evidence for the reliability, validity and sensitivity of goal attainment scaling. Empirical support was found for the validity of goal setting but research demonstrating its reliability and sensitivity is limited. Goal attainment scaling appears to be a sound measure for use in physical rehabilitation settings with working age and older people. Further work needs to be carried out with goal setting to establish its reliability and sensitivity as a measurement tool.
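
    For reference, goal attainment scaling as reviewed here is usually scored with the Kiresuk and Sherman T-score, conventionally written (with attainment levels x_i in {-2, ..., +2}, goal weights w_i, and an assumed goal intercorrelation of 0.3) as:

```latex
T = 50 + \frac{10 \sum_i w_i x_i}{\sqrt{0.7 \sum_i w_i^2 + 0.3 \left( \sum_i w_i \right)^2}}
```

    A score of 50 thus means goals were, on average, attained exactly as expected, with each 10-point step corresponding to one standard deviation.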

  14. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft

    Energy Technology Data Exchange (ETDEWEB)

    HAYENGA, J.L.

    2006-12-19

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements.

  15. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft

    International Nuclear Information System (INIS)

    HAYENGA, J.L.

    2006-01-01

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements

  16. ColorTree: a batch customization tool for phylogenic trees.

    Science.gov (United States)

    Chen, Wei-Hua; Lercher, Martin J

    2009-07-31

    Genome sequencing projects and comparative genomics studies typically aim to trace the evolutionary history of large gene sets, often requiring human inspection of hundreds of phylogenetic trees. If trees are checked for compatibility with an explicit null hypothesis (e.g., the monophyly of certain groups), this daunting task is greatly facilitated by an appropriate coloring scheme. In this note, we introduce ColorTree, a simple yet powerful batch customization tool for phylogenetic trees. Based on pattern matching rules, ColorTree applies a set of customizations to an input tree file, e.g., coloring labels or branches. The customized trees are saved to an output file, which can then be viewed and further edited by Dendroscope (a freely available tree viewer). ColorTree runs on any Perl installation as a stand-alone command line tool, and its application can thus be easily automated. This way, hundreds of phylogenetic trees can be customized for easy visual inspection in a matter of minutes. ColorTree allows efficient and flexible visual customization of large tree sets through the application of a user-supplied configuration file to multiple tree files.
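
    ColorTree's own Perl configuration format is not reproduced here; the sketch below only mimics the underlying idea of pattern-matching rules that map taxon labels to colors.

```python
# Rule-based label coloring in the spirit of ColorTree (illustration only;
# the rules, labels and Newick string are invented).
import re

rules = [
    (re.compile(r"^Escherichia"), "red"),
    (re.compile(r"^Bacillus"), "blue"),
]

def color_for(label, default="black"):
    for pattern, color in rules:
        if pattern.search(label):
            return color
    return default

newick = "((Escherichia_coli,Bacillus_subtilis),Homo_sapiens);"
for leaf in re.findall(r"[A-Za-z_]+", newick):
    print(f"{leaf}: {color_for(leaf)}")
```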

  17. A tool for conditions tag management in ATLAS

    International Nuclear Information System (INIS)

    Sharmazanashvili, A; Batiashvili, G; Gvaberidze, G; Shekriladze, L; Formica, A

    2014-01-01

    ATLAS Conditions data include about 2 TB in a relational database and 400 GB of files referenced from the database. Conditions data is entered and retrieved using COOL, the API for accessing data in the LCG Conditions Database infrastructure. It is managed using an ATLAS-customized python based tool set. Conditions data are required for every reconstruction and simulation job, so access to them is crucial for all aspects of ATLAS data taking and analysis, as well as for preceding tasks to derive optimal corrections to reconstruction. Optimized sets of conditions for processing are accomplished using strict version control on those conditions: a process which assigns COOL Tags to sets of conditions, and then unifies those conditions over data-taking intervals into a COOL Global Tag. This Global Tag identifies the set of conditions used to process data so that the underlying conditions can be uniquely identified with 100% reproducibility should the processing be executed again. Understanding shifts in the underlying conditions from one tag to another and ensuring interval completeness for all detectors for a set of runs to be processed is a complex task, requiring tools beyond the above mentioned python utilities. Therefore, a JavaScript/PHP based utility called the Conditions Tag Browser (CTB) has been developed. CTB gives detector and conditions experts the possibility to navigate through the different databases and COOL folders; explore the content of given tags and the differences between them, as well as their extent in time; and visualize the content of channels associated with leaf tags. This report describes the structure and the PHP/JavaScript classes and functions of the CTB.

  18. Processes, Performance Drivers and ICT Tools in Human Resources Management

    OpenAIRE

    Oškrdal Václav; Pavlíček Antonín; Jelínková Petra

    2011-01-01

    This article presents an insight to processes, performance drivers and ICT tools in human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management as well as the relationship between HR business processes, performance drivers and ICT tools are defined. The theoretical outcomes ...

  19. Quack: A quality assurance tool for high throughput sequence data.

    Science.gov (United States)

    Thrash, Adam; Arick, Mark; Peterson, Daniel G

    2018-05-01

    The quality of data generated by high-throughput DNA sequencing tools must be rapidly assessed in order to determine how useful the data may be in making biological discoveries; higher quality data leads to more confident results and conclusions. Due to the ever-increasing size of data sets and the importance of rapid quality assessment, tools that analyze sequencing data should quickly produce easily interpretable graphics. Quack addresses these issues by generating information-dense visualizations from FASTQ files at a speed far surpassing other publicly available quality assurance tools in a manner independent of sequencing technology. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
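
    Quack itself is a compiled tool; purely as an illustration of the kind of summary it visualizes, the sketch below computes mean per-position Phred quality from a FASTQ file, assuming Phred+33 encoding (the file path is hypothetical).

```python
# Mean per-position Phred quality from a FASTQ file (Phred+33 assumed).
def mean_quality_per_position(fastq_path):
    totals, counts = [], []
    with open(fastq_path) as fh:
        for i, line in enumerate(fh):
            if i % 4 == 3:                    # every 4th line holds qualities
                for pos, ch in enumerate(line.rstrip("\n")):
                    q = ord(ch) - 33          # Phred+33 offset
                    if pos == len(totals):
                        totals.append(0)
                        counts.append(0)
                    totals[pos] += q
                    counts[pos] += 1
    return [t / c for t, c in zip(totals, counts)]

# Example (hypothetical path):
# print(mean_quality_per_position("reads.fastq")[:10])
```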

  20. Tools for man-machine interface development in accelerator control applications

    International Nuclear Information System (INIS)

    Kopylov, L.; Mikhev, M.; Trofimov, N.; Yurpalov, V.

    1994-01-01

    For the UNK Project, development of accelerator control applications is in progress. These applications will use a specific Graphical User Interface for data presentation and accelerator parameter management. A number of tools have been developed based on the Motif Tool Kit. They contain a set of problem oriented screen templates and libraries. Using these tools, full scale prototype applications for UNK tune and orbit measurement and correction were developed and are described as examples. A subset of these tools allows the creation of synoptic control screens from AutoCAD picture files and Oracle DB equipment descriptions. The basic concepts and a few application examples are presented. ((orig.))