WorldWideScience

Sample records for saphire tool set

  1. The atmosphere simulation chamber SAPHIR: a tool for the investigation of photochemistry.

    Science.gov (United States)

    Brauers, T.; Bohn, B.; Johnen, F.-J.; Rohrer, R.; Rodriguez Bares, S.; Tillmann, R.; Wahner, A.

    2003-04-01

    On the campus of the Forschungszentrum Jülich we constructed SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber), which was completed in fall 2001. The chamber consists of a 280 m³ double-wall Teflon bag of cylindrical shape held by a steel frame. Typically 75% of the outside actinic flux (290–420 nm) is available inside the chamber. A louvre system allows switching between full sunlight and darkness within 40 s, giving the opportunity to study relaxation processes of the photochemical system. The SAPHIR chamber is equipped with a comprehensive set of sensitive instruments, including measurements of OH, HO₂, CO, hydrocarbons, aldehydes, nitrogen oxides, and solar radiation. Moreover, the modular concept of SAPHIR allows fast and flexible integration of new instruments and techniques. In this paper we show the unique and new features of the SAPHIR chamber, namely the clean air supply and the high-purity water vapor supply, which make a wide range of trace gas concentrations accessible in experiments. We also present examples from the first year of SAPHIR experiments, showing the scope of application from high-quality instrument inter-comparison and kinetic studies to the simulation of complex mixtures of trace gases at ambient concentrations.

  2. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE that is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  3. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE), Version 5.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Hoffman, C.L.

    1995-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Graphical Evaluation Module (GEM) is a special application tool designed for evaluation of operational occurrences using the Accident Sequence Precursor (ASP) program methods. GEM provides the capability for an analyst to quickly and easily perform conditional core damage probability (CCDP) calculations. The analyst can then use the CCDP calculations to determine if the occurrence of an initiating event or a condition adversely impacts safety. It uses models and data developed in SAPHIRE specifically for the ASP program. GEM requires more data than is normally provided in SAPHIRE and will not perform properly with other models or databases. This is the first release of GEM, and the developers of GEM welcome user comments and feedback that will generate ideas for improvements to future versions. GEM is designated as Version 5.0 so that the GEM code is tracked along with the other SAPHIRE codes, since GEM relies on the same shared database structure.
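
    A CCDP calculation of the kind GEM automates can be illustrated with a minimal sketch (all event names and probabilities below are hypothetical, and this is not GEM's actual code): the observed initiating event's probability is set to 1.0 and the core damage cut sets are requantified.

      # Hedged sketch of a CCDP-style calculation. Cut sets are lists of
      # basic events; conditioning on an observed initiating event sets its
      # probability to 1.0 before requantifying. Names and values invented.

      def cutset_prob(cutset, p):
          """Probability of one minimal cut set (independent basic events)."""
          prob = 1.0
          for event in cutset:
              prob *= p[event]
          return prob

      def min_cut_upper_bound(cutsets, p):
          """Top-event probability: 1 - prod_i (1 - P(C_i))."""
          q = 1.0
          for cs in cutsets:
              q *= 1.0 - cutset_prob(cs, p)
          return 1.0 - q

      # Toy core-damage cut sets: IE is the initiating event, the other
      # symbols stand for mitigating-system failures.
      cutsets = [["IE", "AFW", "FB"], ["IE", "DG1", "DG2"]]
      p = {"IE": 1e-3, "AFW": 2e-3, "FB": 5e-2, "DG1": 1e-2, "DG2": 1e-2}

      cdp = min_cut_upper_bound(cutsets, p)                 # nominal risk
      ccdp = min_cut_upper_bound(cutsets, dict(p, IE=1.0))  # event observed
      print(f"CDP = {cdp:.2e}, CCDP = {ccdp:.2e}")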

  4. SAPHIR, how it ended

    International Nuclear Information System (INIS)

    Brogli, R.; Hammer, J.; Wiezel, L.; Christen, R.; Heyck, H.; Lehmann, E.

    1995-01-01

    On May 16th, 1994, PSI decided to discontinue its efforts to retrofit the SAPHIR reactor for operation at 10 MW. This decision was made because the effort and time for the retrofit work in progress had proven to be more complex than was anticipated. In view of the start-up of the new spallation-neutron source SINQ in 1996, the useful operating time between the eventual restart of SAPHIR and the start-up of SINQ became less than two years, which was regarded by PSI as too short a period to warrant the large retrofit effort. Following the decision of PSI not to re-use SAPHIR as a neutron source, several options for the further utilization of the facility were open. However, none of them appeared promising in comparison with other possibilities; it was therefore decided that SAPHIR should be decommissioned. A concerted effort was initiated to consolidate the nuclear and conventional safety for the post-operational period. (author) 3 figs., 3 tab

  5. SAPHIRE 8 Volume 2 - Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood; W. J. Galyean; J. A. Schroeder; M. B. Sattison

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). Herein, information is provided on the principles used in the construction and operation of Version 8.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, Workspace algorithms, cut set "recovery," end state manipulation, and use of "compound events."
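
    The uncertainty analysis mentioned above can be made concrete with a small Monte Carlo sketch: basic event probabilities are drawn from lognormal distributions and propagated through the minimal cut set upper bound. The cut sets, medians, and error factor are invented for illustration; this is not SAPHIRE's internal code.

      # Hedged sketch of Monte Carlo uncertainty propagation over cut sets.
      import math
      import random

      def top_event_prob(cutsets, p):
          # minimal cut set upper bound: 1 - prod_i (1 - P(C_i))
          q = 1.0
          for cs in cutsets:
              prod = 1.0
              for e in cs:
                  prod *= p[e]
              q *= 1.0 - prod
          return 1.0 - q

      cutsets = [["A", "B"], ["C"]]
      medians = {"A": 1e-2, "B": 3e-3, "C": 1e-4}  # hypothetical medians
      ef = 3.0                                     # lognormal error factor
      sigma = math.log(ef) / 1.645                 # EF = 95th pct / median

      samples = sorted(
          top_event_prob(
              cutsets,
              {e: m * math.exp(random.gauss(0.0, sigma))
               for e, m in medians.items()})
          for _ in range(10000))

      print("mean =", sum(samples) / len(samples))
      print("approx. 5th/95th percentiles =", samples[500], samples[9500])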

  6. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that the NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. The previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.

  7. SAPHIRE6.64, System Analysis Programs for Hands-on Integrated Reliability

    International Nuclear Information System (INIS)

    2001-01-01

    1 - Description of program or function: SAPHIRE is a collection of programs developed for the purpose of performing those functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA), primarily for nuclear power plants. The programs included in this suite are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D) system, and the Fault tree, Event tree and P&ID (FEP) editors. Previously these programs were released as separate packages. These programs include functions to allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included in this program are features to allow the analyst to generate reports and displays that can be used to document the results of an analysis. Since this software is a very detailed technical tool, the user of this program should be familiar with PRA concepts and the methods used to perform these analyses. 2 - Methods: SAPHIRE is written in MODULA-2 and uses an integrated commercial graphics package to interactively construct and edit fault trees. The fault tree solving methods used are industry-recognized top-down algorithms. For quantification, the program uses standard methods to propagate the failure information through the generated cut sets. SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE which automates the process for evaluating operational events at commercial nuclear power plants. Using GEM an analyst can estimate the risk associated with operational events (that is, perform a Level 1, Level 2, and Level 3 analysis for operational events) in a very efficient and expeditious manner. This on-line reference guide will
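
    The top-down solving approach can be sketched compactly (a MOCUS-style expansion; SAPHIRE's production algorithms are considerably more elaborate, and the gate and event names here are hypothetical): OR gates take the union of their children's cut sets, AND gates take cross-products, and non-minimal sets are discarded.

      # Hedged sketch of top-down minimal cut set generation for a small
      # fault tree. Basic events are any names not defined as gates.

      def expand(gate, tree):
          """Return the minimal cut sets (frozensets of basic events)."""
          kind, children = tree[gate]
          child_sets = [expand(c, tree) if c in tree else [frozenset([c])]
                        for c in children]
          if kind == "OR":      # union of the children's cut set lists
              result = [cs for sets in child_sets for cs in sets]
          else:                 # AND: cross-product of the cut set lists
              result = [frozenset()]
              for sets in child_sets:
                  result = [a | b for a in result for b in sets]
          # discard non-minimal cut sets (proper supersets of another set)
          return [cs for cs in result
                  if not any(other < cs for other in result)]

      tree = {
          "TOP": ("AND", ["G1", "G2"]),
          "G1":  ("OR",  ["A", "B"]),
          "G2":  ("OR",  ["B", "C"]),
      }
      print(expand("TOP", tree))  # [{'A','C'}, {'B'}] up to ordering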

  8. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE)

    International Nuclear Information System (INIS)

    C. L. Smith

    2006-01-01

    available in SAPHIRE and presents general instructions for using the software. Section 1 presents SAPHIRE's historical evolution and summarizes its capabilities. Section 2 presents instructions for installing and using the code. Section 3 explains the database structure used in SAPHIRE and discusses database concepts. Section 4 explains how PRA data (event frequencies, human error probabilities, etc.) can be generated and manipulated using "change sets". Section 5 deals with fault tree operations, including constructing, editing, solving, and displaying results. Section 6 presents operations associated with event trees, including rule application for event tree linking, partitioning, and editing sequences. Section 7 presents how accident sequences are generated, solved, quantified, and analyzed. Section 8 discusses the functions available for performing end state analysis. Section 9 explains how to modify data stored in a SAPHIRE database. Section 10 illustrates how to generate and customize reports. Section 11 covers SAPHIRE utility options to perform routine functions such as defining constant values, recovering databases, and loading data from external sources. Section 12 provides an overview of GEM's features and capabilities. Finally, Section 13 summarizes SAPHIRE's quality assurance process.

  9. SAPHIRE 8 Volume 1 - Overview and Summary

    International Nuclear Information System (INIS)

    Smith, C.L.; Wood, S.T.

    2011-01-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system's response to initiating events and quantify associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE 8 can identify important contributors to core damage (Level 1 PRA) and containment failure during a severe accident which leads to releases (Level 2 PRA). It can be used for a PRA where the reactor is at full power, low power, or at shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for managing models such as flooding and fire. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and environment (Level 3 PRA). In SAPHIRE 8, the act of creating a model has been separated from the analysis of that model in order to improve the quality of both the model (e.g., by avoiding inadvertent changes) and the analysis. Consequently, in SAPHIRE 8, the analysis of models is performed by using what are called Workspaces. Currently, there are Workspaces for three types of analyses: (1) the NRC's Accident Sequence Precursor program, where the workspace is called 'Events and Condition Assessment (ECA);' (2) the NRC's Significance Determination Process (SDP); and

  10. SAPHIRE 8 Volume 6 - Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 8 is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows™ operating system. SAPHIRE 8 is funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 8, what constitutes its parts, and limitations of those processes. In addition, this document describes the Independent Verification and Validation that was conducted for Version 8 as part of an overall QA process.

  11. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The development of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability in the 1990s and the onset of the risk-informed regulation era.

  12. Strangeness photoproduction with the SAPHIR-detector

    International Nuclear Information System (INIS)

    Merkel, H.

    1993-12-01

    At the ELSA facility in Bonn a photon beam with a high duty cycle is available up to energies of 3.3 GeV. In this energy range the large solid angle detector SAPHIR enables the investigation of strangeness photoproduction starting from threshold. SAPHIR has already produced results for the reactions γ + p → K⁺ + Λ and γ + p → K⁺ + Σ⁰. This work investigates the possibilities of measuring the related reactions γ + n → K⁰ + Λ and γ + n → K⁰ + Σ⁰ on a deuteron target and the reaction γ + p → K⁰ + Σ⁺ on a proton target. For the first time the Σ⁺ polarisation has been measured. With a cross section about 10 times smaller than that of the kaon-hyperon reactions, the photoproduction of the Φ(1020) meson can also be investigated with the SAPHIR detector. First reconstructed events are shown. (orig.)

  13. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  14. Assimilation of SAPHIR radiance: impact on hyperspectral radiances in 4D-VAR

    Science.gov (United States)

    Indira Rani, S.; Doherty, Amy; Atkinson, Nigel; Bell, William; Newman, Stuart; Renshaw, Richard; George, John P.; Rajagopal, E. N.

    2016-04-01

    Assimilation of a new observation data set in an NWP system may affect the quality of an existing observation data set against the model background (short forecast), which in turn influences the use of the existing observations in the NWP system. The effect of one data set on the use of another can be quantified as positive, negative, or neutral. The impact of adding a new data set is defined as positive if the number of assimilated observations of an existing observation type increases and the bias and standard deviation decrease compared to the control experiment (without the new data set). Recently a new data set, Megha-Tropiques SAPHIR radiances, which provide atmospheric humidity information, was added to the Unified Model 4D-VAR assimilation system. In this paper we discuss the impact of SAPHIR on the assimilation of hyperspectral radiances from AIRS, IASI and CrIS. Though SAPHIR is a microwave instrument, its impact can be clearly seen in the use of hyperspectral radiances in the 4D-VAR data assimilation system, in addition to other microwave and infrared observations. SAPHIR assimilation decreased the standard deviation of the spectral channels with wavenumbers from 650 to 1600 cm⁻¹ in all three hyperspectral sounders. A similar impact on the hyperspectral radiances can be seen from the assimilation of other microwave radiances, such as those from AMSR2 and the SSMIS imager.
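
    The impact metric used here reduces to simple statistics of observation-minus-background (O-B) departures. A minimal sketch with invented departure values, comparing a control run against a run that also assimilates SAPHIR:

      # Hedged sketch of the O-B comparison; the numbers are made up.
      import statistics

      ob_control = [0.6, -0.2, 0.9, 0.4, -0.5, 0.8]  # K, hypothetical
      ob_saphir = [0.3, -0.1, 0.5, 0.2, -0.3, 0.4]   # K, hypothetical

      for name, dep in [("control", ob_control), ("with SAPHIR", ob_saphir)]:
          print(f"{name}: bias = {statistics.mean(dep):+.2f} K, "
                f"std = {statistics.stdev(dep):.2f} K")
      # Positive impact: bias closer to zero and a smaller standard
      # deviation in the run that assimilates SAPHIR.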

  15. SAPHIR, a simulator for engineering and training on N4-type nuclear power plants

    International Nuclear Information System (INIS)

    Vovan, C.

    1999-01-01

    SAPHIR, the new simulator developed by FRAMATOME, has been designed to be a convenient tool for engineering and training for different types of nuclear power plants. Its first application is to the French 'N4' four-loop 1500 MWe PWR. The basic features of SAPHIR are: (1) use of advanced codes for modelling the primary and secondary systems, including an axial steam generator model; (2) use of a simulation workshop containing different tools for modelling fluid, electrical, and instrument and control networks; (3) a Man-Machine Interface designed for easy and user-friendly operation, which can simulate the different computerized control consoles of the 'N4' control room. This paper outlines the features and capabilities of this tool, both for engineering and training purposes. (author)

  16. Design and construction of the SAPHIR detector

    International Nuclear Information System (INIS)

    Schwille, W.J.; Bockhorst, M.; Burbach, G.; Burgwinkel, R.; Empt, J.; Guse, B.; Haas, K.M.; Hannappel, J.; Heinloth, K.; Hey, T.; Honscheid, K.; Jahnen, T.; Jakob, H.P.; Joepen, N.; Juengst, H.; Kirch, U.; Klein, F.J.; Kostrewa, D.; Lindemann, L.; Link, J.; Manns, J.; Menze, D.; Merkel, H.; Merkel, R.; Neuerburg, W.; Paul, E.; Ploetzke, R.; Schenk, U.; Schmidt, S.; Scholmann, J.; Schuetz, P.; Schultz-Coulon, H.C.; Schweitzer, M.; Tran, M.Q.; Vogl, W.; Wedemeyer, R.; Wehnes, F.; Wisskirchen, J.; Wolf, A.

    1994-01-01

    The design, construction, and performance of the large solid angle magnetic spectrometer SAPHIR are described. It was built for the investigation of photon-induced reactions on nucleons and light nuclei with multi-particle final states up to photon energies of 3.1 GeV. The detector is equipped with a tagged photon beam facility and is operated at the stretcher ring ELSA in Bonn. (orig.)

  17. Design and construction of the SAPHIR detector

    Energy Technology Data Exchange (ETDEWEB)

    Schwille, W.J.; Bockhorst, M.; Burbach, G.; Burgwinkel, R.; Empt, J.; Guse, B.; Haas, K.M.; Hannappel, J.; Heinloth, K.; Hey, T.; Honscheid, K.; Jahnen, T.; Jakob, H.P.; Joepen, N.; Juengst, H.; Kirch, U.; Klein, F.J. (all Bonn Univ. (Germany). Physikalisches Inst.)

    1994-05-15

    The design, construction, and performance of the large solid angle magnetic spectrometer SAPHIR are described. It was built for the investigation of photon-induced reactions on nucleons and light nuclei with multi-particle final states up to photon energies of 3.1 GeV. The detector is equipped with a tagged photon beam facility and is operated at the stretcher ring ELSA in Bonn. (orig.)

  18. The capabilities and applications of the SAPHIRE 5.0 safety assessment software

    International Nuclear Information System (INIS)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J.

    1994-01-01

    The System Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. The programs in this suite include: Models and Results Database (MAR-D) software, Integrated Reliability and Risk Analysis System (IRRAS) software, System Analysis and Risk Assessment (SARA) software, and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. Each of these programs performs a specific function in taking a PRA from the conceptual state all the way to publication. This paper provides an overview of the features and capabilities provided in version 5.0 of this software system. Some major new features include the ability to store unlimited cut sets, to perform location transformations, to perform seismic analysis, to perform automated rule-based recovery analysis and end state cut set partitioning, and to perform end state analysis, as well as a new alphanumeric fault tree editor and a new alphanumeric event tree editor. Many enhancements and improvements to the user interface, as well as a significant reduction in the time required to perform an analysis, are included in version 5.0. These new features and capabilities provide a powerful set of PC-based PRA analysis tools.

  19. The alarm system of the SAPHIR detector

    International Nuclear Information System (INIS)

    Schultz-Coulon, H.C.

    1993-06-01

    In order to obtain effective control of the different detector components, an alarm system was built and implemented in the data acquisition system of the SAPHIR experiment. It provides an easy way of signalling errors through either appropriate library calls or a hardware signal, both leading to an active alarm. This makes it possible to react directly to any error detected by one of the specific control systems. In addition, for selected kinds of errors the data run can be stopped automatically. The concept and construction of this system are described and some examples of its application are given. (orig.)
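
    The pattern described, where a library call raises an active alarm and selected error classes stop the data run automatically, can be sketched as follows (component names and severity classes are hypothetical; the original system was not written in Python):

      # Hedged sketch of an alarm-raising library call with automatic
      # run stop for fatal error classes. All names are invented.
      FATAL_CLASSES = {"HV_TRIP", "DAQ_DESYNC"}
      run_active = True

      def raise_alarm(component, error_class, message):
          """Library call a detector component uses to signal an error."""
          global run_active
          print(f"ALARM [{component}] {error_class}: {message}")
          if error_class in FATAL_CLASSES:
              run_active = False  # stop the data run automatically
              print("data run stopped")

      raise_alarm("drift chamber", "HV_TRIP", "sense-wire voltage dropped")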

  20. Development of a model-independent evaluation of photon-deuteron reactions for the SAPHIR detector

    International Nuclear Information System (INIS)

    Wolf, A.

    1993-01-01

    The SAPHIR detector measures photon-induced reactions with many particles in the final state. Thus a detailed investigation of those processes at photon energies between 0.4 and 3.3 GeV is possible. The interpretation of the distribution of the sample of events that SAPHIR is able to reconstruct has to be done after correcting for influences induced by the detector acceptance. In this work a model-independent method of correcting and analysing the data is discussed. The implementation of the basic tools of this analysis is described, and first tests with simulated and real events are performed. SAPHIR uses a time-of-flight system for the identification of particles. This work describes the structure of a program library which supports an easy way of decoding the digitizations of this system (including calibration of the hardware) and obtaining the flight time for a particle in an event. The necessary steps for calibrating the system are outlined, too. (orig.)

  1. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  2. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave ...

    Indian Academy of Sciences (India)

    used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. ... Keywords: all-sky radiance simulation; Megha-Tropiques; microwave SAPHIR sensor; radiative transfer; data ... versions of these non-linear processes (Ohring and ...).

  3. SAPHIRE 8 Volume 3 - Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. Vedros; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system's response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA), and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). This reference guide will introduce the SAPHIRE Version 8.0 software. A brief discussion of the purpose and history of the software is included along with general information such as installation instructions, starting and stopping the program, and some pointers on how to get around inside the program. Next, database concepts and structure are discussed. Following that discussion are nine sections, one for each of the menu options on the SAPHIRE main menu, wherein the purpose and general capabilities for each option are described.

  4. Nucleonic calculations for possible irradiation experiments in SAPHIR

    International Nuclear Information System (INIS)

    Caro, M.; Pelloni, S.

    1990-01-01

    Accurate two-dimensional calculations show that a 'neutronic environment' exists in the SAPHIR reactor at the Paul Scherrer Institute (PSI) suitable for simulating the inner surface of a given trepan of the Gundremmingen reactor. Neutron fluences and DPA rates were calculated at two positions in SAPHIR using modern codes and nuclear data (from JEF-1). A particular region of the reactor can be found in which fluences and DPA rates agree to within a few percent with the Gundremmingen reference case. (author) 13 figs., 4 tabs., 18 refs

  5. SAPhIR: a fission-fragment detector

    International Nuclear Information System (INIS)

    Theisen, Ch.; Gautherin, C.; Houry, M.; Korten, W.; Le Coz, Y.; Lucas, R.; Barreau, G.; Doan, T. P.; Belier, G.; Meot, V.; Ethvignot, Th.; Cahan, B.; Le Coguie, A.; Coppolani, X.; Delaitre, B.; Le Bourlout, P.; Legou, Ph.; Maillard, O.; Durand, G.; Bouillac, A.

    1998-01-01

    SAPhIR is the acronym for Saclay Aquitaine Photovoltaic cells for Isomer Research. It consists of solar cells used for fission-fragment detection. It is a collaboration between three laboratories: CEA Saclay, CENBG Bordeaux and CEA Bruyeres le Chatel. The coupling of a highly efficient fission-fragment detector like SAPhIR with EUROBALL will provide new insights into the study of very deformed nuclear matter and into the spectroscopy of neutron-rich nuclei.

  6. The central drift chamber of the SAPHIR detector - implementation into the experiment and study of its properties

    International Nuclear Information System (INIS)

    Haas, K.M.

    1992-01-01

    At the Bonn accelerator facility ELSA the large solid angle detector SAPHIR was built for the investigation of photon-induced reactions. A main component of SAPHIR is the central drift chamber (CDC), matching the magnet gap of 1 m³. The 1828 hexagonal drift cells each have a diameter of about 18 mm. The subject of this paper is the implementation of the CDC in the experiment. The description of the hardware is followed by a presentation of the software tools for filtering and monitoring the data, which have been developed and tested. An algorithm for extracting the space-time relationship is presented. The properties of the chamber with an improved gas mixture (helium/neon/isobutane 21.25:63.75:15) have been investigated. A spatial resolution of about 200 μm was achieved. The efficiency of the chamber is 97% at a tagged photon rate of 5×10⁴ per second crossing the chamber. (orig.)

  7. A new plant chamber facility PLUS coupled to the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2015-11-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOC) can be studied in detail. In PLUS all important environmental parameters (e.g. temperature, PAR, soil relative humidity) are well controlled. The gas exchange volume of 9.32 m³, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed only to FEP Teflon film and other Teflon surfaces, to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 LED panels, which have an emission strength of up to 800 μmol m⁻² s⁻¹. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and the transfer rate of volatile organic compounds (VOC) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment on BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  8. The trigger and data acquisition system of the SAPHIR detector

    International Nuclear Information System (INIS)

    Honscheid, K.

    1988-10-01

    At present SAPHIR, a new experimental facility for medium energy physics, is under construction at the Bonn electron accelerator ELSA (energy ≤ 3.5 GeV, duty cycle ≅ 100%). SAPHIR combines a large solid angle coverage with a tagging system and is therefore suited to investigate reactions with multi-particle final states. The structure and function of the multi-stage trigger system used to select such processes are described in this paper. With this system the trigger decision can be based on the number of charged particles as well as on the number of neutral particles detected. Several VMEbus modules have been developed, using memory look-up tables to make fast trigger decisions possible. In order to determine the number of neutral particles from the cluster distribution in the electromagnetic calorimeter, some ideas from cellular automata had to be added. The system has a modular structure, so it can easily be extended. In the second part of this thesis the SAPHIR data acquisition system is discussed. It consists of a multiprocessor system with the VIP microcomputer as the central element. The VIP is a VMEbus module optimized for a multiprocessor environment. Its description, as well as that of the other VMEbus boards developed for the SAPHIR online system, can be found in this paper. As a basis for software development the operating system SOS is supplied. With SOS it is possible to write programs independent of the actual hardware configuration, so the complicated multiprocessor environment is hidden. To the user the system looks like a simple multi-tasking system. SOS is not restricted to the VIPs but can also be installed on computers of the VAX family, so that efficient mixed configurations are possible. The SAPHIR online system, based on the VIP microcomputer and the SOS operating system, is presented in the last part of this paper. This includes the read-out system, the monitoring of the different components, etc. (orig./HSI)
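
    The memory look-up-table idea mentioned above can be sketched briefly: every possible hit pattern is classified once, offline, so that the online trigger decision is a single indexed memory read. The counter count and threshold below are invented.

      # Hedged sketch of a look-up-table trigger decision.
      N_COUNTERS = 8
      MIN_CHARGED = 2   # accept events with at least 2 charged tracks

      # Offline: fill the LUT once (2**8 entries), here by counting the
      # set bits of each possible hit pattern.
      lut = [bin(pattern).count("1") >= MIN_CHARGED
             for pattern in range(1 << N_COUNTERS)]

      # Online: one table access per event.
      def trigger(hit_pattern: int) -> bool:
          return lut[hit_pattern]

      print(trigger(0b00010010))  # True: two counters fired
      print(trigger(0b00000100))  # False: only one counter fired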

  9. Verification and validation of the SAPHIRE Version 4.0 PRA software package

    International Nuclear Information System (INIS)

    Bolander, T.W.; Calley, M.B.; Capps, E.L.

    1994-02-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE). SAPHIRE is a set of four computer programs that the Nuclear Regulatory Commission (NRC) developed to perform probabilistic risk assessments (PRAs). These programs allow an analyst to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs included in this set are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models and Results Database (MAR-D), and the Fault Tree/Event Tree/Piping and Instrumentation Diagram (FEP) graphical editor. The V&V steps included: a V&V plan describing the process and criteria by which the V&V would be performed; a software requirements documentation review to determine the correctness, completeness, and traceability of the requirements; a user survey to determine the usefulness of the user documentation; identification and testing of vital and non-vital features; and documentation of the test results.

  10. The graphics system and the data saving for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Albold, D.

    1990-08-01

    Important extensions have been made to the data acquisition system SOS for the SAPHIR experiment at the Bonn ELSA facility. As support for the various online programs controlling components of the detector, a graphics system for presenting data was developed. It enables any program in the system to use all graphics devices. Its main component is a program serving requests for presentation on a 19-inch color monitor. Window technique allows the presentation of several graphics on one screen. Equipped with a trackball and using menus, this is an easy-to-use and powerful tool for controlling the experiment. Other important extensions concern data storage. A huge amount of event data can be stored on 8 mm cassettes by the program Eventsaver. This program can be controlled by a component of the SAPHIR online system SOL running on a VAX computer and using windows and menus. The smaller amount of data, containing parameters and programs, which should be accessible within a short period of time, can be stored on a magnetic disk. A program supporting a file structure for access to this disk is described. (orig./HSI)

  11. The scintillation counter system at the SAPHIR detector

    International Nuclear Information System (INIS)

    Bour, D.

    1989-10-01

    The scintillation counter system of the SAPHIR detector at the stretcher accelerator ELSA in Bonn consists of 64 counters. It supplies a fast hadronic trigger and is utilized for particle identification by time-of-flight measurements. Prototypes of the counters (340 × 21.25 × 6.0 cm³) were tested. The contribution to the time-of-flight resolution was measured to be σ = 125 ps, the effective light velocity 17.5 cm/ns, and the attenuation length 7.8 m. Pion-kaon separation by time of flight is possible up to a momentum of 1 GeV/c. With the first photon beam at SAPHIR the counters were tested, and first triggers were obtained and evaluated. (orig.)
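
    A back-of-the-envelope check of the quoted 1 GeV/c pion-kaon separation limit, assuming a 3 m flight path (the path length is an assumption, not a number from the abstract), against the σ = 125 ps timing resolution:

      # Hedged sketch: time-of-flight difference between pions and kaons.
      import math

      C = 0.299792458               # speed of light, m/ns
      M_PI, M_K = 0.13957, 0.49368  # masses, GeV/c^2
      L = 3.0                       # assumed flight path, m

      def tof(p, m):
          """Flight time (ns) at momentum p (GeV/c) for mass m (GeV/c^2)."""
          beta = p / math.hypot(p, m)
          return L / (beta * C)

      for p in (0.5, 1.0, 1.5):
          dt = (tof(p, M_K) - tof(p, M_PI)) * 1000.0  # ps
          print(f"p = {p} GeV/c: dt = {dt:.0f} ps = {dt / 125:.1f} sigma")
      # At 1 GeV/c the K-pi difference is still about 1 ns, several sigma
      # of the 125 ps resolution, consistent with the quoted limit.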

  12. Construction and calibration studies of the SAPHIR scintillation counters

    International Nuclear Information System (INIS)

    Kostrewa, D.

    1988-03-01

    For the scintillation counter system of the SAPHIR detector at the stretcher ring ELSA in Bonn, 50 time-of-flight counters and 12 trigger counters have been built. Each of them has two photomultipliers, one at each side. A laser calibration system with a pulsed nitrogen laser as the central light source has been optimized to monitor these photomultipliers. It was used to adjust the photomultipliers and to test their long- and short-term instabilities. (orig.)

  13. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool used to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions and adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, Version 5.0 contains new alphanumeric fault tree and event tree editors and rule-based editors used for event tree rules, recovery rules, and end state partitioning.

  14. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume is the reference manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. The SARA database contains PRA data primarily for the dominant accident sequences of a family and descriptive information about the family, including event trees, fault trees, and system model diagrams. The number of facility databases that can be accessed is limited only by the amount of disk storage available. To simulate changes to family systems, SARA users change the failure rates of initiating and basic events and/or modify the structure of the cut sets that make up the event trees, fault trees, and systems. The user then evaluates the effects of these changes through the recalculation of the resultant accident sequence probabilities and importance measures. The results are displayed in tables and graphs that may be printed for reports. A preliminary version of the SARA program was completed in August 1985 and has undergone several updates in response to user suggestions and to maintain compatibility with the other SAPHIRE programs. Version 5.0 of SARA provides the same capability as earlier versions and adds the ability to process unlimited cut sets; display fire, flood, and seismic data; and perform more powerful cut set editing.
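
    One common importance measure produced by such recalculations is the Fussell-Vesely importance, the fraction of the top-event probability carried by cut sets containing a given basic event. A minimal sketch under the rare-event approximation (cut sets and probabilities are invented; this is not SARA's internal code):

      # Hedged sketch of Fussell-Vesely importance from cut sets.
      def cutset_prob(cs, p):
          prob = 1.0
          for e in cs:
              prob *= p[e]
          return prob

      def fussell_vesely(event, cutsets, p):
          total = sum(cutset_prob(cs, p) for cs in cutsets)
          with_e = sum(cutset_prob(cs, p) for cs in cutsets if event in cs)
          return with_e / total

      cutsets = [{"A", "B"}, {"C"}, {"A", "C"}]
      p = {"A": 1e-2, "B": 5e-3, "C": 1e-4}
      for e in sorted(p):
          print(e, round(fussell_vesely(e, cutsets, p), 3))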

  15. Track finding and track reconstruction in the internal forward drift chamber of SAPHIR

    International Nuclear Information System (INIS)

    Umlauf, G.

    1993-03-01

    A track finding algorithm has been developed for the inner forward drift chamber of the SAPHIR detector (at ELSA in Bonn), using Principal Components Analysis as a tool for interpolating track coordinates. The drift chamber consists of twelve planar layers with six different inclinations and is operated in an inhomogeneous magnetic field. The task of track finding is basically split into a primary stage that defines track candidates without the use of drift-time information and a second stage that serves to verify the track candidates and to resolve the intrinsic left-right ambiguities of the drift chamber signals. Tracks with at most three missing signals can be found. (orig.)
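
    The interpolating role of the Principal Components Analysis can be sketched on synthetic data (the geometry and training tracks below are invented; the real chamber has six wire inclinations and an inhomogeneous field, which this toy ignores): dominant linear correlations between layer coordinates are learned from sample tracks and used to predict a missing layer.

      # Hedged sketch of PCA-based interpolation of track coordinates.
      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic "tracks": straight lines x = a + b*z at 12 layer positions.
      z = np.linspace(0.0, 1.1, 12)
      a = rng.uniform(-1.0, 1.0, size=500)
      b = rng.uniform(-0.5, 0.5, size=500)
      tracks = a[:, None] + b[:, None] * z[None, :]   # shape (500, 12)

      mean = tracks.mean(axis=0)
      _, _, vt = np.linalg.svd(tracks - mean, full_matrices=False)
      components = vt[:2]        # two dominant modes suffice for lines

      def interpolate(measured, known):
          """Fit the PCA modes to the known layers, predict all layers."""
          A = components[:, known].T
          coeffs, *_ = np.linalg.lstsq(A, measured - mean[known], rcond=None)
          return mean + coeffs @ components

      known = [i for i in range(12) if i != 5]   # layer 5 has no signal
      true = tracks[0]
      pred = interpolate(true[known], known)
      print("true x at layer 5:", true[5], "predicted:", pred[5])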

  16. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V&V) is an important step in assuring the reliability and quality of software. The verification of program source code forms an important part of the overall V&V activity. The static analysis tools described here are useful in the verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependencies between modules, etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation, and user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
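
    The unreachable-code check such an analyser performs amounts to a reachability search over the control-flow graph of basic blocks. A minimal sketch on a toy graph (block names are hypothetical; the real tools build the graph by parsing 8086/68000 assembly):

      # Hedged sketch of unreachable-code detection via CFG reachability.
      from collections import deque

      # block -> successor blocks (fall-through and jump targets)
      cfg = {
          "entry": ["init"],
          "init": ["loop"],
          "loop": ["loop", "done"],
          "done": [],
          "dead": ["loop"],   # has no predecessor: unreachable
      }

      def reachable(cfg, start="entry"):
          seen, queue = {start}, deque([start])
          while queue:
              for succ in cfg[queue.popleft()]:
                  if succ not in seen:
                      seen.add(succ)
                      queue.append(succ)
          return seen

      print("unreachable blocks:", set(cfg) - reachable(cfg))  # {'dead'}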

  17. Evaluating online diagnostic decision support tools for the clinical setting.

    Science.gov (United States)

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  18. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Versions 6 and 7, what constitutes its parts, and the limitations of those processes.

  19. The photon detection system of the SAPHIR spectrometer

    International Nuclear Information System (INIS)

    Joepen, N.

    1990-09-01

    Worldwide a new generation of electron accelerators with energies below 5 GeV and a high duty cycle of up to 100% is being built or planned. The first machine of this kind is ELSA, the Electron Stretcher and Accelerator, at the Physics Institute of Bonn University. Due to the high duty cycle of ELSA, experiments with tagged photon beams and a large angular acceptance become possible. At present SAPHIR, a new magnetic detector especially laid out to detect multi-particle final states with good accuracy, is going into operation. Besides a large arrangement of drift chambers for good momentum resolution and a trigger and time-of-flight counter system for particle identification, one of the main features of SAPHIR is a good photon detection capability. This is accomplished by a large electromagnetic calorimeter consisting of 98 modules covering a detection area of about 16 m² in the forward direction. For the calorimeter a brass-gas-sandwich detector was developed. Its signal wires are strung perpendicular to the converter planes. The chambers are filled with a standard gas mixture Ar/CH₄ (90:10) at atmospheric pressure and operated at a considerably high voltage in the semi-proportional mode. A sample of nine shower counter modules was tested at the electron test beam of the Bonn 2.5 GeV electron synchrotron. An energy resolution of σ(E)/E = (13.55 ± 0.6)%/√(E/GeV) for a single module was achieved. The incident angle of the electrons was varied between 0° and 45°. No significant change of energy resolution or linearity was observed. Combining the information from wire and cathode signals, a position resolution of σ_x = 15 mm at Φ = 0° and σ_x = 19 mm at Φ = 45° (at E = 1 GeV) was reached. The second part of this paper gives a description of the shower counter arrangement in the SAPHIR detector. It requires a sophisticated control and calibration system, whose details are presented. Furthermore, some aspects of the calorimeter calibration procedure are discussed.
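
    Evaluating the quoted stochastic term σ(E)/E = 13.55%/√(E/GeV) at a few photon energies makes the resolution concrete:

      # Hedged illustration of the quoted single-module energy resolution.
      import math

      for e in (0.5, 1.0, 2.0, 3.0):  # photon energy, GeV
          rel = 0.1355 / math.sqrt(e)
          print(f"E = {e} GeV: sigma(E)/E = {rel:.1%},"
                f" sigma = {rel * e * 1000:.0f} MeV")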

  20. Development of the software of the data taking system SOS for the SAPHIR experiment

    Energy Technology Data Exchange (ETDEWEB)

    Manns, J.

    1989-02-01

    The data acquisition system SOS has been developed for the SAPHIR experiment at the Bonn stretcher ring ELSA. It can handle up to 280 kilobytes of data per second or a maximum trigger rate of 200 Hz. The multiprocessor-based online system consists of twenty VIP microprocessors and two VAX computers. Each component of the SAPHIR experiment has at least one program in the online system to maintain special functions for that specific component. All of these programs can receive event data without interfering with the transfer of events to mass storage for offline analysis. A special program, SOL, has been developed to serve as a user interface to the data acquisition system and as a status display for most of the programs of the online system. Using modern features like windowing and mouse control on a VAX station, the SAPHIR online program SOL establishes an easy way of controlling the data acquisition system. (orig.)

  1. The muon trigger of the SAPHIR shower detector

    International Nuclear Information System (INIS)

    Rufeger-Hurek, H.

    1989-12-01

    The muon trigger system of the SAPHIR shower counter consists of 4 scintillation counters. The total trigger rate of cosmic muons is about 55 Hz, which is reduced to about 45 Hz by the selection algorithms. This rate of clean muon events allows simultaneous monitoring of the whole electronics system and calibration of the gas sandwich detector by measuring the gas gain. The dependence of the signals on the geometry has been simulated with the help of a Monte Carlo program. The comparison of simulated and measured pulse heights shows that faults in the electronics, as well as defects in the detector hardware (e.g., the HV system) or temperature effects, can be recognized at the level of a few percent. In addition, the muon signals are used to determine the calibration factor for each cathode channel individually. (orig.)

  2. Methods improvements incorporated into the SAPHIRE ASP models

    International Nuclear Information System (INIS)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.; Smith, C.L.; Rasmuson, D.M.

    1994-01-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methodology, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements

  3. Methods improvements incorporated into the SAPHIRE ASP models

    International Nuclear Information System (INIS)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.

    1995-01-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements

  4. Simulation of the beam guiding of the SAPHIR experiment by means of a differential-equation model

    Energy Technology Data Exchange (ETDEWEB)

    Greve, T.

    1991-08-01

    This paper presents the numerical simulation of a beam line by means of a differential-equation model, applied to the beam line from the Bonn Electron Stretcher Accelerator ELSA to the SAPHIR spectrometer. Furthermore, a method for calculating the initial values from measurements of beam profiles is discussed. (orig.)
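
    A differential-equation transport model of this general kind can be sketched by integrating the paraxial equation x''(s) = -k(s) x(s) through a toy drift-quadrupole-drift lattice. The lattice, strength, and initial values below are invented; in the original work the initial values come from measured beam profiles.

      # Hedged sketch of paraxial beam transport, x'' = -k(s) * x.
      def k(s):
          """Quadrupole focusing strength along the line (1/m^2)."""
          return 2.5 if 1.0 <= s <= 1.5 else 0.0  # one 0.5 m quad at s = 1 m

      def track(x0, xp0, s_end=4.0, ds=1e-3):
          """Integrate x'' = -k(s) x with a simple symplectic Euler step."""
          x, xp, s = x0, xp0, 0.0
          while s < s_end:
              xp -= k(s) * x * ds
              x += xp * ds
              s += ds
          return x, xp

      # Assumed initial conditions: 2 mm offset, 1 mrad slope.
      print(track(x0=0.002, xp0=0.001))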

  5. Mathematical tools for data mining set theory, partial orders, combinatorics

    CERN Document Server

    Simovici, Dan A

    2014-01-01

    Data mining essentially relies on several mathematical disciplines, many of which are presented in this second edition. Topics include partially ordered sets, combinatorics, general topology, metric spaces, linear spaces, and graph theory. To motivate the reader, a significant number of applications of these mathematical tools are included, ranging from association rules and clustering algorithms to classification, data constraints, and logical data analysis. The book is intended as a reference for researchers and graduate students. The current edition is a significant expansion of the first.

  6. Validation of the TRUST tool in a Greek perioperative setting.

    Science.gov (United States)

    Chatzea, Vasiliki-Eirini; Sifaki-Pistolla, Dimitra; Dey, Nilanjan; Melidoniotis, Evangelos

    2017-06-01

    The aim of this study was to translate, culturally adapt and validate the TRUST questionnaire in a Greek perioperative setting. The TRUST questionnaire assesses the relationship between trust and performance. The study assessed the levels of trust and performance in the surgery and anaesthesiology department during a very stressful period for Greece (economic crisis) and offered a user-friendly and robust assessment tool. The study concludes that the Greek version of the TRUST questionnaire is a reliable and valid instrument for measuring team performance among Greek perioperative teams. Copyright the Association for Perioperative Practice.

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, comprising a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide introduces the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included.

  8. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, comprising a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide introduces the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included.

  9. Workplace wellness using online learning tools in a healthcare setting.

    Science.gov (United States)

    Blake, Holly; Gartshore, Emily

    2016-09-01

    The aim was to develop and evaluate an online learning tool for use with UK healthcare employees, healthcare educators and healthcare students, to increase knowledge of workplace wellness as an important public health issue. A 'Workplace Wellness' e-learning tool was developed and peer-reviewed by 14 topic experts. This focused on six key areas relating to workplace wellness: work-related stress, musculoskeletal disorders, diet and nutrition, physical activity, smoking and alcohol consumption. Each key area provided current evidence-based information on causes and consequences, access to UK government reports and national statistics, and guidance on actions that could be taken to improve health within a workplace setting. 188 users (93.1% female, age 18-60) completed online knowledge questionnaires before (n = 188) and after (n = 88) exposure to the online learning tool. Baseline knowledge of workplace wellness was poor (n = 188; mean accuracy 47.6%, s.d. 11.94). Knowledge significantly improved from baseline to post-intervention (mean accuracy 77.5%, s.d. 13.71; t(75) = -14.801, p < 0.001). Findings support the use of online learning, indicating scope for development of further online packages relating to other important health parameters. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, which was conducted during the Summer and Fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make “non-biased” information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on any other vendor's IT equipment. However, the research identified two companies with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple 3-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers with an objective to prove the concept, ascertain the extent of energy and computational assessment, ease of installation and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors – JouleX (expected to be completed in 2012) and Sentilla.

  11. Development of the software of the data taking system SOS for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Manns, J.

    1989-02-01

    The data acquisition system SOS has been developed for the SAPHIR experiment at the Bonn stretcher ring ELSA. It can handle up to 280 kilobytes of data per second or a maximum trigger rate of 200 Hz. The multiprocessor-based online system consists of twenty VIP microprocessors and two VAX computers. Each component of the SAPHIR experiment has at least one program in the online system to maintain special functions for this specific component. All of these programs can receive event data without interfering with the transfer of events to a mass storage for offline analysis. A special program, SOL, has been developed to serve as a user interface to the data acquisition system and as a status display for most of the programs of the online system. Using modern features like windowing and mouse control on a VAX-station, SOL establishes an easy way of controlling the data acquisition system. (orig.)

  12. Implementation of the FASTBUS data-acquisition system in the readout of the SAPHIR detector

    International Nuclear Information System (INIS)

    Empt, J.

    1993-12-01

    The magnetic detector SAPHIR is laid out to detect multiparticle final states with good accuracy; in particular, it is designed for good photon detection. Therefore a large electromagnetic calorimeter was built, consisting of 98 modules covering a detection area of about 16 m² in the forward direction. For this calorimeter a brass-gas-sandwich detector was developed with signal wires perpendicular to the converter planes. For data acquisition of a major part of this calorimeter a modular FASTBUS system is used. In this report the FASTBUS system and its installation in the SAPHIR Online Program are described. (orig.)

  13. Installing and Setting Up Git Software Tool on Windows | High-Performance Computing | NREL

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. Git is a version control tool; in this doc, we'll show you how to get Git installed on Windows 7 and how to get things set up on NREL's systems.

  14. Nutrition screening tools: Does one size fit all? A systematic review of screening tools for the hospital setting

    NARCIS (Netherlands)

    van Bokhorst-de van der Schueren, M.A.E.; Guaitoli, P.R.; Jansma, E.P.; de Vet, H.C.W.

    2014-01-01

    Background & aims: Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. Methods: A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012).

  15. Independent Verification and Validation SAPHIRE Version 8 Final Report Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-04-01

    This report provides an evaluation of the SAPHIRE version 8 software product. SAPHIRE version 8 is being developed with a phased, or cyclic, iterative rapid application development methodology, and a similar iterative approach has been taken for the IV&V activities on each vital software object. IV&V and Software Quality Assurance (SQA) activities occur throughout the entire development life cycle and will therefore be required through the full development of SAPHIRE version 8. Later phases of the software life cycle, the operation and maintenance phases, are not applicable in this effort since the IV&V is being done prior to releasing Version 8.

  16. Characterisation of the photolytic HONO-source in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    F. Rohrer

    2005-01-01

    HONO formation has been proposed as an important OH radical source in simulation chambers for more than two decades. Besides the heterogeneous HONO formation by the dark reaction of NO2 and adsorbed water, a photolytic source has been proposed to explain the elevated reactivity in simulation chamber experiments. However, the mechanism of the photolytic process is not well understood so far. As expected, production of HONO and NOx was also observed inside the new atmospheric simulation chamber SAPHIR under solar irradiation. This photolytic HONO and NOx formation was studied with a sensitive HONO instrument under reproducible, controlled conditions at atmospheric concentrations of other trace gases. It is shown that the photolytic HONO source in the SAPHIR chamber is not caused by NO2 reactions and that it is the only direct NOy source under illuminated conditions. In addition, the photolysis of nitrate, which was recently postulated to explain the observed photolytic HONO formation on snow, ground, and glass surfaces, can be excluded in the chamber. A photolytic HONO source at the surface of the chamber is proposed which is strongly dependent on humidity, on light intensity, and on temperature. An empirical function describes these dependencies and reproduces the observed HONO formation rates to within 10%. It is shown that the photolysis of HONO represents the dominant radical source in the SAPHIR chamber for typical tropospheric O3/H2O concentrations. For these conditions, the HONO concentrations inside SAPHIR are similar to recent observations in ambient air.
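
    The abstract describes an empirical function of humidity, light intensity, and temperature that reproduces the observed HONO formation rates to within 10%. As a purely illustrative sketch of what such a parameterization can look like (the functional form and every coefficient below are invented placeholders, not the fitted SAPHIR function):

        # Toy surface HONO source: light * humidity * Arrhenius-like temperature
        # factor. All parameters are illustrative placeholders.
        import math

        def hono_source(j_no2, rel_hum, temp_k, a=1.0, b=0.5, e_act=8000.0):
            """Toy HONO source rate as a function of photolysis frequency j_no2 (1/s),
            relative humidity (%), and temperature (K); a, b, e_act are placeholders."""
            arrhenius = math.exp(-e_act / (8.314 * temp_k))
            return a * j_no2 * (rel_hum / 100.0) ** b * arrhenius

        # Example call with plausible ambient values (all hypothetical):
        print(hono_source(j_no2=8e-3, rel_hum=40.0, temp_k=298.0))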

  17. Blended Learning Tools in Geosciences: A New Set of Online Tools to Help Students Master Skills

    Science.gov (United States)

    Cull, S.; Spohrer, J.; Natarajan, S.; Chin, M.

    2013-12-01

    In most geoscience courses, students are expected to develop specific skills. To master these skills, students need to practice them repeatedly. Unfortunately, few geoscience courses have enough class time to allow students sufficient in-class practice, or enough instructor time to provide fast feedback. To address this, we have developed an online tool called an Instant Feedback Practice (IFP). IFPs are low-risk, high-frequency exercises that allow students to practice skills repeatedly throughout a semester, both in class and at home. After class, students log onto a course management system (like Moodle or Blackboard) and click on that day's IFP exercise. The exercise might involve visually identifying a set of minerals they are practicing. After answering each question, the IFP tells them if they got it right or wrong. If they got it wrong, they try again until they get it right. There is no penalty - students receive the full score for finishing. The goal is low-stakes practice. By completing dozens of these practices throughout the semester, students have many opportunities to practice mineral identification with quick feedback. Students can also complete IFPs during class in groups and teams, with in-lab hand samples or specimens. IFPs can also be used to gauge student skill levels as the semester progresses, as they can be set up to provide the instructor feedback on specific skills or students. When IFPs were developed for and implemented in a majors-level mineralogy class, students reported that in-class and online IFPs were by far the most useful technique they used to master mineral hand sample identification. Final grades in the course were significantly higher than historical norms, supporting students' anecdotal assessment of the impact of IFPs on their learning.

  18. Track recognition in the central drift chamber of the SAPHIR detector at ELSA and first reconstruction of real tracks

    International Nuclear Information System (INIS)

    Korn, P.

    1991-02-01

    The FORTRAN program for pattern recognition in the central drift chamber of SAPHIR has been modified in order to find tracks with more than one missing wire signal and has been optimized in resolving the left/right ambiguities. The second part of this report deals with the reconstruction of some real tracks (γ → e+e-), which were measured with SAPHIR. The efficiency of the central drift chamber and the space-to-drift-time relation are discussed. (orig.)

  19. Electronic Mail in Academic Settings: A Multipurpose Communications Tool.

    Science.gov (United States)

    D'Souza, Patricia Veasey

    1992-01-01

    Explores possible uses of electronic mail in three areas of the academic setting: instruction, research, and administration. Electronic mail is defined, the components needed to get started with electronic mail are discussed, and uses and benefits of electronic mail in diverse educational environments are suggested. (12 references) (DB)

  20. APMS: An Integrated Set of Tools for Measuring Safety

    Science.gov (United States)

    Statler, Irving C.; Reynard, William D. (Technical Monitor)

    1996-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through analyses that aggregate data across many flights.

  1. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0, technical reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; Atwood, C.L.; Galyean, W.J.; Sattison, M.B.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume provides information on the principles used in the construction and operation of Version 5.0 of the Integrated Reliability and Risk Analysis System (IRRAS) and the System Analysis and Risk Assessment (SARA) system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms that these programs use to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that are appropriate under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that these programs can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by these programs to generate random basic event probabilities from various distributions. Further references are given, and a detailed example of the reduction and quantification of a simple fault tree is provided in an appendix
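
    For readers unfamiliar with the quantification steps summarized above, the sketch below shows two standard formulas for the top-event probability from minimal cut sets, plus a simple Monte Carlo uncertainty loop with lognormal basic events (median and error factor). These are illustrative textbook formulas, not SAPHIRE's implementation:

        # Top-event quantification from minimal cut sets. Each cut set is a
        # tuple of basic-event names; basic events are assumed independent.
        import numpy as np

        def cut_set_probs(cut_sets, p):
            return [np.prod([p[e] for e in cs]) for cs in cut_sets]

        def top_event_rare_event(cut_sets, p):
            """Rare-event approximation: sum of cut-set probabilities."""
            return sum(cut_set_probs(cut_sets, p))

        def top_event_min_cut_upper_bound(cut_sets, p):
            """Min-cut upper bound: 1 - prod(1 - P(cut set))."""
            return 1.0 - np.prod([1.0 - q for q in cut_set_probs(cut_sets, p)])

        def mc_uncertainty(cut_sets, medians, error_factors, n=10000, seed=0):
            """Monte Carlo propagation with lognormal basic events, given a
            median and an error factor EF = p95/p50 per event."""
            rng = np.random.default_rng(seed)
            results = np.empty(n)
            for i in range(n):
                sample = {e: medians[e] * error_factors[e] ** (rng.standard_normal() / 1.645)
                          for e in medians}
                results[i] = top_event_min_cut_upper_bound(cut_sets, sample)
            return np.mean(results), np.percentile(results, [5, 50, 95])

        # Worked example: cut sets {A,B} and {C}.
        cut_sets = [("A", "B"), ("C",)]
        p = {"A": 1e-2, "B": 5e-3, "C": 1e-4}
        print(top_event_rare_event(cut_sets, p))           # 1.5e-4
        print(top_event_min_cut_upper_bound(cut_sets, p))  # ~1.49995e-4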

  2. A simulator tool set for evaluating HEVC/SHVC streaming

    Science.gov (United States)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever-increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high-quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy-to-use platform for evaluating video transmission and adaptation proposals on large-scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure support for currently adopted proposals for scalable and range extensions to the standard.

  3. Tool set for distributed real-time machine control

    Science.gov (United States)

    Carrott, Andrew J.; Wright, Christopher D.; West, Andrew A.; Harrison, Robert; Weston, Richard H.

    1997-01-01

    Demands for increased control capabilities require next generation manufacturing machines to comprise intelligent building elements, physically located at the point where the control functionality is required. Networks of modular intelligent controllers are increasingly designed into manufacturing machines and usable standards are slowly emerging. To implement a control system using off-the-shelf intelligent devices from multi-vendor sources requires a number of well defined activities, including (a) the specification and selection of interoperable control system components, (b) device independent application programming and (c) device configuration, management, monitoring and control. This paper briefly discusses the support for the above machine lifecycle activities through the development of an integrated computing environment populated with an extendable software toolset. The toolset supports machine builder activities such as initial control logic specification, logic analysis, machine modeling, mechanical verification, application programming, automatic code generation, simulation/test, version control, distributed run-time support and documentation. The environment itself consists of system management tools and a distributed object-oriented database which provides storage for the outputs from machine lifecycle activities and specific target control solutions.

  4. A validated set of tool pictures with matched objects and non-objects for laterality research.

    Science.gov (United States)

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  5. TOPAS 2 - a high-resolution tagging system at the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Rappenecker, G.

    1989-02-01

    For the SAPHIR arrangement in Bonn a high-resolution tagging system has been developed, achieving an energy resolution of 2 MeV and covering the range (0.34-0.94) E0 of the photon energy (1.0 GeV ≤ E0). Different counter gas fillings (CO2, ArCH4 and ArC2H6) were compared with regard to performance, cluster size and coincidence width. (orig.)

  6. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Models and Results Database (MAR-D) reference manual. Volume 8

    International Nuclear Information System (INIS)

    Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The primary function of MAR-D is to create a data repository for completed PRAs and Individual Plant Examinations (IPEs) by providing input, conversion, and output capabilities for data used by IRRAS, SARA, SETS, and FRANTIC software. As probabilistic risk assessments and individual plant examinations are submitted to the NRC for review, MAR-D can be used to convert the models and results from the study for use with IRRAS and SARA. Then, these data can be easily accessed by future studies and will be in a form that will enhance the analysis process. This reference manual provides an overview of the functions available within MAR-D and step-by-step operating instructions

  7. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Directory of Open Access Journals (Sweden)

    A. W. Rollins

    2012-11-01

    N2O5 detection in the atmosphere has been accomplished using techniques which have been developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected by either cavity-based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios was simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternately. Experiments focused on the investigation of potential interferences from, e.g., water vapour or aerosol, and on the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (3 cavity ring-down (CRDS) instruments, 2 laser-induced fluorescence (LIF) instruments). Datasets are highly correlated, as indicated by the squares of the linear correlation coefficients, R2, which were larger than 0.96 for the entire datasets. N2O5 mixing ratios agree well within the combined accuracy of the measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changes are necessary to quantitatively sample N2O5 in some environments. The analysis of the data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interferences were found.
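
    The slope/intercept/R2 comparison reported here is, in essence, a linear regression of one instrument's time series against another's. A minimal sketch with hypothetical placeholder data (ordinary least squares for illustration; the campaign analysis may have used more sophisticated fitting):

        # Pairwise instrument comparison: OLS slope/intercept plus R^2.
        import numpy as np

        def compare(reference, candidate):
            slope, intercept = np.polyfit(reference, candidate, 1)
            r = np.corrcoef(reference, candidate)[0, 1]
            return slope, intercept, r**2

        # Hypothetical N2O5 mixing ratios in pptv (placeholders, not campaign data):
        reference = np.array([10.0, 50.0, 120.0, 300.0, 450.0])
        candidate = np.array([12.0, 49.0, 118.0, 305.0, 442.0])
        slope, intercept, r2 = compare(reference, candidate)
        print(f"slope={slope:.3f}, intercept={intercept:.1f} pptv, R^2={r2:.4f}")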

  8. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    Science.gov (United States)

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity on outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well to establish the patients' nutritional status. For the elderly, MNA performed fair to good, for the adults MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. Not one single screening or assessment tool is capable of adequate nutrition screening as well as predicting poor nutrition related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
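
    Criterion validity of the kind reviewed here is typically summarized by sensitivity and specificity of the screening tool against a reference method. A minimal sketch with hypothetical labels (1 = malnourished, 0 = well nourished; not taken from any study in the review):

        # Sensitivity and specificity of a screening tool vs. a reference method.
        def screening_validity(tool, reference):
            tp = sum(t == 1 and r == 1 for t, r in zip(tool, reference))
            tn = sum(t == 0 and r == 0 for t, r in zip(tool, reference))
            fp = sum(t == 1 and r == 0 for t, r in zip(tool, reference))
            fn = sum(t == 0 and r == 1 for t, r in zip(tool, reference))
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return sensitivity, specificity

        # Hypothetical example with eight patients:
        tool      = [1, 1, 0, 0, 1, 0, 1, 0]
        reference = [1, 0, 0, 0, 1, 1, 1, 0]
        print(screening_validity(tool, reference))  # (0.75, 0.75)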

  9. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2010-01-01

    NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytical methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.

  10. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0. Volume 5, Systems Analysis and Risk Assessment (SARA) tutorial manual

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs) primarily for nuclear power plants. This volume is the tutorial manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons is provided that guides the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis capabilities.

  11. Simulation calculations on the construction of the energy-tagged photon beam as well as development and test of the side drift chambers of the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Jahnen, T.

    1990-01-01

    The SAPHIR detector is built up at the continuous photon beam of the Electron Stretcher and Accelerator ELSA in Bonn. The equipment is designed for investigations of reactions with more than two particles in the final state and for photon energies up to 3.5 GeV. A tagging system determines the energy of the bremsstrahlung photons and a set-up of five large drift chambers measures the tracks of the charged particles. This work describes a program which was used to develop the best design of the tagging hodoscope. In the second part, the tests of the planar side chambers and their evaluation are described. These measurements were carried out to fix the gas filling and the parameters of the best working point. It is shown that the chambers can reach a resolution of σ ≤ 200 μm. (orig.)

  12. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 5.0: Data loading manual. Volume 10

    International Nuclear Information System (INIS)

    VanHorn, R.L.; Wolfram, L.M.; Fowler, R.D.; Beck, S.T.; Smith, C.L.

    1995-04-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) suite of programs can be used to organize and standardize, in an electronic format, information from probabilistic risk assessments or individual plant examinations. The Models and Results Database (MAR-D) program of the SAPHIRE suite serves as the repository for probabilistic risk assessment and individual plant examination data and information. This report demonstrates by example the common electronic and manual methods used to load these types of data. It is not a stand-alone document but references documents that contribute information relevant to the data loading process. This document provides a more detailed discussion and instructions for using SAPHIRE 5.0 only when enough information on a specific topic is not provided by another available source.

  13. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already under way. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  14. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few or no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression datasets.
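
    The overrepresentation test at the core of tools like ZEOGS is commonly a hypergeometric (one-sided Fisher) test per anatomical term. A minimal sketch under that assumption, with a toy gene universe (illustrative only; not the ZEOGS source code), assuming SciPy is available:

        # Hypergeometric overrepresentation test for one term.
        from scipy.stats import hypergeom

        def term_enrichment(input_genes, term_genes, all_genes):
            """P(X >= observed overlap) if input genes were drawn at random."""
            input_genes, term_genes, all_genes = map(set, (input_genes, term_genes, all_genes))
            overlap = len(input_genes & term_genes)
            return hypergeom.sf(overlap - 1, len(all_genes),
                                len(term_genes), len(input_genes))

        # Toy universe of 1000 genes; 50 annotated to the term; 20 input genes,
        # 8 of which hit the term:
        universe = [f"g{i}" for i in range(1000)]
        term = universe[:50]
        query = universe[:8] + universe[500:512]
        print(term_enrichment(query, term, universe))  # small p => enriched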

  15. Zebrafish Expression Ontology of Gene Sets (ZEOGS): A Tool to Analyze Enrichment of Zebrafish Anatomical Terms in Large Gene Sets

    Science.gov (United States)

    Marsico, Annalisa

    2013-01-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few or no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression datasets.

  16. A new plant chamber facility, PLUS, coupled to the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2016-03-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOCs) can be studied in detail. In PLUS all important environmental parameters (e.g., temperature, photosynthetically active radiation (PAR), soil relative humidity (RH)) are well controlled. The gas exchange volume of 9.32 m³, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed only to fluorinated ethylene propylene (FEP) Teflon film and other Teflon surfaces to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 light-emitting diode (LED) panels, which have an emission strength up to 800 µmol m⁻² s⁻¹. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and transfer rate of volatile organic compounds (VOCs) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  17. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists is to compare lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene, with the web application available at https://asntech.shinyapps.io/intervene.
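
    The pairwise module computes intersections of multiple sets for a clustered heat map. A minimal sketch of the underlying computation (hypothetical set names and members; Intervene itself offers many more options, including Jaccard and Fisher statistics):

        # Pairwise intersection counts (or Jaccard indices) for a heat map.
        # The diagonal is left at 0 here; a real tool might show set sizes there.
        from itertools import combinations

        def pairwise_matrix(named_sets, jaccard=False):
            names = list(named_sets)
            matrix = {a: {b: 0.0 for b in names} for a in names}
            for a, b in combinations(names, 2):
                inter = len(named_sets[a] & named_sets[b])
                value = inter / len(named_sets[a] | named_sets[b]) if jaccard else inter
                matrix[a][b] = matrix[b][a] = value
            return matrix

        # Hypothetical gene sets:
        sets = {"ChIP_A": {"TP53", "MYC", "EGFR"},
                "ChIP_B": {"MYC", "EGFR", "GATA1"},
                "RNA_up": {"EGFR", "GATA1", "SOX2"}}
        print(pairwise_matrix(sets, jaccard=True))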

  18. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, neither the performance of this method nor its implementation as a web-based tool had been assessed. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
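
    The geometric quantity PAEA builds on is the set of principal angles between subspaces. A minimal numerical sketch (random placeholder matrices; not the PAEA implementation): orthonormalize each basis with QR, then the singular values of Qa^T Qb are the cosines of the principal angles.

        # Principal angles between two subspaces spanned by matrix columns.
        import numpy as np

        def principal_angles(A, B):
            """Columns of A and B span the two subspaces (rows = genes)."""
            Qa, _ = np.linalg.qr(A)
            Qb, _ = np.linalg.qr(B)
            sigma = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
            return np.arccos(np.clip(sigma, -1.0, 1.0))

        # Hypothetical 100-gene data, two 3-dimensional subspaces:
        rng = np.random.default_rng(1)
        A = rng.standard_normal((100, 3))
        B = rng.standard_normal((100, 3))
        print(principal_angles(A, B))  # angles in radians, ascending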

  19. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    Science.gov (United States)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers will rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether provided tools are adequately assisting consumers in conducting their online shopping activities or if new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of how closely tool-supported decisions match consumer preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other developed by one of the authors, called cogito. Results from the evaluation did provide interesting insights on the utility of both support tools. Although it was shown that the cogito tool obtained slightly higher decision accuracy, both tools could benefit from additional enhancements. Details of the procedure developed and results obtained from the evaluation are provided. Opportunities for future work are also discussed.
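
    The procedure draws on rough set theory, whose basic objects are the lower and upper approximations of a target set under an indiscernibility partition of the universe. A minimal sketch with hypothetical users (illustrating the theory only, not the authors' full procedure):

        # Lower/upper rough-set approximations of a target set X with respect
        # to a partition of the universe into indiscernibility classes.
        def approximations(partition, X):
            X = set(X)
            lower = set().union(*[set(b) for b in partition if set(b) <= X])
            upper = set().union(*[set(b) for b in partition if set(b) & X])
            return lower, upper

        # Users grouped by identical observed behaviour; X = users whose
        # tool-supported decision matched their stated preference.
        partition = [{"u1", "u2"}, {"u3"}, {"u4", "u5"}]
        X = {"u1", "u2", "u4"}
        print(approximations(partition, X))
        # lower: {u1, u2}; upper: {u1, u2, u4, u5}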

  20. Tools and approaches for simplifying serious games development in educational settings

    OpenAIRE

    Calvo, Antonio; Rotaru, Dan C.; Freire, Manuel; Fernandez-Manjon, Baltasar

    2016-01-01

    Serious Games can benefit from the commercial video games industry by taking advantage of current development tools. However, the economics and requirements of serious games and commercial games are very different. In this paper, we describe the factors that impact the total cost of ownership of serious games used in educational settings, review the specific requirements of games used as learning material, and analyze the different development tools available in the industry, highlighting their…

  1. Validating the WHO maternal near miss tool: comparing high- and low-resource settings.

    Science.gov (United States)

    Witteveen, Tom; Bezstarosti, Hans; de Koning, Ilona; Nelissen, Ellen; Bloemenkamp, Kitty W; van Roosmalen, Jos; van den Akker, Thomas

    2017-06-19

    WHO proposed the WHO Maternal Near Miss (MNM) tool, classifying women according to several (potentially) life-threatening conditions, to monitor and improve quality of obstetric care. The objective of this study is to analyse merged data of one high- and two low-resource settings where this tool was applied and to test whether the tool may be suitable for comparing severe maternal outcome (SMO) between these settings. Using three cohort studies that included SMO cases during two-year time frames in the Netherlands, Tanzania and Malawi, we reassessed all SMO cases (as defined by the original studies) with the WHO MNM tool (five disease-, four intervention- and seven organ dysfunction-based criteria). Main outcome measures were prevalence of MNM criteria and case fatality rates (CFR). A total of 3172 women were studied; 2538 (80.0%) from the Netherlands, 248 (7.8%) from Tanzania and 386 (12.2%) from Malawi. Total SMO detection was 2767 (87.2%) for disease-based criteria, 2504 (78.9%) for intervention-based criteria and 1211 (38.2%) for organ dysfunction-based criteria. Including every woman who received ≥1 unit of blood in low-resource settings as life-threatening, as defined by organ dysfunction criteria, led to more equally distributed populations. In one third of all Dutch and Malawian maternal death cases, organ dysfunction criteria could not be identified from medical records. Applying solely organ dysfunction-based criteria may lead to underreporting of SMO. Therefore, a tool based on defining MNM only upon establishing organ failure is of limited use for comparing settings with varying resources. In low-resource settings, lowering the threshold of transfused units of blood leads to a higher detection rate of MNM. We recommend refined disease-based criteria, accompanied by a limited set of intervention- and organ dysfunction-based criteria, to set a measure of severity.

  2. Development of a FASTBUS data acquisition system for the SAPHIR calorimeter

    International Nuclear Information System (INIS)

    Klein, F.J.

    1992-01-01

    Due to the high duty cycle of the new Electron Accelerator at the Physics Institute of Bonn University, ELSA, experiments with tagged photon beams and a large angular acceptance become possible. The new magnetic detector SAPHIR is laid out to detect multi-particle final states with good accuracy; in particular, it is designed for good photon detection. Therefore a large electromagnetic calorimeter is built, consisting of 98 modules covering a detection area of about 16 m² in the forward direction. For this calorimeter a brass-gas-sandwich detector was developed with signal wires perpendicular to the converter planes. The chambers are filled with a standard gas mixture Ar/CH4 (90:10) at atmospheric pressure and operated at a considerably high voltage in the semi-proportional mode. A modified shower counter module, containing 20 μm thick signal wires, was tested at the electron test beam of the Bonn 2.5 GeV electron synchrotron. An energy resolution of σ(E)/E · √(E/GeV) = (12.2 ± 0.5)% was achieved. For data acquisition a modular FASTBUS system was used, which will be installed in the SAPHIR Online Program. (orig.)
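
    Reading the quoted resolution as the usual stochastic scaling (an assumption; the record states only the fitted value), the energy dependence works out as:

        \frac{\sigma(E)}{E} = \frac{(12.2 \pm 0.5)\,\%}{\sqrt{E/\mathrm{GeV}}}
        \qquad\Rightarrow\qquad
        \left.\frac{\sigma(E)}{E}\right|_{E = 1\,\mathrm{GeV}} \approx 12.2\,\%,
        \qquad
        \left.\frac{\sigma(E)}{E}\right|_{E = 4\,\mathrm{GeV}} \approx 6.1\,\%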

  3. Simulation of the beam guiding of the SAPHIR experiment by means of a differential-equation model

    International Nuclear Information System (INIS)

    Greve, T.

    1991-08-01

    This paper shows the numerical simulation of a beam line by means of a model of differential equations simulating the beam line from the Bonn Electron Stretcher Accelerator ELSA to the SAPHIR spectrometer. Furthermore a method for calculating the initial values based on measurements of beam profiles is being discussed. (orig.)

  4. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H.-P. Dorn

    2013-05-01

    The detection of atmospheric NO3 radicals is still challenging owing to their low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, new sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: −1.1/2.6 pptv; min/max: −14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined uncertainties of the individual instruments.

  5. Using Plickers as an Assessment Tool in Health and Physical Education Settings

    Science.gov (United States)

    Chng, Lena; Gurvitch, Rachel

    2018-01-01

    Written tests are one of the most common assessment tools classroom teachers use today. Despite their popularity, administering written tests or surveys, especially in health and physical education settings, is time-consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…

  6. Am I getting an accurate picture: a tool to assess clinical handover in remote settings?

    Directory of Open Access Journals (Sweden)

    Malcolm Moore

    2017-11-01

    Background: Good clinical handover is critical to safe medical care. Little research has investigated handover in rural settings. In a remote setting where nurses and medical students give telephone handover to an aeromedical retrieval service, we developed a tool by which the receiving clinician might assess the handover, and investigated factors impacting on the reliability and validity of that assessment. Methods: Researchers consulted with clinicians to develop an assessment tool, based on the ISBAR handover framework, combining validity evidence and the existing literature. The tool was applied 'live' by receiving clinicians and from recorded handovers by academic assessors. The tool's performance was analysed using generalisability theory. Receiving clinicians and assessors provided feedback. Results: Reliability for assessing a call was good (G = 0.73 with 4 assessments). The scale had a single factor structure with good internal consistency (Cronbach's alpha = 0.8). The group mean for the global score for nurses and students was 2.30 (SD 0.85) out of a maximum of 3.0, with no difference between these sub-groups. Conclusions: We have developed and evaluated a tool to assess high-stakes handover in a remote setting. It showed good reliability and was easy for working clinicians to use. Further investigation and use is warranted beyond this setting.

  7. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    Science.gov (United States)

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  8. A set of tools for determining the LAT performance in specific applications

    International Nuclear Information System (INIS)

    Lott, B.; Ballet, J.; Chiang, J.; Lonjou, V.; Funk, S.

    2007-01-01

    The poster presents a set of simple tools being developed to predict GLAST's performance in specific cases, such as the accumulation time needed to reach a given significance or statistical accuracy for a particular source. Different examples are given, such as the generation of a full-sky sensitivity map.
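
    In the background-dominated Poisson limit, one such accumulation-time estimate reduces to a one-line formula; a minimal sketch (the rates are invented for illustration, not LAT numbers):

    ```python
    def accumulation_time(target_sigma, signal_rate, background_rate):
        """Time t solving S = s*t / sqrt(b*t), i.e. the background-dominated
        Poisson significance; rates in counts per second."""
        return target_sigma**2 * background_rate / signal_rate**2

    # e.g. a faint source over a diffuse background (illustrative numbers)
    t = accumulation_time(target_sigma=5.0, signal_rate=1e-3, background_rate=2e-2)
    print(f"time to reach 5 sigma: {t / 86400:.1f} days")
    ```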

  9. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    Science.gov (United States)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  10. Particle identification by time-of-flight measurement in the SAPHIR detector

    International Nuclear Information System (INIS)

    Hoffmann-Rothe, P.

    1993-02-01

    Using photoproduction data measured with the SAPHIR detector on different target materials (CH₂ solid, H₂ liquid, D₂ liquid), a detailed investigation and discussion of the detector's performance in measuring the time of flight of charged particles and in separating particles of different mass has been accomplished. A FORTRAN program has been written which provides a calibration of the scintillator panels of the TOF hodoscopes, calculates correction factors for the time-walk effect and finally, by combining the time of flight with the track momentum measurement, determines particle masses. The current configuration of the detector makes it possible to separate protons from pions up to a particle momentum of 1.6 GeV/c. Protons and kaons can be separated up to a momentum of 1.3 GeV/c, kaons and pions up to a momentum of 0.85 GeV/c. (orig.)
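
    The mass determination described above combines β = L/(ct) from the hodoscopes with the track momentum; a minimal sketch of that final step (path length, flight time and momentum are invented example values, not SAPHIR calibration data):

    ```python
    import math

    C = 0.299792458  # speed of light in m/ns

    def mass_from_tof(p_gev, path_m, tof_ns):
        """Particle mass in GeV/c^2 from momentum p, flight path L and time t."""
        beta = path_m / (C * tof_ns)
        if beta >= 1.0:
            raise ValueError("unphysical beta >= 1; timing resolution too coarse")
        return p_gev * math.sqrt(1.0 / beta**2 - 1.0)

    # a 1.0 GeV/c track over a 2.5 m flight path arriving after 9.0 ns
    print(f"m = {mass_from_tof(1.0, 2.5, 9.0):.3f} GeV/c^2")
    ```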

  11. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    LENUS (Irish Health Repository)

    Hennerby, Cathy

    2012-02-01

    AIM: This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. BACKGROUND: The increased number of registered general agency nurses working in an acute children's hospital alerted concerns around their competency in working with children. These concerns were initially raised via informal complaints about 'near misses'

  12. AORN Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings.

    Science.gov (United States)

    Hughes, Nancy L; Nelson, Audrey; Matz, Mary W; Lloyd, John

    2011-06-01

    Prolonged standing during surgical procedures poses a high risk of causing musculoskeletal disorders, including back, leg, and foot pain, which can be chronic or acute in nature. Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings provides recommendations for relieving the strain of prolonged standing, including the use of antifatigue mats, supportive footwear, and sit/stand stools, that are based on well-accepted ergonomic safety concepts, current research, and access to new and emerging technology. Published by Elsevier Inc.

  13. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. Orfeo Toolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, aimed at developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs to already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  14. A Python tool to set up relative free energy calculations in GROMACS.

    Science.gov (United States)

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
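
    The common-substructure step that LOMAP automates can be illustrated with RDKit's maximum-common-substructure search; this is only a sketch of the mapping problem, not code from alchemical-setup.py, and the two molecules are arbitrary examples:

    ```python
    from rdkit import Chem
    from rdkit.Chem import rdFMCS

    mol_a = Chem.MolFromSmiles("c1ccccc1O")   # phenol
    mol_b = Chem.MolFromSmiles("c1ccccc1N")   # aniline

    # maximum common substructure = the atoms kept fixed in the transformation
    mcs = rdFMCS.FindMCS([mol_a, mol_b])
    core = Chem.MolFromSmarts(mcs.smartsString)
    print("common core:", mcs.smartsString)
    print("atoms mutated:", mol_a.GetNumAtoms() - core.GetNumAtoms())
    ```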

  15. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN-based Modal Correlation Tools

    Science.gov (United States)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
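
    The cross-orthogonality check named above is the matrix product XOR = Φ_testᵀ M Φ_FEM with mass-normalised mode shapes; a minimal numerical sketch (random stand-in mode shapes and an identity mass matrix, not TACT code):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_dof, n_modes = 50, 5
    M = np.eye(n_dof)                       # reduced (test-analysis) mass matrix
    phi_fem = rng.standard_normal((n_dof, n_modes))
    phi_test = phi_fem + 0.05 * rng.standard_normal((n_dof, n_modes))

    for phi in (phi_fem, phi_test):         # mass-normalise: diag(phi^T M phi) = 1
        phi /= np.sqrt(np.einsum("im,ij,jm->m", phi, M, phi))

    xor = phi_test.T @ M @ phi_fem
    print(np.round(np.abs(xor), 2))         # ~1 on the diagonal => good correlation
    ```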

  16. Instructor's Perceptions towards the Use of an Online Instructional Tool in an Academic English Setting in Kuwait

    Science.gov (United States)

    Erguvan, Deniz

    2014-01-01

    This study sets out to explore the faculty members' perceptions of a specific web-based instruction tool (Achieve3000) in a private higher education institute in Kuwait. The online tool provides highly differentiated instruction, which is initiated with a level set at the beginning of the term. The program is used in two consecutive courses as…

  17. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    …materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of the settings used for the generation of discharges in stable conditions of micro-EDM on the phenomenon of tool electrode wear. A stable sparking… a condition for the minimum tool wear for this micro-EDM process configuration.

  18. Development of a multilevel health and safety climate survey tool within a mining setting.

    Science.gov (United States)

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial and multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open cut sites. A 67-item, 10 factor solution was obtained via exploratory factor analysis (EFA) representing prioritization and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs determined a high level of consistency in responses across the workforce, and generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual level. The survey tool demonstrated reliability and validity for use within an open cut Australian mining setting and supports a multilevel, industry specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and improving the understanding of how health and safety climate interact at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  19. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2012-07-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s⁻¹ and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10⁶ cm⁻³ and a linear correlation coefficient of R² = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints of artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30–40% (median) larger than those by DOAS after MVK (20 ppbv) and

  20. Peac – A set of tools to quickly enable Proof on a cluster

    International Nuclear Information System (INIS)

    Ganis, G; Vala, M

    2012-01-01

    With the advent of the analysis phase of LHC data processing, interest in Proof technology has considerably increased. While setting up a simple Proof cluster for basic usage is reasonably straightforward, exploiting the several new functionalities added in recent times may be complicated. Peac, standing for Proof Enabled Analysis Cluster, is a set of tools aiming to facilitate the setup and management of a Proof cluster. Peac is based on the experience gained by setting up Proof for the ALICE analysis facilities. It allows one to easily build and configure Root and the additional software needed on the cluster, and may serve as a distributor of binaries via Xrootd. Peac uses Proof-On-Demand (PoD) for resource management (start and stop of daemons). Finally, Peac sets up and configures dataset management (using the Afdsmgrd daemon), as well as cluster monitoring (machine status and Proof query summaries) using MonAlisa. In this respect, a MonAlisa page has been dedicated to Peac users, so that a cluster managed by Peac can be automatically monitored. In this paper we present and describe the status and main components of Peac and show details about its usage.

  1. Parallel analysis tools and new visualization techniques for ultra-large climate data sets

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  2. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    Science.gov (United States)

    Hennerby, Cathy; Joyce, Pauline

    2011-03-01

    This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. The increased number of registered general agency nurses working in an acute children's hospital alerted concerns around their competency in working with children. These concerns were initially raised via informal complaints about 'near misses', parental dissatisfaction, perceived competency weaknesses and the rising cost associated with their use. Young's (2009) nine-stage change framework (Journal of Organisational Change, 22, 524-548) was used to guide the implementation of the competency assessment tool within a paediatric acute care setting. The ongoing success of the initiative, from a nurse manager's perspective, relies on structured communication with the agency provider before employing competent agency nurses. Sustainability of the change will depend on nurse managers' persistence in attending to the concerns of those resisting the change while simultaneously supporting those championing the change. These key communication and supporting roles highlight the pivotal role held by nurse managers, as gatekeepers, in safeguarding children while in hospital. Leadership qualities of nurse managers will also be challenged in continuing to manage and drive the change where resistance might prevail. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  3. Improving beam set-up using an online beam optics tool

    International Nuclear Information System (INIS)

    Richter, S.; Barth, W.; Franczak, B.; Scheeler, U.; Wilms, D.

    2004-01-01

    The GSI accelerator facility [1] consists of the Universal Linear Accelerator (Unilac), the heavy ion synchrotron SIS, and the Experimental Storage Ring (ESR). Two Unilac injectors with three ion source terminals provide ion species from the lightest, such as hydrogen, up to uranium. The High Current Injector (HSI) for low charge state ion beams provides mostly highly intense but short pulses, whereas the High Charge State Injector (HLI) supplies long pulses with a high duty factor of up to 27%. Before entering the Alvarez section of the Unilac, the ion beam from the HSI is stripped in a supersonic gas jet. Up to three different ion species can be accelerated for up to five experiments in a time-sharing mode. Frequent changes of beam energy and intensity during a single beam time period may result in time-consuming set-up and tuning, especially of the beam transport lines. To shorten these changeover times an online optics tool (MIRKO EXPERT) has been developed. Based on online emittance measurements at well-defined locations, the beam envelopes are calculated using the actual magnet settings. With this input, improved calculated magnet settings can be sent directly to the magnet power supplies. The program reads profile grid measurements, so that an automated beam alignment is established and steering times are minimized. Experiences with this tool will be reported. At the Unilac a special focus is put on high current operation with short but intense beam pulses. Limitations such as missing non-destructive beam diagnostics, insufficient longitudinal beam diagnostics, insufficient longitudinal beam matching, and the influence of the hard-edge model for magnetic fields will be discussed. Special attention will be paid to the limits due to high current effects with bunched beams. (author)
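
    The envelope bookkeeping behind such a tool is the propagation of a measured sigma-matrix through the lattice as Σ' = MΣMᵀ; a minimal sketch (the two-element toy lattice and Twiss values are assumptions for illustration, not the Unilac transport line):

    ```python
    import numpy as np

    def drift(length_m):
        return np.array([[1.0, length_m], [0.0, 1.0]])

    def thin_quad(focal_m):
        return np.array([[1.0, 0.0], [-1.0 / focal_m, 1.0]])

    # initial sigma-matrix from an emittance measurement (assumed Twiss values)
    eps, beta, alpha = 5.0, 2.0, -0.5
    sigma = eps * np.array([[beta, -alpha], [-alpha, (1 + alpha**2) / beta]])

    for element in (drift(1.5), thin_quad(2.0), drift(1.0)):
        sigma = element @ sigma @ element.T   # Sigma' = M Sigma M^T

    print(f"beam size at exit: {np.sqrt(sigma[0, 0]):.2f} mm")
    ```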

  4. Measurement of the reaction γd → pnπ⁺π⁻ at SAPHIR and investigation of the decay angular distribution of the Δ⁺⁺(1232) resonance

    International Nuclear Information System (INIS)

    Schuetz, P.

    1993-03-01

    SAPHIR, a new experiment at the Bonn electron stretcher ring ELSA, started taking data in spring 1992. It was set up for the investigation of photon-induced reactions with multiparticle final states. In the first part of this paper the special design of the target is described. It can be operated with liquefied hydrogen or deuterium and is placed in the middle of the central drift chamber. To protect the surrounding chamber in case of a fracture of the target cell, a safety system is installed. In addition, two independent methods of monitoring the cell are provided. The first measurement was performed with a deuterium target at a photon energy range of Eγ = 500-700 MeV. In the second part of this paper first results of an analysis of the decay angular distribution of the Δ⁺⁺(1232) in the reaction γd → nΔ⁺⁺π⁻ are presented. They are compared to old data from a hydrogen bubble chamber experiment and are discussed on the basis of a spectator model. (orig.)

  5. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    Energy Technology Data Exchange (ETDEWEB)

    Faghihi, F. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Research Center for Radiation Protection, Shiraz University, Shiraz (Iran, Islamic Republic of); Nuclear Safety Research Center, Shiraz University, Shiraz (Iran, Islamic Republic of)], E-mail: faghihif@shirazu.ac.ir; Ramezani, E. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Yousefpour, F. [Atomic Energy Organization of Iran (AEOI), Tehran (Iran, Islamic Republic of); Mirvakili, S.M. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of)

    2008-10-15

    The main goal of this review paper is to analyze the total frequency of core damage of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems with a view to improving its design and operation. The PSA considers the full-power state of the reactor, and this article presents a level-1 PSA analysis using the System Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) software, which is specifically designed to permit a listing of the potential accident sequences, compute their frequencies of occurrence and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the Large Fault Tree/Small Event Tree method. This level-1 PSA for IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.

  6. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    International Nuclear Information System (INIS)

    Faghihi, F.; Ramezani, E.; Yousefpour, F.; Mirvakili, S.M.

    2008-01-01

    The main goal of this review paper is to analyze the total frequency of core damage of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems with a view to improving its design and operation. The PSA considers the full-power state of the reactor, and this article presents a level-1 PSA analysis using the System Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) software, which is specifically designed to permit a listing of the potential accident sequences, compute their frequencies of occurrence and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the Large Fault Tree/Small Event Tree method. This level-1 PSA for IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.
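
    Under the rare-event approximation, the Large Fault Tree quantification used by SAPHIRE-style codes reduces to summing minimal-cut-set products; a minimal sketch of the idea (all event names, probabilities and the initiator frequency are invented, not IHWRR data):

    ```python
    initiator_per_year = 0.1                       # hypothetical initiating event
    basic_event_prob = {
        "DG_FAILS_TO_START": 1e-2,
        "DG_FAILS_TO_RUN": 5e-3,
        "PUMP_A_FAILS": 2e-3,
        "PUMP_B_FAILS": 2e-3,
        "OPERATOR_ERROR": 1e-2,
    }
    minimal_cut_sets = [
        ("DG_FAILS_TO_START", "PUMP_A_FAILS"),
        ("DG_FAILS_TO_RUN", "PUMP_B_FAILS"),
        ("OPERATOR_ERROR", "PUMP_A_FAILS", "PUMP_B_FAILS"),
    ]

    def cut_set_frequency(cut_set):
        freq = initiator_per_year
        for event in cut_set:
            freq *= basic_event_prob[event]
        return freq

    # rare-event approximation: CDF ~ sum over minimal cut sets
    cdf = sum(cut_set_frequency(cs) for cs in minimal_cut_sets)
    print(f"core damage frequency ~ {cdf:.2e} per reactor-year")
    ```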

  7. Total OH reactivity study from VOC photochemical oxidation in the SAPHIR chamber

    Science.gov (United States)

    Yu, Z.; Tillmann, R.; Hohaus, T.; Fuchs, H.; Novelli, A.; Wegener, R.; Kaminski, M.; Schmitt, S. H.; Wahner, A.; Kiendler-Scharr, A.

    2015-12-01

    It is well known that hydroxyl radicals (OH) act as the dominant reactive species in the degradation of VOCs in the atmosphere. In recent field studies, directly measured total OH reactivity often showed poor agreement with the OH reactivity calculated from VOC measurements (e.g. Nölscher et al., 2013; Lu et al., 2012a). This "missing OH reactivity" is attributed to unaccounted-for biogenic VOC emissions and/or oxidation products. The comparison of total OH reactivity directly measured and calculated from single-component measurements of VOCs and their oxidation products gives us a further understanding of the sources of unmeasured reactive species in the atmosphere. It also allows the determination of the magnitude of the contribution of primary VOC emissions and their oxidation products to the missing OH reactivity. A series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, to explore in detail the photochemical degradation of VOCs (isoprene, β-pinene, limonene, and D6-benzene) by OH. The total OH reactivity was determined from the measurement of VOCs and their oxidation products by a proton transfer reaction time-of-flight mass spectrometer (PTR-TOF-MS) with a GC/MS/FID system, and directly measured by laser-induced fluorescence (LIF) at the same time. The comparison between these two total OH reactivity measurements showed an increase of missing OH reactivity in the presence of oxidation products of VOCs, indicating a strong contribution of uncharacterized oxidation products to the missing OH reactivity.
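
    The calculated reactivity referred to above is k_OH = Σᵢ kᵢ[Xᵢ] over all measured species; a minimal sketch (the mixing ratios are invented; the rate constants are approximate room-temperature literature values):

    ```python
    N_AIR = 2.46e19   # air number density at 298 K, 1 atm (molecules cm^-3)

    # OH rate constants, cm^3 molecule^-1 s^-1 (approximate 298 K values)
    k_oh = {"isoprene": 1.0e-10, "beta-pinene": 7.4e-11, "limonene": 1.6e-10}
    mixing_ratio_ppbv = {"isoprene": 2.0, "beta-pinene": 1.0, "limonene": 0.5}

    # convert ppbv -> molecules cm^-3, then sum k_i * [X_i]
    reactivity = sum(k_oh[s] * mixing_ratio_ppbv[s] * 1e-9 * N_AIR for s in k_oh)
    print(f"calculated OH reactivity: {reactivity:.1f} s^-1")
    ```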

  8. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) Version 5.0. Fault tree, event tree, and piping & instrumentation diagram (FEP) editors reference manual: Volume 7

    International Nuclear Information System (INIS)

    McKay, M.K.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors allow the user to graphically build and edit fault trees, event trees, and piping and instrumentation diagrams (P&IDs). The software is designed to enable the independent use of the graphical-based editors found in the Integrated Reliability and Risk Assessment System (IRRAS). FEP comprises three separate editors (Fault Tree, Event Tree, and Piping and Instrumentation Diagram) and a utility module. This reference manual provides a screen-by-screen guide to the entire FEP system.

  9. Idea: an integrated set of tools for sustainable nuclear decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, M.; Centner, B.; Vanderperre, S.; Wacquier, W.

    2008-01-01

    Decommissioning of nuclear installations constitutes an important challenge and must prove to the public that the whole nuclear life cycle is fully mastered by the nuclear industry. This could lead to easier public acceptance of the construction of new nuclear power plants. When ceasing operation, owners and operators of nuclear installations are looking for solutions to assess and keep decommissioning costs at a reasonable level, to fully characterise waste streams (in particular the radiological inventories of difficult-to-measure radionuclides) and to reduce personnel exposure during the decommissioning activities, taking into account several project-, site- and country-specific constraints. In response to this need, Tractebel Engineering has developed IDEA (Integrated DEcommissioning Application), an integrated set of computer tools to support the engineering activities to be carried out in the frame of a decommissioning project. IDEA provides optimized solutions from an economical, environmental, social and safety perspective. (authors)

  10. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings, where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. The objective was to determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design, in locations with a high prevalence of acute and chronic malnutrition. A total of 453,990 children met the inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and database-derived models based on MUAC alone, height alone, and height and MUAC combined. The mean percentage difference between true and estimated weight, the proportion of estimates accurate to within ±25% and ±10% of true weight, the weighted Kappa statistic, and the Bland-Altman bias were reported as measures of tool accuracy. The standard deviation of the mean percentage difference and the Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight than Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC: the mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); the proportion of estimates accurate to within ±25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
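
    The accuracy and precision measures reported above are straightforward to compute; a minimal sketch with made-up weights (not the survey data):

    ```python
    import numpy as np

    true_kg = np.array([8.0, 10.5, 12.0, 14.2, 16.8, 19.0])
    estimated_kg = np.array([8.3, 10.1, 12.4, 14.0, 17.5, 18.4])

    pct_diff = 100 * (estimated_kg - true_kg) / true_kg
    print(f"mean % difference: {pct_diff.mean():+.2f}% (SD {pct_diff.std(ddof=1):.2f}%)")

    diff = estimated_kg - true_kg               # Bland-Altman bias and 95% LoA
    bias, half_width = diff.mean(), 1.96 * diff.std(ddof=1)
    print(f"bias {bias:+.2f} kg, LoA [{bias - half_width:.2f}, {bias + half_width:.2f}] kg")

    print(f"within +/-10% of true weight: {np.mean(np.abs(pct_diff) <= 10):.0%}")
    ```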

  11. Google Sets, Google Suggest, and Google Search History: Three More Tools for the Reference Librarian's Bag of Tricks

    OpenAIRE

    Cirasella, Jill

    2008-01-01

    This article examines the features, quirks, and uses of Google Sets, Google Suggest, and Google Search History and argues that these three lesser-known Google tools warrant inclusion in the resourceful reference librarian’s bag of tricks.

  12. Approaches, tools and methods used for setting priorities in health research in the 21st century.

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (<1%). A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority-setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more

  13. Approaches, tools and methods used for setting priorities in health research in the 21st century

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-01-01

    Background: Health research is difficult to prioritize because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods: To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. Results: A total of 165 relevant studies were identified in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (<1%). A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion: The number of priority-setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be

  14. Measurement of the reaction γd → pnπ⁺π⁻ at SAPHIR and investigation of the decay angular distribution of the Δ⁺⁺(1232) resonance; Messung der Reaktion γd → pnπ⁺π⁻ an SAPHIR und Untersuchung der Zerfallswinkelverteilung der Δ⁺⁺(1232)-Resonanz

    Energy Technology Data Exchange (ETDEWEB)

    Schuetz, P.

    1993-03-01

    SAPHIR, a new experiment at the Bonn electron stretcher ring ELSA, started taking data in spring 1992. It was set up for the investigation of photon-induced reactions with multiparticle final states. In the first part of this paper the special design of the target is described. It can be operated with liquefied hydrogen or deuterium and is placed in the middle of the central drift chamber. To protect the surrounding chamber in case of a fracture of the target cell, a safety system is installed. In addition, two independent methods of monitoring the cell are provided. The first measurement was performed with a deuterium target at a photon energy range of Eγ = 500-700 MeV. In the second part of this paper first results of an analysis of the decay angular distribution of the Δ⁺⁺(1232) in the reaction γd → nΔ⁺⁺π⁻ are presented. They are compared to old data from a hydrogen bubble chamber experiment and are discussed on the basis of a spectator model. (orig.) [German abstract, translated:] This thesis describes the construction of a liquid-gas target developed specifically for use in the SAPHIR detector. Functions for monitoring the target cell are presented, together with a safety system protecting the central drift chamber, which immediately surrounds the target. Furthermore, simulation calculations were used to investigate the influence that the construction of the target scattering vessel can have on the measurement of different reactions; in 50% to 70% of the events, hits occurred in the aluminium supports of the target scattering vessel. This severe impairment can be avoided by a redesign of the scattering vessel and the use of, for example, Rohacell as the vessel window; Rohacell is characterized by high rigidity and a large radiation length. The redesign of the scattering vessel is currently in progress. The second part of the thesis describes one of the first

  15. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

    The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV over the full kinematical range. A considerable improvement of the drift chamber system is a precondition for attaining the necessary data rate in an acceptable time. The research focuses on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influence of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and differential cross sections of Φ meson production close to threshold were evaluated. (orig.)

  16. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    Science.gov (United States)

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived

  17. The Distraction in Action Tool©: Feasibility and Usability in Clinical Settings.

    Science.gov (United States)

    Hanrahan, Kirsten; Kleiber, Charmaine; Miller, Ben J; Davis, Heather; McCarthy, Ann Marie

    2017-11-10

    Distraction is a relatively simple, evidence-based intervention to minimize child distress during medical procedures. Timely on-site interventions that instruct parents in distraction coaching are needed. The purpose of this study was to test the feasibility and usability of the Distraction in Action Tool© (DAT©), which 1) predicts a child's risk for distress with a needle stick and 2) provides individualized instructions for parents on how to be a distraction coach for their child in clinical settings. A mixed-methods descriptive design was used to test the feasibility and usability of DAT in the Emergency Department and a Phlebotomy Lab at a large Midwest academic medical center. Twenty parents of children aged 4-10 years requiring venipuncture, and clinicians performing 13 of those procedures, participated. Participants completed an evaluation and took part in a brief interview. The average age of the children was 6.8 years, and 80% of parent participants were mothers. Most parents reported that the DAT was not difficult to use (84.2%) and understandable (100%), and that they had a positive experience (89.5%). Clinicians thought DAT was helpful (100%) and did not cause a meaningful delay in workflow (92%). DAT can be used by parents and clinicians to assess children's risk for procedure-related distress and learn distraction techniques to help children during needle stick procedures. DAT for parents is being disseminated via social media and an open-access website. Further research is needed to disseminate and implement DAT in community healthcare settings. Copyright © 2017. Published by Elsevier Inc.

  18. Older adult mistreatment risk screening: contribution to the validation of a screening tool in a domestic setting.

    Science.gov (United States)

    Lindenbach, Jeannette M; Larocque, Sylvie; Lavoie, Anne-Marise; Garceau, Marie-Luce

    2012-06-01

    The hidden nature of older adult mistreatment renders its detection in the domestic setting particularly challenging. A validated screening instrument that provides a systematic assessment of risk factors can facilitate this detection. One such instrument, the "expanded Indicators of Abuse" (e-IOA) tool, has previously been validated in the Hebrew language in a hospital setting. The present study has contributed to the validation of the e-IOA in an English-speaking community setting in Ontario, Canada. It consisted of two phases: (a) a content validity review and adaptation of the instrument by experts throughout Ontario, and (b) an inter-rater reliability assessment by home-visiting nurses. The adaptation, the "Mistreatment of Older Adult Risk Factors" tool, offers a comprehensive tool for screening in the home setting. This instrument is significant to professional practice, as practitioners working with older adults will be better equipped to assess the risk of mistreatment.

  19. TOPAS 1 - construction and test of a scintillation counter hodoscope for the tagging of bremsstrahlung photons for the SAPHIR detector

    International Nuclear Information System (INIS)

    Merkel, R.

    1989-09-01

    The development of a tagging hodoscope for the SAPHIR detector at the stretcher ring ELSA in Bonn is described. The hodoscope covers an energy range of 2.175 GeV at an electron beam energy of E₀ = 3.500 GeV. 24 scintillation counters are used for the determination of the photon energy, giving a resolution of ΔEγ = 25 MeV. The tagging method requires a good coincidence timing resolution τ between the tagging hodoscope and the detector for the photon-induced reactions in order to keep the number of accidental coincidences low. The timing information is given by 8 fast timing counters (40 mm thick), each covering 5 to 7 energy channels. Fluctuations of the timing signal which result from different impact locations on the timing counter, due to different light travel distances, are corrected for by the energy-defining counters. The timing component (8 timing counters) is completed and tested. The results of first measurements show an upper limit of σ = 250 ps for the resolution of 7 coincidences out of 45 possible channels in the tagging hodoscope. These results were obtained with a preliminary adjustment of the SAPHIR beam line and with a not yet optimized signal-to-noise ratio in the extracted beam. We hope to obtain σ < 200 ps under optimized conditions. (orig.)

  20. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Science.gov (United States)

    Alves, Mara L.; Brites, Cláudia; Paulo, Manuel; Carbas, Bruna; Belo, Maria; Mendes-Moreira, Pedro M. R.; Brites, Carla; Bronze, Maria do Rosário; Gunjača, Jerko; Šatović, Zlatko; Vaz Patto, Maria C.

    2017-01-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as the limited set of quality traits assessed and the missing accurate evaluation of agronomic performance. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), the flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group, characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers' maize populations

  1. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Directory of Open Access Journals (Sweden)

    Mara L. Alves

    2017-12-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as the limited set of quality traits assessed and the missing accurate evaluation of agronomic performance. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), the flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group, characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers
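
    The AMMI analysis named above decomposes the genotype-by-environment table into main effects plus an SVD of the double-centered interaction; a minimal sketch (the yield matrix is random stand-in data, not the trial results):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    yield_ge = rng.normal(6.0, 1.0, size=(26, 9))    # 26 populations x 9 locations

    grand = yield_ge.mean()
    geno = yield_ge.mean(axis=1, keepdims=True) - grand   # population main effects
    env = yield_ge.mean(axis=0, keepdims=True) - grand    # location main effects
    interaction = yield_ge - grand - geno - env           # double-centered GxE

    u, s, vt = np.linalg.svd(interaction, full_matrices=False)
    share = 100 * s**2 / np.sum(s**2)
    print(f"IPCA1 + IPCA2 explain {share[0]:.1f}% + {share[1]:.1f}% of GxE")
    ipca1_scores = u[:, 0] * np.sqrt(s[0])   # population scores for the AMMI biplot
    ```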

  2. Lung ultrasound as a diagnostic tool for radiographically-confirmed pneumonia in low resource settings.

    Science.gov (United States)

    Ellington, Laura E; Gilman, Robert H; Chavez, Miguel A; Pervaiz, Farhan; Marin-Concha, Julio; Compen-Chang, Patricia; Riedel, Stefan; Rodriguez, Shalim J; Gaydos, Charlotte; Hardick, Justin; Tielsch, James M; Steinhoff, Mark; Benson, Jane; May, Evelyn A; Figueroa-Quintanilla, Dante; Checkley, William

    2017-07-01

    Pneumonia is a leading cause of morbidity and mortality in children worldwide; however, its diagnosis can be challenging, especially in settings where skilled clinicians or standard imaging are unavailable. We sought to determine the diagnostic accuracy of lung ultrasound when compared to radiographically-confirmed clinical pediatric pneumonia. Between January 2012 and September 2013, we consecutively enrolled children aged 2-59 months with primary respiratory complaints at the outpatient clinics, emergency department, and inpatient wards of the Instituto Nacional de Salud del Niño in Lima, Peru. All participants underwent clinical evaluation by a pediatrician and lung ultrasonography by one of three general practitioners. We also consecutively enrolled children without respiratory symptoms. Children with respiratory symptoms had a chest radiograph (CXR). We obtained ancillary laboratory testing in a subset. Final clinical diagnoses included 453 children with pneumonia, 133 with asthma, 103 with bronchiolitis, and 143 with upper respiratory infections. In total, CXR confirmed the diagnosis in 191 (42%) of 453 children with clinical pneumonia. A consolidation on lung ultrasound, which is our primary endpoint for pneumonia, had a sensitivity of 88.5%, a specificity of 100%, and an area under the curve of 0.94 (95% CI 0.92-0.97) when compared to radiographically-confirmed clinical pneumonia. When any abnormality on lung ultrasound was compared to radiographically-confirmed clinical pneumonia, the sensitivity increased to 92.2% and the specificity decreased to 95.2%, with an area under the curve of 0.94 (95% CI 0.91-0.96). Lung ultrasound had high diagnostic accuracy for the diagnosis of radiographically-confirmed pneumonia. Added benefits of lung ultrasound include rapid testing and high inter-rater agreement. Lung ultrasound may serve as an alternative tool for the diagnosis of pediatric pneumonia. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
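
    The headline figures follow from a 2x2 table of ultrasound result against radiographically-confirmed pneumonia; a minimal sketch (counts chosen to reproduce the quoted 88.5%/100%, not taken from the study's tables):

    ```python
    def diagnostic_accuracy(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)     # true positives among diseased
        specificity = tn / (tn + fp)     # true negatives among non-diseased
        return sensitivity, specificity

    sens, spec = diagnostic_accuracy(tp=169, fp=0, fn=22, tn=262)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
    ```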

  3. Is Google Trends a reliable tool for digital epidemiology? Insights from different clinical settings.

    Science.gov (United States)

    Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe

    2017-09-01

    Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, both for common diseases with lower media coverage and for less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella pneumophila pneumonia", and "Ebola fever"), which recently received major focus from the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colic, epistaxis, and mushroom poisoning. Only when searching for the term "mushroom" alone did Google Trends generate a seasonal pattern that almost overlapped with the epidemiological profile, but this was probably due mostly to searches related to harvesting and cooking rather than to poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella pneumophila pneumonia, and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or relatively rare diseases with higher media coverage. Overall, Google Trends seems to be more influenced by media clamor than by true epidemiological burden. Copyright © 2017 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.
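
    The comparison described here amounts to rank-correlating a search-interest time series with case counts; a hedged sketch with invented monthly series, not the study data:

      from scipy.stats import spearmanr

      trends_interest = [42, 55, 61, 48, 39, 35, 33, 40, 52, 58, 63, 50]   # 12 months, synthetic
      case_counts     = [110, 95, 90, 100, 120, 130, 135, 125, 105, 98, 92, 104]

      rho, p_value = spearmanr(trends_interest, case_counts)
      print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")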

  4. Automated innovative diagnostic, data management and communication tool, for improving malaria vector control in endemic settings.

    Science.gov (United States)

    Vontas, John; Mitsakakis, Konstantinos; Zengerle, Roland; Yewhalaw, Delenasaw; Sikaala, Chadwick Haadezu; Etang, Josiane; Fallani, Matteo; Carman, Bill; Müller, Pie; Chouaïbou, Mouhamadou; Coleman, Marlize; Coleman, Michael

    2016-01-01

    Malaria is a life-threatening disease that caused more than 400,000 deaths in sub-Saharan Africa in 2015. Mass prevention of the disease is best achieved by vector control, which relies heavily on the use of insecticides. Monitoring mosquito vector populations is an integral component of control programs and a prerequisite for effective interventions. Several individual methods are used for this task; however, there are obstacles to their uptake, as well as challenges in organizing, interpreting and communicating vector population data. The consortium of the Horizon 2020 project "DMC-MALVEC" will develop a fully integrated and automated multiplex vector-diagnostic platform (LabDisk) for characterizing mosquito populations in terms of species composition, Plasmodium infections and biochemical insecticide resistance markers. The LabDisk will be interfaced with a Disease Data Management System (DDMS), custom-made data management software which will collate and manage data from routine entomological monitoring activities, providing information in a timely fashion, based on user needs, and in a standardized way. ResistanceSim, a serious game built on a modern ICT platform that uses interactive ways of communicating guidelines and exemplifying good practice in the optimal use of interventions in the health sector, will also be a key element. The use of the tool will teach operational end users the value of quality data (relevant, timely and accurate) for making informed decisions. The integrated system (LabDisk, DDMS & ResistanceSim) will be evaluated in four malaria-endemic countries (Cameroon, Ivory Coast, Ethiopia and Zambia), highly representative of malaria settings in sub-Saharan Africa with different levels of endemicity and vector control challenges, to support informed decision-making in vector control and disease management.

  5. Investigation of the oxidation of methyl vinyl ketone (MVK) by OH radicals in the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik; Albrecht, Sascha; Acir, Ismail-Hakki; Bohn, Birger; Breitenlechner, Martin; Dorn, Hans-Peter; Gkatzelis, Georgios I.; Hofzumahaus, Andreas; Holland, Frank; Kaminski, Martin; Keutsch, Frank N.; Novelli, Anna; Reimer, David; Rohrer, Franz; Tillmann, Ralf; Vereecken, Luc; Wegener, Robert; Zaytsev, Alexander; Kiendler-Scharr, Astrid; Wahner, Andreas

    2018-06-01

    The photooxidation of methyl vinyl ketone (MVK) was investigated in the atmospheric simulation chamber SAPHIR for conditions at which organic peroxy radicals (RO2) mainly reacted with NO (high NO case) and for conditions at which other reaction channels could compete (low NO case). Measurements of trace gas concentrations were compared to calculated concentration time series applying the Master Chemical Mechanism (MCM version 3.3.1). Product yields of methylglyoxal and glycolaldehyde were determined from measurements. For the high NO case, the methylglyoxal yield was (19 ± 3) % and the glycolaldehyde yield was (65 ± 14) %, consistent with recent literature studies. For the low NO case, the methylglyoxal yield was reduced to (5 ± 2) % because other RO2 reaction channels that do not form methylglyoxal became important. Consistent with literature data, the glycolaldehyde yield of (37 ± 9) % determined in the experiment was not reduced as much as implemented in the MCM, suggesting additional reaction channels producing glycolaldehyde. At the same time, direct quantification of OH radicals in the experiments shows the need for enhanced OH radical production at low NO conditions, similar to previous studies investigating the oxidation of the parent VOC isoprene and of methacrolein, the second major oxidation product of isoprene. For MVK the model-measurement discrepancy was up to a factor of 2. Product yields and OH observations were consistent with assumptions of additional RO2 plus HO2 reaction channels as proposed in the literature for the major RO2 species formed from the reaction of MVK with OH. However, this study shows that HO2 radical concentrations are also underestimated by the model, suggesting that additional OH is not directly produced from RO2 radical reactions, but indirectly via increased HO2. Quantum chemical calculations show that HO2 could be produced from a fast 1,4-H shift of the second most important MVK-derived RO2 species (reaction rate constant 0
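
    Product yields of this kind are commonly derived as the slope of product formed versus parent consumed; a small sketch with invented concentrations reproducing a ~65 % glycolaldehyde yield:

      import numpy as np

      # Synthetic chamber data (ppbv); not measurements from the study.
      mvk_consumed   = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
      glycolaldehyde = np.array([0.0, 0.33, 0.64, 0.98, 1.31])

      # Yield = slope of product formed vs. parent consumed.
      yield_gly, _ = np.polyfit(mvk_consumed, glycolaldehyde, 1)
      print(f"glycolaldehyde yield ~ {yield_gly:.0%}")  # ~65%, as in the high NO case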

  6. Follow up: Compound data sets and software tools for chemoinformatics and medicinal chemistry applications: update and data transfer

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2014-01-01

    In 2012, we reported 30 compound data sets and/or programs developed in our laboratory in a data article and made them freely available to the scientific community to support chemoinformatics and computational medicinal chemistry applications. These data sets and computational tools were provided for download from our website. Since publication of this data article, we have generated 13 new data sets with which we further extend our collection of publicly available data and tools. Due to changes in web servers and website architectures, data accessibility has recently been limited at times. Therefore, we have also transferred our data sets and tools to a public repository to ensure full and stable accessibility. To aid in data selection, we have classified the data sets according to scientific subject areas. Herein, we describe new data sets, introduce the data organization scheme, summarize the database content and provide detailed access information in ZENODO (doi:10.5281/zenodo.8451 and doi:10.5281/zenodo.8455). PMID:25520777

  7. An approach and a tool for setting sustainable energy retrofitting strategies referring to the 2010 EP

    Directory of Open Access Journals (Sweden)

    Charlot-Valdieu, C.

    2011-10-01

    Full Text Available The 2010 EPBD asks for an economic and social analysis in order to preserve social equity and to promote innovation and building productivity. This is possible with a life cycle energy cost (LCEC) analysis, such as with the SEC (Sustainable Energy Cost) model, whose bottom-up approach begins with a building typology including inhabitants. The analysis of some representative buildings then includes the identification of a technico-economical optimum and of energy retrofitting scenarios for each retrofitting programme, followed by an extrapolation to the whole building stock. The extrapolation for the whole building stock allows the strategy to be set up and the means needed for reaching the objectives to be identified. SEC is a decision-aid tool for optimising sustainable energy retrofitting strategies for buildings at territorial and patrimonial scales within a sustainable development approach towards Factor 4. Various versions of the SEC model are now available, for housing and for tertiary buildings.

    [Translated from Spanish] The 2010 European directive on the energy performance of buildings requires an economic and social analysis, with the objective of preserving social equity, promoting innovation and strengthening productivity in construction. This is possible with an extended life cycle cost analysis, and especially with the SEC model. The bottom-up analysis performed with SEC is based on a building/user typology and on the analysis of representative buildings: identification of the technico-economic optimum and elaboration of scenarios before extrapolating to the whole building stock. SEC is a decision-aid tool for developing territorial or patrimonial energy retrofitting strategies. Several versions of the model exist: for residential buildings (single-family and multi-family, public and private) and for tertiary buildings.
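
    A minimal sketch of a life cycle energy cost comparison in the spirit of the SEC model; all investment figures, energy costs, horizon and discount rate below are placeholders, not values from the model.

      # LCEC = investment + discounted energy bills over the study period.
      def life_cycle_cost(investment, annual_energy_cost, years=30, discount=0.04):
          discounted = sum(annual_energy_cost / (1 + discount) ** t
                           for t in range(1, years + 1))
          return investment + discounted

      scenarios = {"no retrofit": (0, 2000), "basic": (15000, 1200), "deep": (40000, 500)}
      for name, (inv, energy) in scenarios.items():
          print(f"{name:12s} LCEC = {life_cycle_cost(inv, energy):9.0f} EUR")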

  8. Comparison of OH Reactivity Instruments in the Atmosphere Simulation Chamber SAPHIR.

    Science.gov (United States)

    Fuchs, H.; Novelli, A.; Rolletter, M.; Hofzumahaus, A.; Pfannerstill, E.; Edtbauer, A.; Kessel, S.; Williams, J.; Michoud, V.; Dusanter, S.; Locoge, N.; Zannoni, N.; Gros, V.; Truong, F.; Sarda Esteve, R.; Cryer, D. R.; Brumby, C.; Whalley, L.; Stone, D. J.; Seakins, P. W.; Heard, D. E.; Schoemaecker, C.; Blocquet, M.; Fittschen, C. M.; Thames, A. B.; Coudert, S.; Brune, W. H.; Batut, S.; Tatum Ernest, C.; Harder, H.; Elste, T.; Bohn, B.; Hohaus, T.; Holland, F.; Muller, J. B. A.; Li, X.; Rohrer, F.; Kubistin, D.; Kiendler-Scharr, A.; Tillmann, R.; Andres, S.; Wegener, R.; Yu, Z.; Zou, Q.; Wahner, A.

    2017-12-01

    Two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016 to compare hydroxyl (OH) radical reactivity (kOH) measurements. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapor, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements is higher for instruments directly detecting hydroxyl radicals (OH), whereas the indirect Comparative Reactivity Method (CRM) has a higher limit of detection of 2 s−1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapor or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to the reference were observed by CRM instruments in the presence of terpenes and oxygenated organic compounds. In some of these experiments, only a small fraction of the reactivity was detected. The accuracy of CRM measurements is most likely limited by the corrections that need to be applied in order to account for known effects of, for example, deviations from pseudo-first-order conditions, nitrogen oxides or water vapor on the measurement.

  9. Comparison of OH reactivity measurements in the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik; Novelli, Anna; Rolletter, Michael; Hofzumahaus, Andreas; Pfannerstill, Eva Y.; Kessel, Stephan; Edtbauer, Achim; Williams, Jonathan; Michoud, Vincent; Dusanter, Sebastien; Locoge, Nadine; Zannoni, Nora; Gros, Valerie; Truong, Francois; Sarda-Esteve, Roland; Cryer, Danny R.; Brumby, Charlotte A.; Whalley, Lisa K.; Stone, Daniel; Seakins, Paul W.; Heard, Dwayne E.; Schoemaecker, Coralie; Blocquet, Marion; Coudert, Sebastien; Batut, Sebastien; Fittschen, Christa; Thames, Alexander B.; Brune, William H.; Ernest, Cheryl; Harder, Hartwig; Muller, Jennifer B. A.; Elste, Thomas; Kubistin, Dagmar; Andres, Stefanie; Bohn, Birger; Hohaus, Thorsten; Holland, Frank; Li, Xin; Rohrer, Franz; Kiendler-Scharr, Astrid; Tillmann, Ralf; Wegener, Robert; Yu, Zhujun; Zou, Qi; Wahner, Andreas

    2017-10-01

    Hydroxyl (OH) radical reactivity (kOH) has been measured for 18 years with different measurement techniques. In order to compare the performances of instruments deployed in the field, two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. All types of instruments that are currently used for atmospheric measurements were used in one of the two campaigns. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapour, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements (limit of detection < 1 s−1 at a time resolution of 30 s to a few minutes) is higher for instruments directly detecting hydroxyl radicals, whereas the indirect comparative reactivity method (CRM) has a higher limit of detection of 2 s−1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapour or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to reference measurements or to calculated reactivity were observed by CRM instruments in the presence of terpenes and oxygenated organic compounds (mixing ratios of OH reactants were up to 10 ppbv). In some of these experiments, only a small fraction of the reactivity was detected. The accuracy of CRM
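
    The quantity compared in these campaigns can also be calculated from trace gas concentrations as kOH = sum over species of k_i·[X_i]; a sketch with approximate 298 K rate constants (literature values, verify before reuse):

      # Calculated OH reactivity for a simple gas mixture.
      N_AIR = 2.46e19  # molecules cm^-3 at 1 atm, 298 K

      # Approximate rate constants, cm^3 molecule^-1 s^-1 (assumed, not from this record).
      k = {"CO": 2.4e-13, "CH4": 6.4e-15, "isoprene": 1.0e-10}
      mixing_ratio_ppbv = {"CO": 200.0, "CH4": 1900.0, "isoprene": 1.0}

      k_oh = sum(k[s] * mixing_ratio_ppbv[s] * 1e-9 * N_AIR for s in k)
      print(f"calculated OH reactivity: {k_oh:.2f} s^-1")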

  10. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    Science.gov (United States)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 130159 No. of bytes in distributed program, including test data, etc.: 24 687 455 Distribution format: tar.gz Programming language: C++. Computer: Any computer capable of running an executable produced by a g++ compiler (version 4.6.3). Operating system: Linux, Unix OSes. RAM: 1 073 741 824 bytes Classification: 3. Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model. Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible

  11. Early wound infection identification using the WIRE tool in community health care settings: An audit report.

    Science.gov (United States)

    Siaw-Sakyi, Vincent

    2017-12-01

    Wound infection is proving to be a challenge for health care professionals. The associated complications and costs of wound infection are immense and can lead to death in extreme cases. Current management of wound infection is largely subjective and relies on the knowledge of the health care professional to identify and initiate treatment. In response, we have developed an infection prediction and assessment tool. The Wound Infection Risk-Assessment and Evaluation (WIRE) tool and its management strategy aim to bring objectivity to infection prediction, assessment and management. A local audit indicated a high infection prediction rate. More work is being done to improve its effectiveness.

  12. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, and also enables experimental measurements after compiling to configurable systems, within the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the user's high-level block description to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog–digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  13. Selection of the optimal set of revenue management tools in hotels

    OpenAIRE

    Korzh, Nataliia; Onyshchuk, Natalia

    2017-01-01

    The object of research is the scientific category «revenue management» and its tools, which, with the growth in the number of online sales channels for hotel services, become decisive in the struggle for survival. The existence of a large number of profit management tools associated with the online booking regime works as small data and gives quite scattered information about the state of the market. One of the most problematic areas is the formation of perspective analytics using existing too...

  14. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  15. Developing a free and easy to use digital goal setting tool for busy mums

    Directory of Open Access Journals (Sweden)

    Babs Evans

    2015-09-01

    Using data, research and the expertise of commercial and charity partners was an effective way to design a digital product to support behavioural change. By understanding the target audience from the beginning and involving them in the planning stages, the organisations were able to develop a tool that users want, with a strong focus on user experience.

  16. Contribution to the microwave characterisation of superconductive materials by means of sapphire resonators; Contribution a la caracterisation hyperfrequence de materiaux supraconducteurs par des resonateurs-saphirs

    Energy Technology Data Exchange (ETDEWEB)

    Hanus, Xavier

    1993-12-06

    The objective of this research thesis is to find a compact resonant structure which would allow the residual surface impedance of superconductive samples to be characterised simply, quickly and economically. The author first explains why he decided to use a sapphire single-crystal as inner dielectric, given the performance reached by resonant structures equipped with such inner dielectrics, and given constraints adopted from the start. He explains the origin of the microwave losses which appear in this type of resonant structure: the surface impedance as far as metallic losses are concerned, and the sapphire dielectric loss angle as far as dielectric losses are concerned. The experimental installation and the principle of microwave measurements are described. The performance of the different possible resonant structures against the starting criteria is presented, and the solution of a cavity-sapphire with a TE011 resonant mode is derived. [Translated from French] The aim of this study is to find a compact resonant structure permitting the residual surface impedance of superconducting samples to be characterised simply, quickly and economically. The implementation constraints and the performance achieved by resonators with synthetic sapphires justify the choice of such a low-loss-angle dielectric. The evaluation of experimental performance, supported by analytical models, allows several solutions to be rejected. Closed resonators with thin sapphires are rejected because of poor metallic contacts. Open resonators with thin and thick sapphires are likewise rejected, even for resonance modes that are in principle confined, because of radiation losses. The only solution is therefore to use a TE011 cavity-sapphire, which offers a naturally confined field configuration. Measurements on a first bulk niobium cavity allowed the selection of a sapphire obtained by
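
    For orientation, the extraction of a residual surface resistance from the measured quality factor of a dielectric-loaded cavity is often written as 1/Q0 = Rs/G + p·tanδ, with G a geometry factor and p the dielectric filling factor; the numbers below are placeholders, not values from the thesis.

      # Hedged sketch: solve 1/Q0 = Rs/G + p*tan(delta) for Rs.
      def surface_resistance(q0, geometry_factor, filling_factor, tan_delta):
          return geometry_factor * (1.0 / q0 - filling_factor * tan_delta)

      rs = surface_resistance(q0=5e8, geometry_factor=280.0,
                              filling_factor=0.9, tan_delta=1e-9)
      print(f"residual surface resistance ~ {rs * 1e9:.0f} nOhm")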

  17. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.
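
    The simplest form of the cross-experiment comparison PACOM automates is an overlap of identification lists; a toy sketch with invented accession numbers:

      run_a = {"P04637", "P38398", "Q9Y6K9", "P01308"}   # identifications, run A (invented)
      run_b = {"P04637", "P01308", "O15350"}             # identifications, run B (invented)

      print("shared:", sorted(run_a & run_b))
      print("only in A:", sorted(run_a - run_b))
      print("Jaccard overlap:", len(run_a & run_b) / len(run_a | run_b))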

  18. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vá zquez Reina, Amelio; Reid, Rollie Clay; Lichtman, Jeff W M D; Pfister, Hanspeter

    2010-01-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  19. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki

    2010-05-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  20. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    Science.gov (United States)

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through the Mental Health Gap Action Programme. This study assessed the feasibility of using developmental screening and monitoring tools for children aged 0-3 years by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and evaluate their feasibility of use in low- and middle-income countries (LMICs). Key indicators to examine feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria. Three tools appear promising for use in identifying and monitoring young children with disabilities at the primary healthcare level in LMICs. Further research and development are needed to optimize these tools.

  1. Comparison of OH reactivity measurements in the atmospheric simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2017-10-01

    Full Text Available Hydroxyl (OH) radical reactivity (kOH) has been measured for 18 years with different measurement techniques. In order to compare the performances of instruments deployed in the field, two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. All types of instruments that are currently used for atmospheric measurements were used in one of the two campaigns. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapour, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements (limit of detection < 1 s−1 at a time resolution of 30 s to a few minutes) is higher for instruments directly detecting hydroxyl radicals, whereas the indirect comparative reactivity method (CRM) has a higher limit of detection of 2 s−1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapour or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to reference measurements or to calculated reactivity were observed by CRM instruments in

  2. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  3. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  4. Developmental Screening Tools: Feasibility of Use at Primary Healthcare Level in Low- and Middle-income Settings

    OpenAIRE

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-01-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through the Mental Health Gap Action Programme. The study identified the feasibility of using developmental screening and monitoring tools for children aged 0-3 years by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducte...

  5. Reachable Sets of Hidden CPS Sensor Attacks : Analysis and Synthesis Tools

    NARCIS (Netherlands)

    Murguia, Carlos; van de Wouw, N.; Ruths, Justin; Dochain, Denis; Henrion, Didier; Peaucelle, Dimitri

    2017-01-01

    For given system dynamics, control structure, and fault/attack detection procedure, we provide mathematical tools, in terms of Linear Matrix Inequalities (LMIs), for characterizing and minimizing the set of states that sensor attacks can induce in the system while keeping the alarm rate of the
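
    A flavor of the LMI machinery referred to above (not the paper's exact conditions): a discrete-time Lyapunov feasibility problem posed with cvxpy.

      import cvxpy as cp
      import numpy as np

      A = np.array([[0.9, 0.2], [0.0, 0.8]])   # example stable system matrix (assumed)
      P = cp.Variable((2, 2), symmetric=True)
      eps = 1e-6
      constraints = [P >> eps * np.eye(2),                      # P positive definite
                     A.T @ P @ A - P << -eps * np.eye(2)]       # Lyapunov decrease
      cp.Problem(cp.Minimize(0), constraints).solve()
      print("P =\n", P.value)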

  6. Using Multiattribute Utility Theory as a Priority-Setting Tool in Human Services Planning.

    Science.gov (United States)

    Camasso, Michael J.; Dick, Janet

    1993-01-01

    The feasibility of applying multiattribute utility theory to the needs assessment and priority-setting activities of human services planning councils was studied in Essex County (New Jersey). Decision-making and information filtering processes are explored in the context of community planning. (SLD)

  7. Setting up a decontamination and dismantling (D and D) scenario - methodology and tools developed: LEOPARD

    International Nuclear Information System (INIS)

    Pradoura, F.

    2009-01-01

    At the AREVA NC La Hague site, the former nuclear spent fuel reprocessing plant UP2-400 was shut down on December 30, 2003. Since then, the cleaning up and dismantling activities have been carried out by the DV/PRO project, which is the program management organization set up by AREVA NC for valorization projects. SGN, part of the AREVA NC Engineering Business Unit, operates as the main contractor of the DV/PRO project and provides project management services related to decommissioning and waste management. Hence, SGN is in charge of building D and D scenarios for all the facilities of the UP2-400 plant, in compliance with safety, technical and financial requirements. Main outputs are logic diagrams, block flow diagrams, and waste and effluent throughputs. In order to meet AREVA NC's requirements and expectations, SGN developed specific processes, methods and tools adapted to the scale and complexity of decommissioning a plant with several facilities and with different kinds of processes (chemical, mechanical), some of which are in operation and others being dismantled. Considering the number of technical data and inputs to be managed, this methodology leads to complex outputs such as schedules, throughputs, work packages... The development, maintenance and modification of these outputs become more and more difficult with the complexity and the size of the plant considered. To cope with these issues, the SGN CDE/DEM UP2-400 project team has developed a dedicated tool to assist in and optimize the elaboration of D and D scenarios. This tool is named LEOPARD (Logiciel d'Elaboration et d'Optimisation des Programmes d'Assainissement Radiologique et de Demantelement; Software for the Development and Optimization of Radiological Clean-up and Dismantling Programs). The availability of this tool allowed the rapid construction of a test case (demonstrator) that has convinced DV/PRO of its numerous advantages and of its potential for further development. Presentations of LEOPARD

  8. First measurement of the reactions γp→K+Λ and γp→K+Σ0 with SAPHIR at ELSA

    International Nuclear Information System (INIS)

    Lindemann, L.

    1993-04-01

    This report can be subdivided into two main parts. The first part concerns the reconstruction program which has been developed to analyse the data taken with the large solid angle detector SAPHIR, which is in operation at the Bonn electron accelerator facility ELSA. A survey of this program is given, and some improvements as well as its efficiency on real data are discussed. The second part concerns the measurements of the reactions γp→K+Λ and γp→K+Σ0. The analysis of a sample of data taken with SAPHIR in June 1992 is discussed in detail. As a result of this analysis, total and differential cross sections as well as the recoil polarization for the two processes are presented. In particular, the first measurement of the Σ0 polarization in photoproduction can be reported. (orig.)

  9. Risk Jyouhou Navi (risk information navigator). Web tool for fostering of risk literacy. Set of data

    International Nuclear Information System (INIS)

    Mitsui, Seiichiro

    2003-06-01

    In addition to conventional public understanding activities, the Risk Communication Study Team of the Japan Nuclear Cycle Development Institute's (JNC) Tokai Works has started practical studies to promote risk communication with its local communities. Since its establishment in 2001, the Risk Communication Study Team has conducted analyses of already available results of public attitude surveys, case studies of domestic and overseas risk communication activities, and development of risk communication tools. A web tool for fostering risk literacy, 'Risk Jyouhou Navi' (risk information navigator in English), was developed as web content for the official home page of 'Techno Kouryuu Kan Ricotti' (Techno Community Square Ricotti in English). The objectives of this content are to provide risk information for the public and to provide an electronic platform for promoting risk communication with the local community. To develop 'Risk Jyouhou Navi', the following concepts were considered: 1) to create public interest in risks in daily life and in global risks; 2) to provide risk knowledge and information; 3) to support risk communication activities in Techno Community Square Ricotti. (author)

  10. TAM 2.0: tool for MicroRNA set analysis.

    Science.gov (United States)

    Li, Jianwei; Han, Xiaofen; Wan, Yanping; Zhang, Shan; Zhao, Yingshu; Fan, Rui; Cui, Qinghua; Zhou, Yuan

    2018-06-06

    With the rapid accumulation of high-throughput microRNA (miRNA) expression profiles, up-to-date resources for analyzing the functional and disease associations of miRNAs are increasingly in demand. We here describe the updated server TAM 2.0 for miRNA set enrichment analysis. Through manual curation of over 9000 papers, a more than two-fold growth of reference miRNA sets has been achieved in comparison with the previous TAM, covering 9945 and 1584 newly collected miRNA-disease and miRNA-function associations, respectively. Moreover, TAM 2.0 allows users not only to test the functional and disease annotations of miRNAs by overrepresentation analysis, but also to compare the input de-regulated miRNAs with miRNAs de-regulated in other disease conditions via correlation analysis. Finally, functions for miRNA set query and result visualization are also enabled in the TAM 2.0 server to facilitate the community. The TAM 2.0 web server is freely accessible at http://www.scse.hebut.edu.cn/tam/ or http://www.lirmed.com/tam2/.
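
    Overrepresentation analysis of the kind TAM 2.0 performs reduces, per category, to a hypergeometric test; a sketch with invented counts:

      from scipy.stats import hypergeom

      # k of n query miRNAs fall in a reference set of K miRNAs out of M annotated ones.
      M, K, n, k = 1000, 50, 30, 8   # background, category size, query size, hits (invented)
      p_value = hypergeom.sf(k - 1, M, K, n)   # P(X >= k)
      print(f"enrichment p-value = {p_value:.2e}")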

  11. Laparohysteroscopy in female infertility: A diagnostic cum therapeutic tool in Indian setting.

    Science.gov (United States)

    Puri, Suman; Jain, Dinesh; Puri, Sandeep; Kaushal, Sandeep; Deol, Satjeet Kaur

    2015-01-01

    To evaluate the role of laparohysteroscopy in female infertility and to study the effect of therapeutic procedures in achieving fertility. Patients with female infertility presenting to the outpatient Department of Obstetrics and Gynecology were evaluated over a period of 18 months. Fifty consenting subjects, excluding male factor infertility, with normal hormonal profiles and no contraindication to laparoscopy were subjected to diagnostic laparoscopy and hysteroscopy. Statistical analysis: t-test. We studied 50 patients, comprising 24 (48%) cases of primary infertility and 26 (52%) patients with secondary infertility. The average duration of active married life for the 50 patients was between 8 and 9 years. In our study, the most commonly found pathologies were PCOD, endometriosis and tubal blockage. Eleven (28.2%) patients conceived after laparohysteroscopy followed by artificial reproductive techniques. This study demonstrates the benefit of laparohysteroscopy for diagnosis and as a therapeutic tool in patients with primary and secondary infertility. We were able to achieve a higher conception rate of 28.2%.

  12. High Resolution Manometry - an underappreciated tool for examination of dysphagia in a surgical setting

    DEFF Research Database (Denmark)

    Jensen, Jonas Sanberg

    Introduction Examination of dysphagia in Danish surgical departments relies primarily on upper gastrointestinal endoscopy. When no visible or histological cause can be detected, esophageal motility disorders are important differential diagnoses. In examining these disorders, and in evaluating gastroesophageal reflux disorder (GERD), High Resolution Esophageal Manometry (HRM) provides valuable insights. The purpose of this study was to examine referrals and final diagnoses from HRM in a surgical center specializing in esophageal disorders. Methods and Procedures All patients referred to HRM at our.......1% based on 10419 endoscopies. Conclusion HRM is an important diagnostic tool that supplements upper gastrointestinal endoscopy in the examination of dysphagia as well as GERD, with significant differences in patterns of motility disorders. Knowledge and availability of HRM increases use at a surgical center...

  13. Development of a Physical Environmental Observational Tool for Dining Environments in Long-Term Care Settings.

    Science.gov (United States)

    Chaudhury, Habib; Keller, Heather; Pfisterer, Kaylen; Hung, Lillian

    2017-11-10

    This paper presents the first standardized physical environmental assessment tool, the Dining Environment Audit Protocol (DEAP), specifically designed for dining spaces in care homes, and reports the results of its psychometric evaluation. Items rated include: adequacy of lighting, glare, personal control, clutter, staff supervision support, restraint use, and seating arrangement options for social interaction. Two scales summarize the prior items and rate the overall homelikeness and functionality of the space. Ten dining rooms in three long-term care homes were selected for assessment. Data were collected over 11 days across 5 weeks. Two trained assessors completed DEAP independently on the same day. Interrater reliability was assessed for lighting, glare, space, homelike aspects, seating arrangements and the two summary scales, homelikeness and functionality of the space. For categorical measures, responses were dichotomized at logical points and Cohen's kappa and concordance on ratings were determined. The two overall rating scales on homelikeness and functionality of space were found to be reliable (intraclass correlation coefficient (ICC) ~0.7). The mean rating for homelikeness for Assessor 1 was 3.5 (SD 1.35) and for functionality of the room was 5.3 (SD 0.82; median 5.5). The findings indicate that the tool's interrater reliability scores are promising. The high concordance on the overall scores for homelikeness and functionality is indicative of the strength of the individual items in generating a reliable global assessment score on these two important aspects of the dining space. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
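
    Interrater agreement of the type reported here can be computed with scikit-learn's Cohen's kappa; the two assessors' dichotomized ratings below are invented:

      from sklearn.metrics import cohen_kappa_score

      assessor_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # dichotomized ratings (invented)
      assessor_2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]

      print("Cohen's kappa:", cohen_kappa_score(assessor_1, assessor_2))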

  14. Development and verification of a leningrad NPP unit 1 living PSA model in the INL SAPHIRE code format for prompt operational safety level monitoring

    International Nuclear Information System (INIS)

    Bronislav, Vinnikov

    2007-01-01

    The first part of the paper presents results of work that was carried out in complete conformity with the Technical Assignment developed by the Leningrad Nuclear Power Plant. The initial scientific and technical information, contained in the In-Depth Safety Assessment Reports, was given to the author of the work. This information included graphical Fault Trees of Safety Systems and Auxiliary Technical Systems, Event Trees for the necessary number of Initiating Events, and also information about failure probabilities of basic components of the nuclear unit. On the basis of this information, fed into the USA Idaho National Laboratory (INL) SAPHIRE code, we have developed an electronic version of the database for failure probabilities of the components of technical systems. We have then developed electronic versions of the necessary Fault Trees and of the necessary Event Trees. Finally, we have carried out the linkage of the Event Trees. This work has resulted in the Living PSA (LPSA - Living Probabilistic Safety Assessment) model of Leningrad NPP Unit 1. The LPSA model is completely adapted to be consistent with the USA INL SAPHIRE Risk Monitor. The second part of the paper presents an analysis of fire consequences in various places of Leningrad NPP Unit 1. The computations were carried out with the help of the LPSA model, developed in the SAPHIRE code format. On the basis of the computations, the order of priority for the implementation of fire prevention measures was established. (author)
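
    A toy version of the fault tree quantification a code like SAPHIRE performs, with a two-train system and invented failure probabilities:

      def p_or(*ps):   # probability that at least one independent event occurs
          prod = 1.0
          for p in ps:
              prod *= (1.0 - p)
          return 1.0 - prod

      def p_and(*ps):  # probability that all independent events occur
          prod = 1.0
          for p in ps:
              prod *= p
          return prod

      pump_a, pump_b, common_cause = 1e-3, 1e-3, 1e-5   # invented basic-event probabilities
      top = p_or(p_and(pump_a, pump_b), common_cause)   # both trains fail, or common cause
      print(f"top event probability = {top:.2e}")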

  15. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Science.gov (United States)

    Siegelmann-Danieli, Nava; Farkash, Ariel; Katzir, Itzhak; Vesterman Landes, Janet; Rotem Rabinovich, Hadas; Lomnicky, Yossef; Carmeli, Boaz; Parush-Shear-Yashuv, Naama

    2016-01-01

    Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological), per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months and median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months). Analyses controlling for age and gender identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton-pump inhibitors. Our tool provided insights that confirmed/complemented information gained from randomized clinical trials. Prospective tool implementation is warranted.
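
    The survival summaries used in the study (e.g. median OS) can be sketched with the lifelines library; the follow-up data below are synthetic:

      from lifelines import KaplanMeierFitter

      durations = [20.5, 14.2, 30.1, 8.7, 25.0, 18.7, 40.3, 12.3]   # months (synthetic)
      observed  = [1, 1, 0, 1, 1, 1, 0, 1]                          # 1 = death observed

      kmf = KaplanMeierFitter().fit(durations, event_observed=observed)
      print("median OS (months):", kmf.median_survival_time_)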

  16. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  17. The L3+C detector, a unique tool-set to study cosmic rays

    International Nuclear Information System (INIS)

    Adriani, O.; Akker, M. van den; Banerjee, S.; Baehr, J.; Betev, B.; Bourilkov, D.; Bottai, S.; Bobbink, G.; Cartacci, A.; Chemarin, M.; Chen, G.; Chen, H.S.; Chiarusi, T.; Dai, C.J.; Ding, L.K.; Duran, I.; Faber, G.; Fay, J.; Grabosch, H.J.; Groenstege, H.; Guo, Y.N.; Gupta, S.; Haller, Ch.; Hayashi, Y.; He, Z.X.; Hebbeker, T.; Hofer, H.; Hoferjun, H.; Huo, A.X.; Ito, N.; Jing, C.L.; Jones, L.; Kantserov, V.; Kawakami, S.; Kittel, W.; Koenig, A.C.; Kok, E.; Korn, A.; Kuang, H.H.; Kuijpers, J.; Ladron de Guevara, P.; Le Coultre, P.; Lei, Y.; Leich, H.; Leiste, R.; Li, D.; Li, L.; Li, Z.C.; Liu, Z.A.; Liu, H.T.; Lohmann, W.; Lu, Y.S.; Ma, X.H.; Ma, Y.Q.; Mil, A. van; Monteleoni, B.; Nahnhauer, R.; Pauss, F.; Parriaud, J.-F.; Petersen, B.; Pohl, M.; Qing, C.R.; Ramelli, R.; Ravindran, K.C.; Rewiersma, P.; Rojkov, A.; Saidi, R.; Schmitt, V.; Schoeneich, B.; Schotanus, D.J.; Shen, C.Q.; Sulanke, H.; Tang, X.W.; Timmermans, C.; Tonwar, S.; Trowitzsch, G.; Unger, M.; Verkooijen, H.; Wang, X.L.; Wang, X.W.; Wang, Z.M.; Wijk, R. van; Wijnen, Th.A.M.; Wilkens, H.; Xu, Y.P.; Xu, Z.Z.; Yang, C.G.; Yang, X.F.; Yao, Z.G.; Yu, Z.Q.; Zhang, S.; Zhu, G.Y.; Zhu, Q.Q.; Zhuang, H.L.; Zwart, A.N.M.

    2002-01-01

    The L3 detector at the CERN electron-positron collider, LEP, has been employed for the study of cosmic ray muons. The muon spectrometer of L3 consists of a set of high-precision drift chambers installed inside a magnet with a volume of about 1000 m³ and a field of 0.5 T. Muon momenta are measured with a resolution of a few percent at 50 GeV. The detector is located under 30 m of overburden. A scintillator air shower array of 54 m by 30 m is installed on the roof of the surface hall above L3 in order to estimate the energy and the core position of the shower associated with a sample of detected muons. Thanks to the unique properties of the L3+C detector, muon research topics relevant to various current problems in cosmic ray and particle astrophysics can be studied.

  18. The L3+C detector, a unique tool-set to study cosmic rays

    CERN Document Server

    Adriani, O; Banerjee, S; Bähr, J; Betev, B L; Bourilkov, D; Bottai, S; Bobbink, Gerjan J; Cartacci, A M; Chemarin, M; Chen, G; Chen He Sheng; Chiarusi, T; Dai Chang Jiang; Ding, L K

    2002-01-01

    The L3 detector at the CERN electron-positron collider, LEP, has been employed for the study of cosmic ray muons. The muon spectrometer of L3 consists of a set of high-precision drift chambers installed inside a magnet with a volume of about 1000 m**3 and a field of 0.5 T. Muon momenta are measured with a resolution of a few percent at 50 GeV. The detector is located under 30 m of overburden. A scintillator air shower array of 54 m by 30 m is installed on the roof of the surface hall above L3 in order to estimate the energy and the core position of the shower associated with a sample of detected muons. Thanks to the unique properties of the L3+C detector, muon research topics relevant to various current problems in cosmic ray and particle astrophysics can be studied.

  19. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting

    Directory of Open Access Journals (Sweden)

    Francisco José Cereceda-Sánchez

    Full Text Available Objective: to evaluate the usefulness of capnography for the detection of metabolic changes in spontaneously breathing patients, in the emergency and intensive care settings. Methods: in-depth and structured bibliographical search in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. Results: 19 studies were found; two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), in eight with bicarbonate (HCO3), in three with lactate, and in four with blood pH. Conclusions: most studies found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter for detecting patients at risk of severe metabolic change in a fast, economical and accurate way.

  20. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amounts of data which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as Open Source software (https://github.com/pmadanecki/htdp).
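
    The merge-and-filter pattern HTDP offers through its GUI looks roughly as follows in pandas; the file and column names are hypothetical:

      import pandas as pd

      variants = pd.read_csv("sample_variants.tsv", sep="\t")      # hypothetical input file
      annotations = pd.read_csv("gene_annotations.tsv", sep="\t")  # hypothetical input file

      # Merge on a shared key, then filter with external-style criteria.
      merged = variants.merge(annotations, on="gene", how="inner")
      rare_damaging = merged[(merged["allele_freq"] < 0.01) &
                             (merged["effect"] == "missense")]
      rare_damaging.to_csv("filtered_variants.tsv", sep="\t", index=False)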

  1. FunGeneNet: a web tool to estimate enrichment of functional interactions in experimental gene sets.

    Science.gov (United States)

    Tiys, Evgeny S; Ivanisenko, Timofey V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2018-02-09

    Estimation of functional connectivity in gene sets derived from genome-wide or other biological experiments is one of the essential tasks of bioinformatics. A promising approach for solving this problem is to compare gene networks built using experimental gene sets with random networks. One of the resources that make such an analysis possible is CrossTalkZ, which uses the FunCoup database. However, existing methods, including CrossTalkZ, do not take into account individual types of interactions, such as protein/protein interactions, expression regulation, transport regulation, catalytic reactions, etc., but rather work with generalized types characterizing the existence of any connection between network members. We developed the online tool FunGeneNet, which utilizes the ANDSystem and STRING to reconstruct gene networks using experimental gene sets and to estimate their difference from random networks. To compare the reconstructed networks with random ones, the node permutation algorithm implemented in CrossTalkZ was taken as a basis. To study the applicability of FunGeneNet, a functional connectivity analysis of networks constructed for gene sets involved in Gene Ontology biological processes was conducted. We showed that the method sensitivity exceeds 0.8 at a specificity of 0.95. We found that the significance level of the difference between gene networks of biological processes and random networks is determined by the type of connections considered between objects. At the same time, the highest reliability is achieved for the generalized form of connections that takes into account all the individual types of connections. Taking the thyroid cancer networks and the apoptosis network as examples, we demonstrate that key participants in these processes are involved in the interactions of those types by which these networks differ from random ones. FunGeneNet is a web tool aimed at proving the functionality of networks in a wide range of sizes of experimental gene sets.
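
    A minimal sketch of the node-permutation comparison described above, simplified to a single generalized interaction type and without the degree-preserving constraints used by CrossTalkZ (the network, gene set, and iteration count are invented):

        import random
        import networkx as nx

        def permutation_z(graph, gene_set, n_perm=1000, seed=0):
            """Z-score of the edge count among gene_set vs. same-size random node sets."""
            rng = random.Random(seed)
            observed = graph.subgraph(gene_set).number_of_edges()
            nodes = list(graph.nodes)
            null = []
            for _ in range(n_perm):
                sample = rng.sample(nodes, len(gene_set))
                null.append(graph.subgraph(sample).number_of_edges())
            mean = sum(null) / n_perm
            sd = (sum((x - mean) ** 2 for x in null) / (n_perm - 1)) ** 0.5
            return (observed - mean) / max(sd, 1e-9)

        # Toy usage: a random background network and a candidate gene set.
        g = nx.erdos_renyi_graph(200, 0.05, seed=1)
        print(permutation_z(g, set(range(10))))

    A strongly positive z-score indicates more links among the experimental genes than expected by chance, which is the sense in which a network "differs from random" here.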

  2. Validation of Nurse Practitioner Primary Care Organizational Climate Questionnaire: A New Tool to Study Nurse Practitioner Practice Settings.

    Science.gov (United States)

    Poghosyan, Lusine; Chaplin, William F; Shaffer, Jonathan A

    2017-04-01

    Favorable organizational climate in primary care settings is necessary to expand the nurse practitioner (NP) workforce and promote their practice. Only one NP-specific tool, the Nurse Practitioner Primary Care Organizational Climate Questionnaire (NP-PCOCQ), measures NP organizational climate. We confirmed NP-PCOCQ's factor structure and established its predictive validity. A cross-sectional survey design was used to collect data from 314 NPs in Massachusetts in 2012. Confirmatory factor analysis and regression models were used. The 4-factor model characterized NP-PCOCQ. The NP-PCOCQ score predicted job satisfaction (beta = .36; p < …), and the tool can be used to measure organizational climate in NP clinics. Further testing of NP-PCOCQ is needed.

  3. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    Science.gov (United States)

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing both in urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool were assessed. The CDS tool was part of an mHealth system comprising a mobile application that consisted of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. Through an agile development process and a user-centred design approach, key features of the mobile application that fitted the requirements of the end users and environment were identified. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely, system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease-of-use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be at high CVD risk and referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized controlled trial.

  4. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery carries a high risk of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to general surgery of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS records for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.
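
    For readers who want to reproduce the kind of discrimination comparison reported above, the AUC computation itself is a one-liner per tool; the labels and scores below are fabricated stand-ins for chart-review outcomes, not the study's data (the Hanley-McNeil step, which tests two correlated AUCs against each other, is omitted here):

        from sklearn.metrics import roc_auc_score

        # 1 = adverse event confirmed by record review, 0 = none (toy labels)
        truth = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
        trigger_scores = [0.9, 0.2, 0.8, 0.7, 0.3, 0.1, 0.6, 0.4, 0.9, 0.2]  # TT flags
        mbds_scores = [0.6, 0.5, 0.4, 0.7, 0.5, 0.3, 0.2, 0.6, 0.5, 0.4]     # MBDS codes

        print("AUC, Trigger tool:", roc_auc_score(truth, trigger_scores))
        print("AUC, MBDS:", roc_auc_score(truth, mbds_scores))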

  5. Reliability and validity of a novel tool to comprehensively assess food and beverage marketing in recreational sport settings.

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Mâsse, Louise C; Storey, Kate; Kirk, Sara F L; Raine, Kim D

    2018-05-31

    Current methods for evaluating food marketing to children often study a single marketing channel or approach. As the World Health Organization urges the removal of unhealthy food marketing from children's settings, methods that comprehensively explore the exposure and power of food marketing within a setting, across multiple marketing channels and approaches, are needed. The purpose of this study was to test the inter-rater reliability and the validity of a novel settings-based food marketing audit tool. The Food and beverage Marketing Assessment Tool for Settings (FoodMATS) was developed and its psychometric properties evaluated in five public recreation and sport facilities (sites) and subsequently used in 51 sites across Canada for a cross-sectional analysis of food marketing. Raters recorded the count of food marketing occasions, the presence of child-targeted and sports-related marketing techniques, and the physical size of marketing occasions. Marketing occasions were classified by healthfulness. Inter-rater reliability was tested using Cohen's kappa (κ) and intra-class correlations (ICC). FoodMATS scores for each site were calculated using an algorithm that represented the theoretical impact of the marketing environment on food preferences, purchases, and consumption. Higher FoodMATS scores represented sites with higher exposure to, and more powerful (unhealthy, child-targeted, sports-related, large) food marketing. Validity of the scoring algorithm was tested through (1) Pearson's correlations between FoodMATS scores and facility sponsorship dollars, and (2) sequential multiple regression for predicting "Least Healthy" food sales from FoodMATS scores. Inter-rater reliability was very good to excellent (κ = 0.88-1.00, p < …). As the first tool to comprehensively assess food marketing in recreation facilities, the FoodMATS provides a novel means to comprehensively track changes in food marketing environments that can assist in developing and monitoring the impact of policies and interventions.
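
    The inter-rater statistic used above is standard; a minimal sketch with scikit-learn (ratings invented, one binary judgement per marketing occasion):

        from sklearn.metrics import cohen_kappa_score

        # Two raters coding the same occasions: 1 = child-targeted technique present.
        rater_a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
        rater_b = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]

        print("Cohen's kappa:", cohen_kappa_score(rater_a, rater_b))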

  6. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting.

    Science.gov (United States)

    Cereceda-Sánchez, Francisco José; Molina-Mula, Jesús

    2017-05-15

    To evaluate the usefulness of capnography for the detection of metabolic changes in spontaneous breathing patients, in the emergency and intensive care settings. In-depth and structured bibliographical search in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. 19 studies were found; two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), eight with bicarbonate (HCO3), three with lactate, and four with blood pH. Most studies have found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter to detect patients at risk of severe metabolic change, in a fast, economical and accurate way.

  7. Rough Sets as a Knowledge Discovery and Classification Tool for the Diagnosis of Students with Learning Disabilities

    Directory of Open Access Journals (Sweden)

    Yu-Chi Lin

    2011-02-01

    Due to the implicit characteristics of learning disabilities (LDs), the diagnosis of students with learning disabilities has long been a difficult issue. Artificial intelligence techniques like artificial neural networks (ANN) and support vector machines (SVM) have been applied to the LD diagnosis problem with satisfactory outcomes. However, special education teachers or professionals tend to be skeptical of these kinds of black-box predictors. In this study, we apply rough set theory (RST), which can not only perform as a classifier but may also produce meaningful explanations or rules, to the LD diagnosis application. Our experiments indicate that the RST approach is competitive as a tool for feature selection, and it performs better in terms of prediction accuracy than other rule-based algorithms such as decision tree and RIPPER algorithms. We also propose to mix samples collected from sources with different LD diagnosis procedures and criteria. By pre-processing these mixed samples with simple and readily available clustering algorithms, we are able to improve the quality and support of rules generated by the RST. Overall, our study shows that the rough set approach, as a classification and knowledge discovery tool, may have great potential in playing an essential role in LD diagnosis.
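
    For readers unfamiliar with the rough set vocabulary used above, the core construction is the pair of lower and upper approximations of a target class over attribute-induced equivalence classes; a tiny self-contained sketch (toy attribute table, not the study's data):

        from collections import defaultdict

        # Toy samples: (attribute tuple, label). Samples with identical
        # attribute tuples are indiscernible and share one equivalence class.
        samples = [(("low", "yes"), "LD"), (("low", "yes"), "noLD"),
                   (("high", "no"), "LD"), (("low", "no"), "noLD")]

        classes = defaultdict(set)
        for i, (attrs, _) in enumerate(samples):
            classes[attrs].add(i)

        target = {i for i, (_, label) in enumerate(samples) if label == "LD"}

        # Lower approximation: classes wholly inside the target (certain rules).
        lower = {i for c in classes.values() if c <= target for i in c}
        # Upper approximation: classes overlapping the target (possible members).
        upper = {i for c in classes.values() if c & target for i in c}

        print("lower:", lower, "upper:", upper, "boundary:", upper - lower)

    Certain classification rules are read off the lower approximation; the boundary region is where rough sets expose the ambiguity that black-box classifiers hide.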

  8. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    Directory of Open Access Journals (Sweden)

    Sandeep R Panta

    2016-03-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed stochastic neighbor embedding (t-SNE) algorithm which reduces the number of dimensions for each scan in the input data set to two dimensions while preserving the local structure of data sets. Finally, we interactively display the output of this approach via a web page based on the data driven documents (D3) JavaScript library. Two distinct approaches were used to visualize the data. In the first approach, we computed multiple quality control (QC) values from pre-processed data, which were used as inputs to the t-SNE algorithm. This approach helps in assessing the quality of each data set relative to others. In the second case, computed variables of interest (e.g. brain volume or voxel values from segmented gray matter images) were used as inputs to the t-SNE algorithm. This approach helps in identifying interesting patterns in the data sets. We demonstrate these approaches using multiple examples including (1) quality control measures calculated from phantom data over time, (2) quality control data from human functional MRI data across various studies, scanners and sites, and (3) volumetric and density measures from human structural MRI data across various studies, scanners and sites. Results from (1) and (2) show the potential of our approach to combine t-SNE data reduction with interactive color coding of variables of interest to quickly identify visually unique clusters of data (i.e. data sets with poor QC, clustering of data by site). Results from (3) demonstrate …
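
    The dimensionality-reduction step at the heart of the pipeline above is available off the shelf; a minimal sketch with scikit-learn (the QC matrix is simulated here, whereas the paper's inputs came from the COINS pipelines):

        import numpy as np
        from sklearn.manifold import TSNE

        # Rows = scans, columns = QC metrics; the second block mimics outlier scans.
        rng = np.random.default_rng(0)
        qc = np.vstack([rng.normal(0, 1, (100, 8)), rng.normal(4, 1, (10, 8))])

        embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(qc)
        print(embedding.shape)  # (110, 2): one 2-D point per scan, ready for a D3 page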

  9. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards® - the human gene database; MalaCards - the human diseases database; and PathCards - the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery® - the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, and vaccinomics.
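
    GeneAnalytics' matching and scoring algorithms are proprietary, but the usual starting point for gene set enrichment of this kind is a hypergeometric tail probability; a sketch with invented counts:

        from scipy.stats import hypergeom

        N = 20000  # background genes (assumed universe size)
        K = 300    # genes annotated to the pathway or tissue signature
        n = 150    # genes in the user's input list
        k = 12     # overlap between the input list and the annotation

        # P(overlap >= k) when n genes are drawn at random from the background
        p_value = hypergeom.sf(k - 1, N, K, n)
        print(f"enrichment p = {p_value:.3g}")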

  10. Evaluating the Auto-MODS Assay, a Novel Tool for Tuberculosis Diagnosis for Use in Resource-Limited Settings

    Science.gov (United States)

    Wang, Linwei; Mohammad, Sohaib H.; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2014-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. PMID:25378569
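
    The accuracy figures above are binomial proportions with confidence intervals; a sketch using the Wilson score interval, one common choice (the paper's exact interval method may differ slightly, which is why the bounds below do not reproduce its quoted limits exactly):

        from math import sqrt

        def wilson_ci(successes, n, z=1.96):
            """95% Wilson score interval for a binomial proportion."""
            p = successes / n
            denom = 1 + z**2 / n
            centre = (p + z**2 / (2 * n)) / denom
            half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
            return centre - half, centre + half

        # Counts reported in the abstract, MGIT 960 as the reference standard:
        print("sensitivity 212/221 =", 212 / 221, wilson_ci(212, 221))
        print("specificity 135/139 =", 135 / 139, wilson_ci(135, 139))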

  11. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Directory of Open Access Journals (Sweden)

    Nava Siegelmann-Danieli

    Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information including comorbidities and use of drugs (oncological/non-oncological) per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months and median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p < 0.002); notably, these patients were older. Patients who received both FP-O- and FP-I-based regimens achieved numerically longer OS vs. those who received only one of these regimens (22.1 [19.9-24.0] vs. 18.9 [15.5-21.9] months). Among patients assessed for wild-type KRAS and treated with a subsequent anti-EGFR agent, OS was 25.4 months and 18.7 months for 124 treated vs. 37 non-treated patients (non-significant). Cox analysis (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton pump inhibitors.
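
    Median OS values like those above are typically Kaplan-Meier estimates over right-censored follow-up; a minimal sketch with the lifelines library (survival times fabricated, not the HMO data):

        import numpy as np
        from lifelines import KaplanMeierFitter

        rng = np.random.default_rng(1)
        months = rng.exponential(scale=22.0, size=200)  # fabricated OS times
        observed = rng.random(200) < 0.7                # True = death observed, False = censored

        kmf = KaplanMeierFitter()
        kmf.fit(months, event_observed=observed)
        print("median OS (months):", kmf.median_survival_time_)

    When fewer than half of the patients have died by the end of follow-up, the median is undefined, which is why the Surgery group's median OS above was "not reached".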

  12. The e-Reader — an Educational or an Entertainment Tool? e-Readers in an Academic Setting

    Directory of Open Access Journals (Sweden)

    Peter Ahlroos

    2012-01-01

    In this paper the authors will discuss a pilot project conducted at the Tritonia Academic Library, Vaasa, in Finland, from September 2010 until May 2011. The project was designed to investigate the application of e-readers in academic settings and to learn how teachers and students experience the use of e-readers in academic education. Four groups of students and one group of teachers used Kindle readers for varied periods of time in different courses. The course material and the textbooks were downloaded on the e-readers. The feedback from the participants was collected through questionnaires and teacher interviews. The results suggest that the e-reader is a future tool for learning, though some features need to be improved before e-readers can really enable efficient learning and researching.

  13. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    International Nuclear Information System (INIS)

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-01

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  14. Integrated Variable-Fidelity Tool Set For Modeling and Simulation of Aeroservothermoelasticity -Propulsion (ASTE-P) Effects For Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  15. Structural Analysis of PTM Hotspots (SAPH-ire) – A Quantitative Informatics Method Enabling the Discovery of Novel Regulatory Elements in Protein Families*

    Science.gov (United States)

    Dewhurst, Henry M.; Choudhury, Shilpa; Torres, Matthew P.

    2015-01-01

    Predicting the biological function potential of post-translational modifications (PTMs) is becoming increasingly important in light of the exponential increase in available PTM data from high-throughput proteomics. We developed structural analysis of PTM hotspots (SAPH-ire)—a quantitative PTM ranking method that integrates experimental PTM observations, sequence conservation, protein structure, and interaction data to allow rank order comparisons within or between protein families. Here, we applied SAPH-ire to the study of PTMs in diverse G protein families, a conserved and ubiquitous class of proteins essential for maintenance of intracellular structure (tubulins) and signal transduction (large and small Ras-like G proteins). A total of 1728 experimentally verified PTMs from eight unique G protein families were clustered into 451 unique hotspots, 51 of which have a known and cited biological function or response. Using customized software, the hotspots were analyzed in the context of 598 unique protein structures. By comparing distributions of hotspots with known versus unknown function, we show that SAPH-ire analysis is predictive for PTM biological function. Notably, SAPH-ire revealed high-ranking hotspots for which a functional impact has not yet been determined, including phosphorylation hotspots in the N-terminal tails of G protein gamma subunits—conserved protein structures never before reported as regulators of G protein coupled receptor signaling. To validate this prediction we used the yeast model system for G protein coupled receptor signaling, revealing that gamma subunit–N-terminal tail phosphorylation is activated in response to G protein coupled receptor stimulation and regulates protein stability in vivo. These results demonstrate the utility of integrating protein structural and sequence features into PTM prioritization schemes that can improve the analysis and functional power of modification-specific proteomics data. PMID:26070665

  16. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    Science.gov (United States)

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3 and 4 year olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  17. Investigation of OH Radical Regeneration from Isoprene Oxidation Across Different NOx Regimes in the Atmosphere Simulation Chamber SAPHIR

    Science.gov (United States)

    Novelli, A.; Bohn, B.; Dorn, H. P.; Häseler, R.; Hofzumahaus, A.; Kaminski, M.; Yu, Z.; Li, X.; Tillmann, R.; Wegener, R.; Fuchs, H.; Kiendler-Scharr, A.; Wahner, A.

    2017-12-01

    The hydroxyl radical (OH) is the dominant daytime oxidant in the troposphere. It starts the degradation of volatile organic compounds (VOC) originating from both anthropogenic and biogenic emissions. Hence, it is a crucial trace species in model simulations, as it has a large impact on many reactive trace gases. Many field campaigns performed in isoprene-dominated environments under low-NOx conditions have shown large discrepancies between measured and modelled OH radical concentrations. These results have contributed to the discovery of new regeneration paths for OH radicals from isoprene-OH second-generation products, with maximum efficiency at low NO. Current chemical models (e.g. MCM 3.3.1) include this novel chemistry, allowing an investigation of the validity of the OH regeneration under different chemical conditions. Over 11 experiments focusing on the OH oxidation of isoprene were performed at the SAPHIR chamber in the Forschungszentrum Jülich. Measurements of VOCs, NOx, O3 and HONO were performed together with measurements of OH radicals (by both LIF-FAGE and DOAS) and OH reactivity. Within the simulation chamber, the NO mixing ratio was varied between 0.05 and 2 ppbv, allowing the investigation of both the "new" regeneration path for OH radicals and the well-known NO+HO2 mechanism. A comparison with MCM 3.3.1, which includes the updated LIM1 mechanism, showed very good agreement (within 10%) for the OH data at all concentrations of NOx investigated. Comparisons with different models, without LIM1 and with updated rates for the OH regeneration, will be presented together with a detailed analysis of the impact of this study on results from previous field campaigns.

  18. Chemical analysis of particulate and gaseous products from the monoterpene oxidation in the SAPHIR chamber during the EUCAARI campaign 2008

    Science.gov (United States)

    Kahnt, A.; Iinuma, Y.; Herrmann, H.; Mentel, T. F.; Fisseha, R.; Kiendler-Scharr, A.

    2009-04-01

    The atmospheric oxidation of monoterpenes leads to multifunctional products with lower vapour pressures. These products condense onto and coagulate with existing particles, leading to particle formation and growth. In order to obtain better insights into the mechanisms and the importance of these sources to organic aerosol, a mixture of monoterpenes was oxidised in the SAPHIR outdoor chamber during the EUCAARI campaign in 2008. The mixture was made of α-pinene, β-pinene, limonene, 3-carene and ocimene, representing a typical monoterpene emission from a boreal forest. In addition, two sesquiterpenes (α-farnesene and caryophyllene) were reacted together with the monoterpene mixture in some experiments. The VOC (volatile organic compound) mixture was reacted under tropospheric oxidation and light conditions on a prolonged time scale over two days. In the present study, special emphasis is put on the detection of carbonyl compounds from the off-line analysis of filter and denuder samples collected during the campaign in 2008. The oxidation products which contain carbonyl groups are important first stable intermediates during monoterpene and sesquiterpene oxidation. They react further with atmospheric oxidants to form less volatile acidic compounds, contributing to secondary organic aerosol (SOA). Commonly used methods for the analysis of carbonyl compounds involve derivatisation steps prior to separation and subsequent UV or MS detection. In the present study, 2,4-dinitrophenylhydrazine (DNPH) was used to derivatise the extracted filter and denuder samples. DNPH converts aldehyde and keto groups to stable hydrazones, which can be purified afterwards using a solid phase extraction (SPE) cartridge. The derivatised samples were analysed with HPLC/ESI-TOFMS, which allowed us to determine the exact chemical formula of unknown products. In addition to known carbonyl compounds from monoterpene oxidation, such as pinonaldehyde and nopinone, previously unreported molecular masses were also identified.

  19. A browser-based 3D Visualization Tool designed for comparing CERES/CALIOP/CloudSAT level-2 data sets.

    Science.gov (United States)

    Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Doelling, D. R.

    2017-12-01

    At NASA Langley, Clouds and the Earth's Radiant Energy System (CERES) and Moderate Resolution Imaging Spectroradiometer (MODIS) data are merged with Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and with the CloudSat Cloud Profiling Radar (CPR). The CERES merged product (C3M) matches up to three CALIPSO footprints with each MODIS pixel along its ground track. It then assigns the nearest CloudSat footprint to each of those MODIS pixels. The cloud properties from MODIS, retrieved using the CERES algorithms, are included in C3M with the matched CALIPSO and CloudSat products along with radiances from 18 MODIS channels. The dataset is used to validate the CERES-retrieved MODIS cloud properties and the difference in computed TOA and surface fluxes when using MODIS or CALIOP/CloudSat retrieved clouds. This information is then used to tune the computed fluxes to match the CERES observed TOA flux. A visualization tool will be invaluable to determine the cause of these large cloud and flux differences in order to improve the methodology. This effort is part of a larger effort to allow users to order the CERES C3M product subsetted by time and parameter, as well as to use the previously mentioned visualization capabilities. This presentation will show a new graphical 3D interface, 3D-CERESVis, that allows users to view both passive remote sensing satellites (MODIS and CERES) and active satellites (CALIPSO and CloudSat), such that the detailed vertical structures of cloud properties from CALIPSO and CloudSat are displayed side by side with horizontally retrieved cloud properties from MODIS and CERES. Similarly, the CERES computed profile fluxes, whether using MODIS or CALIPSO and CloudSat clouds, can also be compared. 3D-CERESVis is a browser-based visualization tool that makes use of techniques such as multiple synchronized cursors, COLLADA format data and Cesium.

  20. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  1. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft

    Energy Technology Data Exchange (ETDEWEB)

    HAYENGA, J.L.

    2006-12-19

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements.

  2. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft

    International Nuclear Information System (INIS)

    HAYENGA, J.L.

    2006-01-01

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements

  3. SAPHIRE technical reference manual: IRRAS/SARA Version 4.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Atwood, C.L.; Sattison, M.B.; Rasmuson, D.M.

    1993-01-01

    This report provides information on the principles used in the construction and operation of Version 4.0 of the Integrated Reliability and Risk Analysis System (IRRAS) and the System Analysis and Risk Assessment (SARA) system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. The report then describes the algorithms that these programs use to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that are appropriate under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that these programs can calculate. The report gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by these programs to generate random basic event probabilities from various distributions. Further references are given, and a detailed example of the reduction and quantification of a simple fault tree is provided in an appendix
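
    Among the quantification formulas the manual documents is the standard minimal cut set upper bound, P(top) ≈ 1 - Π(1 - P(C_i)), where each cut set probability is the product of its basic event probabilities under an independence assumption. A short sketch (fault tree and probabilities invented for illustration):

        from math import prod

        # Basic event probabilities (illustrative values only)
        p = {"PUMP_A": 3e-3, "PUMP_B": 3e-3, "VALVE": 1e-4, "DG": 5e-2}

        # Minimal cut sets of a hypothetical top event
        cut_sets = [{"PUMP_A", "PUMP_B"}, {"VALVE"}, {"PUMP_A", "DG"}]

        # A cut set occurs only if all of its basic events occur
        cut_probs = [prod(p[e] for e in cs) for cs in cut_sets]

        # Minimal cut set upper bound on the top event probability
        top = 1.0 - prod(1.0 - q for q in cut_probs)
        print(f"P(top) <= {top:.3e}")

        # Rare-event approximation for comparison
        print(f"sum of cut set probabilities: {sum(cut_probs):.3e}")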

  4. Image Navigation and Registration Performance Assessment Tool Set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    Science.gov (United States)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
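
    Because the 3-sigma metrics are defined directly as the 99.73rd percentile of the accumulated errors rather than as three standard deviations, the core computation reduces to a percentile over a day's worth of sub-image registration errors; a sketch with simulated values (IPATS' actual pipeline, of course, first produces the errors themselves):

        import numpy as np

        # Simulated 24 h of navigation error magnitudes (microradians)
        rng = np.random.default_rng(42)
        nav_errors = np.abs(rng.normal(0.0, 7.0, size=100_000))

        three_sigma = np.percentile(nav_errors, 99.73)
        print(f"NAV 3-sigma metric: {three_sigma:.1f} urad")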

  5. Citizen's Charter in a primary health-care setting of Nepal: An accountability tool or a "mere wall poster"?

    Science.gov (United States)

    Gurung, Gagan; Gauld, Robin; Hill, Philip C; Derrett, Sarah

    2018-02-01

    Despite some empirical findings on the usefulness of citizen's charters on awareness of rights and services, there is a dearth of literature about charter implementation and impact on health service delivery in low-income settings. To gauge the level of awareness of the Charter within Nepal's primary health-care (PHC) system, perceived impact and factors affecting Charter implementation. Using a case study design, a quantitative survey was administered to 400 participants from 22 of 39 PHC facilities in the Dang District to gauge awareness of the Charter. Additionally, qualitative interviews with 39 key informants were conducted to explore the perceived impact of the Charter and factors affecting its implementation. Few service users (15%) were aware of the existence of the Charter. Among these, a greater proportion were literate, and there were also differences according to ethnicity and occupational group. The Charter was usually not properly displayed and had been implemented with no prior public consultation. It contained information that provided awareness of health facility services, particularly the more educated public, but had limited potential for increasing transparency and holding service providers accountable to citizens. Proper display, consultation with stakeholders, orientation or training and educational factors, follow-up and monitoring, and provision of sanctions were all lacking, negatively influencing the implementation of the Charter. Poor implementation and low public awareness of the Charter limit its usefulness. Provision of sanctions and consultation with citizens in Charter development are needed to expand the scope of Charters from information brochures to tools for accountability. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  6. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    Directory of Open Access Journals (Sweden)

    Judith Kwasa

    To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using the Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (κ = .03-.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.

  7. Use of a tool-set by Pan troglodytes troglodytes to obtain termites (Macrotermes) in the periphery of the Dja Biosphere Reserve, southeast Cameroon.

    Science.gov (United States)

    Deblauwe, Isra; Guislain, Patrick; Dupain, Jef; Van Elsacker, Linda

    2006-12-01

    At the northern periphery of the Dja Biosphere Reserve (southeastern Cameroon) we recorded a new use of a tool-set by Pan troglodytes troglodytes to prey on Macrotermes muelleri, M. renouxi, M. lilljeborgi, and M. nobilis. We recovered 79 puncturing sticks and 47 fishing probes at 17 termite nests between 2002 and 2005. The mean length of the puncturing sticks (n = 77) and fishing probes (n = 45) was 52 cm and 56 cm, respectively, and the mean diameter was 9 mm and 4.5 mm, respectively. Sixty-eight percent of 138 chimpanzee fecal samples contained major soldiers of four Macrotermes species. The chimpanzees in southeastern Cameroon appeared to be selective in their choice of plant material to make their tools. The tools found at our study site resemble those from other sites in this region. However, in southeastern Cameroon only one tool-set type was found, whereas two tool-set types have been reported in Congo. Our study suggests that, along with the different vegetation types and the availability of plant material around termite nests, the nest and gallery structure and foraging behavior of the different Macrotermes spp. at all Central African sites must be investigated before we can attribute differences in tool-use behavior to culture. (c) 2006 Wiley-Liss, Inc.

  8. Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program.

    Science.gov (United States)

    Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita

    2016-03-23

    Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaption and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010) was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

  9. Measurement of the reaction γp → K⁰Σ⁺ for photon energies up to 2.65 GeV with the SAPHIR detector at ELSA; Messung der Reaktion γp → K⁰Σ⁺ für Photonenergien bis 2.65 GeV mit dem SAPHIR-Detektor an ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Lawall, R.

    2004-01-01

    The reaction γp → K⁰Σ⁺ was measured with the SAPHIR detector at ELSA during the run periods 1997 and 1998. Results were obtained for cross sections in the photon energy range from threshold up to 2.65 GeV for all production angles, and for the Σ⁺ polarization. Emphasis has been put on the determination and reduction of the contributions of background reactions, and on the comparison with other measurements and theoretical predictions. (orig.)

  10. The evolution of OPUS: A set of web-based GPS processing tools offered by the National Geodetic Survey

    Science.gov (United States)

    Weston, Dr.; Mader, Dr.; Schenewerk, Dr.

    2012-04-01

    The Online Positioning User Service (OPUS) is a suite of web-based GPS processing tools that were initially developed by the National Geodetic Survey approximately eleven years ago. The first version, known as OPUS static (OPUS-S), processes L1 and L2 carrier-phase data in native receiver and RINEX formats. Datasets submitted to OPUS-S must be between two and 48 hours in duration and pass several quality control steps before being passed onto the positioning algorithm. OPUS-S was designed to select five nearby CORS to form baselines that are processed independently. The best three solutions are averaged to produce a final set of coordinates. The current version of OPUS-S has been optimized to accept and process GPS data from any location in the continental United States, Alaska, Hawaii and the Caribbean. OPUS Networks (OPUS-Net), one of the most recently developed versions and currently in beta testing, has many of the same processing characteristics and dataset requirements as OPUS-S but with one significant difference. OPUS-Net selects up to 10 IGS reference sites and three regional CORS to perform a simultaneous least squares adjustment with the user-submitted data. The CORS stations are primarily used to better estimate the troposphere while the position of the unknown station and the three CORS reference stations are determined from the more precisely known and monitored IGS reference stations. Additional enhancements to OPUS-Net are the implementation of absolute antenna patterns and ocean tides (FES2004), using reference station coordinates in IGS08 reference frame, as well as using improved phase ambiguity integer fixing and troposphere modeling (GPT and GMF a priori models). OPUS Projects, the final version of OPUS to be reviewed in this paper, is a complete web-based, GPS data processing and analysis environment. The main idea behind OPUS Projects is that one or more managers can define numerous, independent GPS projects. Each newly defined project is

  11. Photovoice as a Pedagogical Tool: Exploring Personal and Professional Values with Female Muslim Social Work Students in an Intercultural Classroom Setting

    Science.gov (United States)

    Bromfield, Nicole F.; Capous-Desyllas, Moshoula

    2017-01-01

    This article explores a classroom project in which we used photovoice as a pedagogical tool to enhance personal and professional self-awareness among female, Muslim, social work students in an intercultural classroom setting located in the Arabian Gulf. We begin with an overview and discussion of arts-based approaches to education and then provide…

  12. A Set of Web-based Tools for Integrating Scientific Research and Decision-Making through Systems Thinking

    Science.gov (United States)

    Currently, many policy and management decisions are made without considering the goods and services humans derive from ecosystems and the costs associated with protecting them. This approach is unlikely to be sustainable. Conceptual frameworks provide a tool for capturing, visual...

  13. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 2 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  14. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 1 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  15. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    Science.gov (United States)

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology evaluation to binary rating items. Reliability was assessed comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.
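
    Cronbach's alpha, reported above for internal consistency, has no single-call implementation in the core scientific Python stack; the standard items-by-raters formula is short enough to write directly (scores fabricated):

        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = trainees, columns = checklist items."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        # Binary checklist items driven by a latent skill, to mimic real ratings
        rng = np.random.default_rng(3)
        skill = rng.normal(size=(30, 1))
        items = (skill + rng.normal(scale=0.8, size=(30, 10)) > 0).astype(int)
        print("alpha:", round(cronbach_alpha(items), 2))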

  16. Piloting a programme tool to evaluate malaria case investigation and reactive case detection activities: results from 3 settings in the Asia Pacific.

    Science.gov (United States)

    Cotter, Chris; Sudathip, Prayuth; Herdiana, Herdiana; Cao, Yuanyuan; Liu, Yaobao; Luo, Alex; Ranasinghe, Neil; Bennett, Adam; Cao, Jun; Gosling, Roly D

    2017-08-22

    Case investigation and reactive case detection (RACD) activities are widely used in low transmission settings to determine the suspected origin of infection and to identify and treat malaria infections near the index patient household. Case investigation and RACD activities are time- and resource-intensive, involve methodologies that vary across eliminating settings, and have no standardized metrics or tools available to monitor and evaluate them. In response to this gap, a simple programme tool was developed for the monitoring and evaluation (M&E) of RACD activities and piloted by national malaria programmes. During the development phase, four modules of the RACD M&E tool were created to assess and evaluate key case investigation and RACD activities and costs. A pilot phase was then carried out by programme implementers between 2013 and 2015, during which malaria surveillance teams in three different settings (China, Indonesia, Thailand) piloted the tool over a period of 3 months each. This study describes summary results of the pilots and the feasibility and impact of the tool on programmes. All three study areas implemented the RACD M&E tool modules, and pilot users reported that the tool and evaluation process were helpful for identifying gaps in RACD programme activities. In the 45 health facilities evaluated, 71.8% (97/135; min 35.3-max 100.0%) of the proper notification and reporting forms and 20.0% (27/135; min 0.0-max 100.0%) of standard operating procedures (SOPs) were available to support malaria elimination activities. The tool highlighted gaps in reporting key data indicators on completeness for malaria case reporting (98.8%; min 93.3-max 100.0%), case investigations (65.6%; min 61.8-max 78.4%) and RACD activities (70.0%; min 64.7-max 100.0%). Evaluation of the SOPs showed that knowledge and practices of malaria personnel varied within and between study areas. Average monthly costs for conducting case investigation and RACD activities showed variation between study

  17. Use of a computerized medication shared decision making tool in community mental health settings: impact on psychotropic medication adherence.

    Science.gov (United States)

    Stein, Bradley D; Kogan, Jane N; Mihalyo, Mark J; Schuster, James; Deegan, Patricia E; Sorbero, Mark J; Drake, Robert E

    2013-04-01

    Healthcare reform emphasizes patient-centered care and shared decision-making. This study examined the impact on psychotropic adherence of a decision support center and computerized tool designed to empower and activate consumers prior to an outpatient medication management visit. Administrative data were used to identify 1,122 Medicaid-enrolled adults receiving psychotropic medication from community mental health centers over a two-year period. Multivariate linear regression models were used to examine whether tool users had higher rates of 180-day medication adherence than non-users. Older clients, Caucasian clients, those without recent hospitalizations, and those who were Medicaid-eligible due to disability had higher rates of 180-day medication adherence. After controlling for sociodemographics, clinical characteristics, baseline adherence, and secular changes over time, use of the computerized tool did not affect adherence to psychotropic medications. The computerized decision tool did not affect medication adherence among clients in outpatient mental health clinics. Additional research should clarify the impact of decision-making tools on other important outcomes such as engagement, patient-prescriber communication, quality of care, self-management, and long-term clinical and functional outcomes.

  18. First records of tool-set use for ant-dipping by Eastern chimpanzees (Pan troglodytes schweinfurthii) in the Kalinzu Forest Reserve, Uganda.

    Science.gov (United States)

    Hashimoto, Chie; Isaji, Mina; Koops, Kathelijne; Furuichi, Takeshi

    2015-10-01

    Chimpanzees at numerous study sites are known to prey on army ants by using a single wand to dip into the ant nest or column. However, at Goualougo (Republic of Congo) in Central Africa, chimpanzees use a different technique: a woody sapling to perforate the ant nest, followed by a herb stem as a dipping tool to harvest the army ants. Use of such a tool set has also been found in Guinea, West Africa: at Seringbara in the Nimba Mountains and at nearby Bossou. There are, however, no reports for chimpanzees in East Africa. We observed use of such a tool set in Kalinzu, Uganda, for the first time by Eastern chimpanzees. This behavior was observed in one group of chimpanzees at Kalinzu (S-group) but not in the adjacent group (M-group), whose ranging area partly overlaps, despite the fact that the latter group has been under intensive observation since 1997. In Uganda, ant-dipping has not been observed at the three northern sites (Budongo, Semliki, and Kibale) but has been observed, or seems to occur, at the southern sites (Kalinzu and Bwindi), which suggests that ant-dipping was invented in and spread from the southern region after the northern and southern forest blocks became separated. Use of a tool set by only one group at Kalinzu further suggests that this behavior was invented recently and has not yet spread to the other group via migrating females.

  19. Evaluating a computational support tool for set-based configuration of production systems: Results from an industrial case

    NARCIS (Netherlands)

    Unglert, Johannes; Hoekstra, Sipke; Jauregui Becker, Juan Manuel

    2017-01-01

    This paper describes research conducted in the context of an industrial case dealing with the design of reconfigurable cellular manufacturing systems. Reconfiguring such systems is a complex task due to the interdependencies between the constituent subsystems. A novel computational tool was

  20. Analysis of metolachlor ethane sulfonic acid chirality in groundwater: A tool for dating groundwater movement in agricultural settings

    Science.gov (United States)

    Chemical chirality of pesticides can be a useful tool for studying environmental processes. The chiral forms of metolachlor ethane sulfonic acid (MESA), an abundant metabolite of metolachlor, and metolachlor were examined over a 6 year period in groundwater and a groundwater-fed stream in a riparia...

  1. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Data collection, and the evaluation of the collected data, are crucial for effective quality management and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality based on the local policy of the given health care facility and in relation to the nurse's actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e. categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes some educational tools to promote unified reporting of data by nurses. A description of this software and some educational and learning components of the tool is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion is focused on the usability of the online reporting tool and possible further development of the tool.

  2. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical...... replicates due to sample-scarcity or due to duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant...... as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved...
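
    The improved test itself is not spelled out in the truncated abstract; as a baseline sketch of the underlying problem, the snippet below runs a Welch t-test on one feature whose few replicates contain missing values, simply dropping the NaNs. This is a plain baseline for illustration, not the improved method the study arrives at, and the intensities are invented.

        import numpy as np
        from scipy import stats

        # Hypothetical log2 intensities: 3 replicates per condition, NaN = missing.
        cond_a = np.array([21.3, 21.1, np.nan])
        cond_b = np.array([23.0, np.nan, 22.7])

        a = cond_a[~np.isnan(cond_a)]                  # drop missing replicates
        b = cond_b[~np.isnan(cond_b)]
        t, p = stats.ttest_ind(a, b, equal_var=False)  # Welch's t-test on what remains
        print(f"t = {t:.2f}, p = {p:.3f} (n_A = {a.size}, n_B = {b.size})")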

  3. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    OpenAIRE

    Panta, Sandeep R.; Wang, Runtang; Fries, Jill; Kalyanam, Ravi; Speer, Nicole; Banich, Marie; Kiehl, Kent; King, Margaret; Milham, Michael; Wager, Tor D.; Turner, Jessica A.; Plis, Sergey M.; Calhoun, Vince D.

    2016-01-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed ...
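
    A minimal sketch of the embedding step described above (the truncated "t-distributed ..." is presumably t-SNE): derived per-scan features are projected to two dimensions for an interactive scatter plot. The feature matrix below is random stand-in data, not COINS output.

        import numpy as np
        from sklearn.manifold import TSNE

        rng = np.random.default_rng(0)
        features = rng.normal(size=(500, 64))  # 500 scans x 64 derived summary features

        xy = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)
        print(xy.shape)                        # (500, 2): coordinates for the scatter plot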

  4. Predictive validity of the identification of seniors at risk screening tool in a German emergency department setting.

    Science.gov (United States)

    Singler, Katrin; Heppner, Hans Jürgen; Skutetzky, Andreas; Sieber, Cornel; Christ, Michael; Thiem, Ulrich

    2014-01-01

    The identification of patients at high risk for adverse outcomes [death, unplanned readmission to the emergency department (ED)/hospital, functional decline] plays an important role in emergency medicine. The Identification of Seniors at Risk (ISAR) instrument is one of the most commonly used and best-validated screening tools. To the authors' knowledge, there are so far no data on any screening tool for the identification of older patients at risk of a negative outcome in Germany. The aim was to evaluate the validity of the ISAR screening tool in a German ED. This was a prospective single-center observational cohort study in the ED of an urban university-affiliated hospital. Participants were 520 patients aged ≥75 years consecutively admitted to the ED. The German version of the ISAR screening tool was administered directly after triage of the patients. Follow-up telephone interviews to assess outcome variables were conducted 28 and 180 days after the index visit to the ED. The primary end point was death from any cause, hospitalization, recurrent ED visit, or change of residency into a long-term care facility by day 28 after the index ED visit. The mean age ± SD was 82.8 ± 5.0 years. According to ISAR, 425 patients (81.7%) scored ≥2 points, and 315 patients (60.5%) scored ≥3 points. The combined primary end point was observed in 250 of 520 patients (48.1%) by day 28 and in 260 patients (50.0%) by day 180. Using the continuous ISAR score, the area under the curve was 0.621 (95% confidence interval, CI 0.573-0.669) on day 28 and 0.661 (95% CI 0.615-0.708) on day 180. The German version of the ISAR screening tool acceptably identified elderly patients in the ED at increased risk of a negative outcome. Using the cutoff ≥3 points instead of ≥2 points yielded better overall results.
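
    The reported AUC values can in principle be reproduced from raw scores and outcomes; a minimal sketch using the Mann-Whitney formulation (the probability that a randomly chosen case outranks a randomly chosen non-case), with invented data rather than the study's:

        def auc(scores, outcomes):
            """AUC via pairwise comparison; outcomes are 0/1, ties count half."""
            pos = [s for s, y in zip(scores, outcomes) if y == 1]
            neg = [s for s, y in zip(scores, outcomes) if y == 0]
            wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
            return wins / (len(pos) * len(neg))

        isar    = [0, 1, 2, 2, 3, 3, 4, 5, 2, 4]   # hypothetical ISAR scores
        adverse = [0, 0, 0, 1, 0, 1, 1, 1, 0, 1]   # combined endpoint by day 28
        print(f"AUC = {auc(isar, adverse):.3f}")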

  5. SUPPORT Tools for evidence-informed health Policymaking (STP) 3: Setting priorities for supporting evidence-informed policymaking.

    Science.gov (United States)

    Lavis, John N; Oxman, Andrew D; Lewin, Simon; Fretheim, Atle

    2009-12-16

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. Policymakers have limited resources for developing--or supporting the development of--evidence-informed policies and programmes. These required resources include staff time, staff infrastructural needs (such as access to a librarian or journal article purchasing), and ongoing professional development. They may therefore prefer instead to contract out such work to independent units with more suitably skilled staff and appropriate infrastructure. However, policymakers may only have limited financial resources to do so. Regardless of whether the support for evidence-informed policymaking is provided in-house or contracted out, or whether it is centralised or decentralised, resources always need to be used wisely in order to maximise their impact. Examples of undesirable practices in a priority-setting approach include timelines to support evidence-informed policymaking being negotiated on a case-by-case basis (instead of having clear norms about the level of support that can be provided for each timeline), implicit (rather than explicit) criteria for setting priorities, ad hoc (rather than systematic and explicit) priority-setting process, and the absence of both a communications plan and a monitoring and evaluation plan. In this article, we suggest questions that can guide those setting priorities for finding and using research evidence to support evidence-informed policymaking. These are: 1. Does the approach to prioritisation make clear the timelines that have been set for addressing high-priority issues in different ways? 2. Does the approach incorporate explicit criteria for determining priorities? 3. Does the approach incorporate an explicit process for determining priorities? 4. Does the approach incorporate a communications strategy and a monitoring and evaluation plan?

  6. A Standardized Needs Assessment Tool to Inform the Curriculum Development Process for Pediatric Resuscitation Simulation-Based Education in Resource-Limited Settings

    Directory of Open Access Journals (Sweden)

    Nicole Shilkofski

    2018-02-01

    Introduction: Under-five mortality rates (UFMR) remain high for children in low- and middle-income countries (LMICs) in the developing world. Education for practitioners in these environments is a key factor to improve outcomes that will address United Nations Sustainable Development Goals 3 and 10 (good health and well-being, and reduced inequalities). In order to appropriately contextualize a curriculum using simulation, it is necessary to first conduct a needs assessment of the target learner population. The World Health Organization (WHO) has published a tool to assess capacity for emergency and surgical care in LMICs that is adaptable to this goal. Materials and methods: The WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care was modified to assess pediatric resuscitation capacity in clinical settings in two LMICs: Uganda and Myanmar. Modifications included assessment of self-identified learning needs, current practices, and perceived epidemiology of disease burden in each clinical setting, in addition to assessment of pediatric resuscitation capacity in regard to infrastructure, procedures, equipment, and supplies. The modified tool was administered to 94 respondents from the two settings who were target learners of a proposed simulation-based curriculum in pediatric and neonatal resuscitation. Results: Infectious diseases (respiratory illnesses and diarrheal disease) were cited as the most common causes of pediatric deaths in both countries. Self-identified learning needs included knowledge and skill development in pediatric airway/breathing topics, as well as general resuscitation topics such as CPR and fluid resuscitation in shock. Equipment and supply availability varied substantially between settings, and critical shortages were identified in each setting. Current practices and procedures were often limited by equipment availability or infrastructural considerations. Discussion and conclusion: Epidemiology of disease

  7. Scalable streaming tools for analyzing N-body simulations: Finding halos and investigating excursion sets in one pass

    Science.gov (United States)

    Ivkin, N.; Liu, Z.; Yang, L. F.; Kumar, S. S.; Lemson, G.; Neyrinck, M.; Szalay, A. S.; Braverman, V.; Budavari, T.

    2018-04-01

    Cosmological N-body simulations play a vital role in studying models for the evolution of the Universe. To compare to observations and make scientific inferences, statistical analysis on large simulation datasets, e.g., finding halos or obtaining multi-point correlation functions, is crucial. However, traditional in-memory methods for these tasks do not scale to the datasets that are forbiddingly large in modern simulations. Our prior paper (Liu et al., 2015) proposes memory-efficient streaming algorithms that can find the largest halos in a simulation with up to 10⁹ particles on a small server or desktop. However, this approach fails when directly scaling to larger datasets. This paper presents a robust streaming tool that leverages state-of-the-art techniques in GPU boosting, sampling, and parallel I/O to significantly improve performance and scalability. Our rigorous analysis of the sketch parameters improves the previous results from finding the centers of the 10³ largest halos (Liu et al., 2015) to ∼10⁴-10⁵, and reveals the trade-offs between memory, running time, and number of halos. Our experiments show that our tool can scale to datasets with up to ∼10¹² particles while using less than an hour of running time on a single Nvidia GTX 1080 GPU.
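
    The sketch-based streaming idea can be illustrated in a few lines: a Count-Min sketch approximates per-cell particle counts in a single pass with fixed memory, and a small candidate set tracks the densest cells (halo-candidate proxies). The sizes, the stand-in stream, and the eviction policy are toy choices for illustration, far simpler than the paper's GPU pipeline.

        import random

        WIDTH, DEPTH, K = 2048, 4, 5
        table = [[0] * WIDTH for _ in range(DEPTH)]
        seeds = [17, 31, 97, 131]

        def update(cell):
            for d in range(DEPTH):
                table[d][hash((seeds[d], cell)) % WIDTH] += 1

        def estimate(cell):                      # an over-estimate, never under
            return min(table[d][hash((seeds[d], cell)) % WIDTH] for d in range(DEPTH))

        random.seed(0)
        candidates = {}
        for _ in range(200_000):
            # a few dense cells plus a uniform background, as a stand-in stream
            cell = random.choice((7, 42, 99)) if random.random() < 0.03 else random.randint(0, 100_000)
            update(cell)
            candidates[cell] = estimate(cell)
            if len(candidates) > K:              # toy eviction: drop smallest estimate
                candidates.pop(min(candidates, key=candidates.get))

        print(sorted(candidates.items(), key=lambda kv: -kv[1]))  # densest cells first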

  8. Competing Values Framework: A useful tool to define the predominant culture in a maternity setting in Australia.

    Science.gov (United States)

    Adams, Catherine; Dawson, Angela; Foureur, Maralyn

    2017-04-01

    To identify the predominant culture of an organisation which could then assess readiness for change. An exploratory design using the Competing Values Framework (CVF) as a self-administered survey tool. The Maternity Unit in one Australian metropolitan tertiary referral hospital. All 120 clinicians (100 midwives and 20 obstetricians) employed in the maternity service were invited to participate; 26% responded. The identification of the predominant culture of an organisation to assess readiness for change prior to the implementation of a new policy. The predominant culture of this maternity unit, as described by those who responded to the survey, was one of hierarchy with a focus on rules and regulations and less focus on innovation, flexibility and teamwork. These results suggest that this unit did not have readiness to change. There is value in undertaking preparatory work to gain a better understanding of the characteristics of an organisation prior to designing and implementing change. This understanding can influence additional preliminary work that may be required to increase the readiness for change and therefore increase the opportunity for successful change. The CVF is a useful tool to identify the predominant culture and characteristics of an organisation that could influence the success of change. Copyright © 2016 Australian College of Midwives. All rights reserved.

  9. Rough Sets as a Knowledge Discovery and Classification Tool for the Diagnosis of Students with Learning Disabilities

    OpenAIRE

    Yu-Chi Lin; Tung-Kuang Wu; Shian-Chang Huang; Ying-Ru Meng; Wen-Yau Liang

    2011-01-01

    Due to the implicit characteristics of learning disabilities (LDs), the diagnosis of students with learning disabilities has long been a difficult issue. Artificial intelligence techniques such as artificial neural networks (ANN) and support vector machines (SVM) have been applied to the LD diagnosis problem with satisfactory outcomes. However, special education teachers or professionals tend to be skeptical of these kinds of black-box predictors. In this study, we adopt the rough set theory (RST)...
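
    Since the abstract is cut off before the method details, here is a generic rough-set sketch of the two core constructs RST builds on: the lower and upper approximations of a decision class under the indiscernibility relation induced by condition attributes. The student records and attribute values are toy data, not the study's.

        # Each case maps to a tuple of condition-attribute values (toy data).
        cases = {
            "s1": ("low", "high"), "s2": ("low", "high"),   # indistinguishable pair
            "s3": ("mid", "low"),  "s4": ("mid", "low"),
            "s5": ("high", "low"),
        }
        has_ld = {"s1", "s3", "s4"}                         # decision attribute

        # Equivalence classes: cases indistinguishable on the condition attributes.
        classes = {}
        for case, attrs in cases.items():
            classes.setdefault(attrs, set()).add(case)

        lower = set().union(*(c for c in classes.values() if c <= has_ld))
        upper = set().union(*(c for c in classes.values() if c & has_ld))
        print("lower approximation:", sorted(lower))  # certainly LD: ['s3', 's4']
        print("upper approximation:", sorted(upper))  # possibly LD: adds 's1', 's2'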

  10. The PTSD Practitioner Registry: An Innovative Tracking, Dissemination, and Support Tool for Providers in Military and Nonmilitary Settings

    Science.gov (United States)

    2016-10-01

    PRINCIPAL INVESTIGATOR: Raymond C. Rosen, PhD. CONTRACTING ORGANIZATION: New England Research Institutes, Inc.

  11. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Oczkowski, Simon J; Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J

    2016-01-01

    Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25-4.26, p = 0.007, low quality evidence), the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43-2.59, p < 0.001, low quality evidence), and concordance between care desired and care received by patients (RR 1.17, 95% CI 1.05-1.30, p = 0.004, low quality evidence, 2 RCTs). The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between
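
    The pooled effect measures above are risk ratios; as a worked sketch, here is a risk ratio and its 95% CI via the standard log-RR normal approximation, with invented 2x2 counts rather than the review's data:

        import math

        a, n1 = 60, 100   # events / total, intervention (communication tool)
        c, n2 = 30, 100   # events / total, usual care

        rr = (a / n1) / (c / n2)
        se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)   # SE of log(RR)
        lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
        hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
        print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")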

  12. Analysis of metolachlor ethane sulfonic acid (MESA) chirality in groundwater: A tool for dating groundwater movement in agricultural settings.

    Science.gov (United States)

    Rice, Clifford P; McCarty, Gregory W; Bialek-Kalinski, Krystyna; Zabetakis, Kara; Torrents, Alba; Hapeman, Cathleen J

    2016-08-01

    To better address how much groundwater contributes to the loadings of pollutants from agriculture, we developed a specific dating tool for groundwater residence times. This tool is based on metolachlor ethane sulfonic acid, a major soil metabolite of metolachlor. The chiral forms of metolachlor ethane sulfonic acid (MESA) and the chiral forms of metolachlor were examined over a 6-year period in samples of groundwater and water from a groundwater-fed stream in a riparian buffer zone. This buffer zone bordered cropland receiving annual treatments with metolachlor. Racemic (rac) metolachlor was applied for two years in the neighboring field; subsequently S-metolachlor, which is enriched to 88% with the S-enantiomer, was used. Chiral analyses of the samples showed an exponential increase in the abundance of the S-enantiomeric forms of MESA as a function of time, both for the first-order riparian buffer stream (R² = 0.80) and for groundwater within the riparian buffer (R² = 0.96). However, the S-enrichment values for metolachlor itself were consistently high, indicating different delivery mechanisms for MESA and metolachlor. A mean residence time of 3.8 years was determined for depletion of the initially-applied rac-metolachlor. This approach could be useful in dating groundwater and determining the effectiveness of conservation measures. A mean residence time of 3.8 years was calculated for groundwater feeding a first-order stream by plotting the timed decay of the R-enantiomer of metolachlor ethane sulfonic acid. Published by Elsevier B.V.
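
    The dating idea rests on fitting a first-order (exponential) time course; for first-order decay, the mean residence time equals 1/k. A sketch under that assumption, using synthetic data generated at k = 1/3.8 per year (the rate implied by the reported 3.8-year result) rather than the study's measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.arange(0, 7, 0.5)                 # years since the S-metolachlor switch
        true_k = 1 / 3.8
        rng = np.random.default_rng(1)
        r_frac = np.exp(-true_k * t) + rng.normal(0, 0.02, t.size)  # noisy R-fraction

        decay = lambda t, k: np.exp(-k * t)      # first-order decay model
        (k_hat,), _ = curve_fit(decay, t, r_frac, p0=[0.3])
        print(f"k = {k_hat:.3f} /yr -> mean residence time = {1 / k_hat:.1f} yr")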

  13. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Science.gov (United States)

    Rostami, Paryaneh; Ashcroft, Darren M; Tully, Mary P

    2018-01-01

    Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however, a number of

  14. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Directory of Open Access Journals (Sweden)

    Paryaneh Rostami

    Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however

  15. A decision-making tool for exchange transfusions in infants with severe hyperbilirubinemia in resource-limited settings.

    Science.gov (United States)

    Olusanya, B O; Iskander, I F; Slusher, T M; Wennberg, R P

    2016-05-01

    Late presentation and ineffective phototherapy account for excessive rates of avoidable exchange transfusions (ETs) in many low- and middle-income countries. Several system-based constraints sometimes limit the ability to provide timely ETs for all infants at risk of kernicterus, thus necessitating a treatment triage to optimize available resources. This article proposes a practical priority-setting model for term and near-term infants requiring ET after the first 48 h of life. The proposed model combines plasma/serum bilirubin estimation, clinical signs of acute bilirubin encephalopathy and neurotoxicity risk factors for predicting the risk of kernicterus based on available evidence in the literature.

  16. Self-organising maps and correlation analysis as a tool to explore patterns in excitation-emission matrix data sets and to discriminate dissolved organic matter fluorescence components.

    Science.gov (United States)

    Ejarque-Gonzalez, Elisabet; Butturini, Andrea

    2014-01-01

    Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
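
    A minimal SOM training loop, to make the clustering-plus-dimensionality-reduction step concrete: flattened EEM vectors are mapped onto a small grid so that similar fluorescence fingerprints land on neighbouring units. The inputs are random stand-ins, and the map size, learning-rate and neighbourhood schedules are arbitrary toy choices, not the paper's configuration.

        import numpy as np

        rng = np.random.default_rng(0)
        eems = rng.random((200, 300))            # 200 samples x flattened 300-pixel EEMs

        rows, cols, dim = 6, 6, eems.shape[1]
        weights = rng.random((rows, cols, dim))
        coords = np.dstack(np.mgrid[0:rows, 0:cols]).astype(float)  # unit grid positions

        for epoch in range(20):
            lr = 0.5 * (1 - epoch / 20)                     # decaying learning rate
            sigma = max(3.0 * (1 - epoch / 20), 0.5)        # shrinking neighbourhood
            for x in eems[rng.permutation(len(eems))]:
                d = np.linalg.norm(weights - x, axis=2)
                bmu = np.unravel_index(d.argmin(), d.shape)         # best-matching unit
                dist = np.linalg.norm(coords - np.array(bmu), axis=2)
                h = np.exp(-(dist ** 2) / (2 * sigma ** 2))         # neighbourhood kernel
                weights += lr * h[..., None] * (x - weights)        # pull units toward x

        print("trained SOM weight grid:", weights.shape)   # (6, 6, 300)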

  17. Digital immunohistochemistry wizard: image analysis-assisted stereology tool to produce reference data set for calibration and quality control.

    Science.gov (United States)

    Plancoulaine, Benoît; Laurinaviciene, Aida; Meskauskas, Raimundas; Baltrusaityte, Indra; Besusparis, Justinas; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital image analysis (DIA) enables better reproducibility of immunohistochemistry (IHC) studies. Nevertheless, the accuracy of DIA methods needs to be ensured, demanding the production of reference data sets. We have reported on a methodology to calibrate DIA for Ki67 IHC in breast cancer tissue based on reference data obtained by stereology grid count. To produce the reference data more efficiently, we propose a digital IHC wizard generating initial cell marks to be verified by experts. Digital images of proliferation marker Ki67 IHC from 158 patients (one tissue microarray spot per patient) with an invasive ductal carcinoma of the breast were used. Manual data (mD) were obtained by marking Ki67-positive and negative tumour cells, using a stereological method for 2D object enumeration. DIA was used as an initial step in the stereology grid count to generate the digital data (dD) marks by Aperio Genie and Nuclear algorithms. The dD were collected into XML files from the DIA markup images and overlaid on the original spots along with the stereology grid. Expert correction of the dD marks resulted in corrected data (cD). The percentages of Ki67-positive tumour cells per spot in the mD, dD, and cD sets were compared by single linear regression analysis. Efficiency of cD production was estimated based on the manual editing effort. The percentage of Ki67-positive tumour cells was in very good agreement across the mD, dD, and cD sets: regression of cD from dD (R² = 0.92) reflects the impact of the expert editing of the dD as well as the accuracy of the DIA used; regression of cD from mD (R² = 0.94) represents the consistency of the DIA-assisted ground truth (cD) with the manual procedure. Nevertheless, the accuracy of detection of individual tumour cells was much lower: on average, 18 and 219 marks per spot had to be edited due to Genie and Nuclear algorithm errors, respectively. The DIA-assisted cD production in our experiment saved approximately 2/3 of the manual marking. Digital IHC wizard

  18. (Q)SAR tools for priority setting: A case study with printed paper and board food contact material substances.

    Science.gov (United States)

    Van Bossuyt, Melissa; Van Hoeck, Els; Raitano, Giuseppa; Manganelli, Serena; Braeken, Els; Ates, Gamze; Vanhaecke, Tamara; Van Miert, Sabine; Benfenati, Emilio; Mertens, Birgit; Rogiers, Vera

    2017-04-01

    Over the last years, more stringent safety requirements for an increasing number of chemicals across many regulatory fields (e.g. industrial chemicals, pharmaceuticals, food, cosmetics, …) have triggered the need for an efficient screening strategy to prioritize the substances of highest concern. In this context, alternative methods such as in silico (i.e. computational) techniques gain more and more importance. In the current study, a new prioritization strategy for identifying potentially mutagenic substances was developed based on the combination of multiple (quantitative) structure-activity relationship ((Q)SAR) tools. Non-evaluated substances used in printed paper and board food contact materials (FCM) were selected for a case study. By applying our strategy, 106 of the 1723 substances were assigned 'high priority' as they were predicted mutagenic by 4 different (Q)SAR models. Information provided within the models made it possible to identify 53 substances for which the mutagenicity prediction is already supported by in vitro Ames test results. For further prioritization, additional support could be obtained by applying local (i.e. chemical-class-specific) models, as demonstrated here for aromatic azo compounds, typically found in printed paper and board FCM. The strategy developed here can easily be applied to other groups of chemicals facing the same need for priority ranking. Copyright © 2017 Elsevier Ltd. All rights reserved.
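
    The prioritisation rule described above reduces to a consensus vote across the four (Q)SAR outputs; a minimal sketch with invented substance identifiers and predictions:

        # substance -> (model1..model4) predictions, 1 = predicted mutagenic (toy data)
        predictions = {
            "CAS-0001": (1, 1, 1, 1),
            "CAS-0002": (1, 0, 1, 1),
            "CAS-0003": (0, 0, 0, 0),
            "CAS-0004": (1, 1, 1, 1),
        }

        # 'high priority' = predicted mutagenic by all four models
        high_priority = [s for s, votes in predictions.items() if all(votes)]
        by_concern = sorted(predictions, key=lambda s: -sum(predictions[s]))
        print("high priority:", high_priority)              # ['CAS-0001', 'CAS-0004']
        print("ranked by number of positive models:", by_concern)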

  19. From psychotherapy to e-therapy: the integration of traditional techniques and new communication tools in clinical settings.

    Science.gov (United States)

    Castelnuovo, Gianluca; Gaggioli, Andrea; Mantovani, Fabrizia; Riva, Giuseppe

    2003-08-01

    Technology is starting to influence psychological fields. In particular, computer-mediated communication (CMC) is providing new tools that can be fruitfully applied in psychotherapy. These new technologies do not substitute for traditional techniques and approaches, but they can be integrated into the clinical process, enhancing or easing particular steps of it. This paper focuses on the concept of e-therapy as a new modality for helping people resolve life and relationship issues. It utilizes the power and convenience of the Internet to allow synchronous and asynchronous communication between patient and therapist. It is important to underline that e-therapy is not an alternative treatment, but a resource that can be added to traditional psychotherapy. The paper also discusses how different forms of CMC can be fruitfully applied in psychology and psychotherapy, evaluating their effectiveness in clinical practice. To enhance the diffusion of e-therapy, further research is needed to evaluate all the pros and cons.

  20. Developing Process Maps as a Tool for a Surgical Infection Prevention Quality Improvement Initiative in Resource-Constrained Settings.

    Science.gov (United States)

    Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G

    2018-06-01

    Surgical infections cause substantial morbidity and mortality in low- and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program to improve compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were then reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring the sterility of surgical instruments through a lack of confirmatory measures, and occurrences of retained surgical items due to inadequate guidelines, staffing, and training in proper routine gauze counting. Compliance with most processes improved significantly following organizational changes to align tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation (Electricore, Inc.)

    Energy Technology Data Exchange (ETDEWEB)

    Daye, Tony [Green Power Labs (GPL), San Diego, CA (United States)]

    2013-09-30

    This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with the emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers; it extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for energy marketing (day-ahead generation commitments), real-time operations, load forecasting (at an aggregate system level for day-ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  2. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', designed for debugging programs for scientific computing, has been improved in two respects: (1) the elapsed time required to obtain the data to visualize has been shortened; (2) new functions have been added that enable comparing and/or combining sets of visualized data originating from two or more different programs. Regarding the elapsed time for getting data, the improved version of 'vdebug' achieved a speed-up of more than a factor of one hundred with dbx and pdbx on the SX-4, and of more than a factor of ten with ndb on the SR2201. Regarding the new functions to compare/combine visualized data, it was confirmed that we could easily check the consistency between the computational results obtained at each calculation step on two different computers, SP and ONYX. In this report, we illustrate how the tool 'vdebug' has been improved, with an example. (author)

  3. Novel molecular diagnostic tools for malaria elimination: a review of options from the point of view of high-throughput and applicability in resource limited settings.

    Science.gov (United States)

    Britton, Sumudu; Cheng, Qin; McCarthy, James S

    2016-02-16

    As malaria transmission continues to decrease, an increasing number of countries will enter pre-elimination and elimination. To interrupt transmission, changes in control strategies are likely to require more accurate identification of all carriers of Plasmodium parasites, both symptomatic and asymptomatic, using diagnostic tools that are highly sensitive and high-throughput and that have fast turnaround times, preferably performed in local health service settings. Currently available immunochromatographic lateral flow rapid diagnostic tests and field microscopy are unlikely to consistently detect infections at parasite densities less than 100 parasites/µL, making them insufficiently sensitive for detecting all carriers. Molecular diagnostic platforms, such as PCR and LAMP, are currently available in reference laboratories, but at a cost both financially and in turnaround time. This review describes recent progress in developing molecular diagnostic tools in terms of their capacity for high throughput and their potential for performance in non-reference laboratories for malaria elimination.

  4. Mixed methods evaluation of a quality improvement and audit tool for nurse-to-nurse bedside clinical handover in ward settings.

    Science.gov (United States)

    Redley, Bernice; Waugh, Rachael

    2018-04-01

    Nurse bedside handover quality is influenced by complex interactions related to the content, the processes used and the work environment. Audit tools are seldom tested in 'real' settings. Examine the reliability, validity and usability of a quality improvement tool for audit of nurse bedside handover. Naturalistic, descriptive, mixed-methods. Six inpatient wards at a single large not-for-profit private health service in Victoria, Australia. Five nurse experts and 104 nurses involved in 199 change-of-shift bedside handovers. A focus group with experts and a pilot test were used to examine content and face validity, and usability of the handover audit tool. The tool was examined for inter-rater reliability and usability using observation audits of handovers across six wards. Data were collected in 2013-2014. Two independent observers for 72 audits demonstrated acceptable inter-observer agreement for 27 (77%) items. Reliability was weak for items examining the handover environment. Seventeen items were not observed, reflecting gaps in practices. Across 199 observation audits, gaps in nurse bedside handover practice most often related to process and environment, rather than content items. Usability was impacted by high observer burden, familiarity and non-specific illustrative behaviours. The reliability and validity of most items to audit handover content was acceptable. Gaps in practices for process and environment items were identified. Context-specific exemplars and reducing the items used at each handover audit can enhance usability. Further research is needed to develop context-specific exemplars and undertake additional reliability testing using a wide range of handover settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Identification and implementation of end-user needs during development of a state-of-the-art modeling tool-set - 59069

    International Nuclear Information System (INIS)

    Seitz, Roger; Williamson, Mark; Gerdes, Kurt; Freshley, Mark; Dixon, Paul; Collazo, Yvette T.; Hubbard, Susan

    2012-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM tool-sets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the 'user needs interface' task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool. The ASCEM team engaged a variety of interested parties early in the development

  6. Diagnostic accuracy of WHO verbal autopsy tool for ascertaining causes of neonatal deaths in the urban setting of Pakistan: a hospital-based prospective study.

    Science.gov (United States)

    Soofi, Sajid Bashir; Ariff, Shabina; Khan, Ubaidullah; Turab, Ali; Khan, Gul Nawaz; Habib, Atif; Sadiq, Kamran; Suhag, Zamir; Bhatti, Zaid; Ahmed, Imran; Bhal, Rajiv; Bhutta, Zulfiqar Ahmed

    2015-10-05

    Globally, clinical certification of the cause of neonatal death is not commonly available in developing countries. Under such circumstances it is imperative to use the available WHO verbal autopsy tool to ascertain causes of death for strategic health planning in countries where resources are limited and the burden of neonatal death is high. This study explores the diagnostic accuracy of the WHO revised verbal autopsy tool for ascertaining the causes of neonatal deaths against a reference standard diagnosis obtained from standardized clinical and supportive hospital data. All neonatal deaths were recruited between August 2006 and February 2008 from two tertiary teaching hospitals in Sindh Province, Pakistan. The reference standard cause of death was established by two senior pediatricians within 2 days of the occurrence of death using the International Cause of Death coding system. For verbal autopsy, a trained female community health worker interviewed the mother or caretaker of the deceased within 2-6 weeks of death using a modified WHO verbal autopsy tool. The cause of death was assigned by 2 trained pediatricians. Performance was assessed in terms of sensitivity and specificity. Out of 626 neonatal deaths, cause-specific mortality fractions for neonatal deaths were almost identical between verbal autopsy and the reference standard diagnosis. The sensitivity of verbal autopsy was more than 93% for diagnosing prematurity and 83.5% for birth asphyxia. However, the verbal autopsy did not have acceptable accuracy for diagnosing congenital malformations (57%). The specificity for all five major causes of neonatal death was greater than 90%. The WHO revised verbal autopsy tool had reasonable validity in determining causes of neonatal deaths. The tool can be used in resource-limited, community-based settings where the neonatal mortality rate is high and death certificates from hospitals are not available.
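
    The reported validity figures are standard 2x2 measures; a minimal sketch computing sensitivity and specificity for one cause of death, with illustrative counts rather than the study's data:

        tp, fn = 93, 7    # reference says prematurity: verbal autopsy agrees / misses
        fp, tn = 8, 92    # reference says other cause: verbal autopsy wrongly assigns / agrees

        sensitivity = tp / (tp + fn)   # true positives among reference positives
        specificity = tn / (tn + fp)   # true negatives among reference negatives
        print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")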

  7. Detection of urinary tract infection (UTI) in long-term care setting: Is the multireagent strip an adequate diagnostic tool?

    Science.gov (United States)

    Arinzon, Zeev; Peisakh, Alexander; Shuval, Ishay; Shabat, Shay; Berner, Yitshal N

    2009-01-01

    Urinary tract infection (UTI) is one of the most commonly diagnosed and treated infections in elderly residents of long-term care (LTC) settings, and most cases are asymptomatic. Early diagnosis and treatment, especially in this group of patients, is very important because even a brief delay contributes to mortality as well as to functional and cognitive decline. The purpose of the present study was to determine the validity of multireagent strips (Multistix 10 SG, Bayer, UK) compared with standard urinalysis for the early detection of UTI in LTC elderly patients. Urine specimens were examined for the presence of leukocyte esterase (LE) activity as an indicator of pyuria, nitrite production as an indicator of bacteriuria, erythrocytes (RBC), and protein. The sensitivity, specificity, predictive value, kappa agreement, and likelihood ratio were determined for each of the four dipstick parameters separately and for four combinations of them, calculated against the urine culture for the diagnosis of UTI and asymptomatic bacteriuria. Ninety-six patients aged 65 years and older with symptomatic UTI were compared with an equal number of age-, sex- and comorbidity-matched patients with asymptomatic bacteriuria. In both groups, urine culture results were compared with the results of the multireagent strips. The multireagent strip results were evaluated for the presence of LE activity as an indicator of pyuria, nitrite production as an indicator of bacteriuria, RBC, and protein. All positive strip results were evaluated as single parameters and in combination. Positive urine cultures were found in 71% (68/96) of the patients with symptomatic UTI and in 60% (58/96; p>0.05) of patients with asymptomatic bacteriuria. In patients with UTI, kappa agreement for the multireagent strips was 0.53 for LE, 0.14 for nitrite, and 0.31 for their combination. Similar results were obtained in patients with asymptomatic bacteriuria: 0.35, 0.23, and 0.35, respectively. The detection of

  8. Development of the policy indicator checklist: a tool to identify and measure policies for calorie-dense foods and sugar-sweetened beverages across multiple settings.

    Science.gov (United States)

    Lee, Rebecca E; Hallett, Allen M; Parker, Nathan; Kudia, Ousswa; Kao, Dennis; Modelska, Maria; Rifai, Hanadi; O'Connor, Daniel P

    2015-05-01

    We developed the policy indicator checklist (PIC) to identify and measure policies for calorie-dense foods and sugar-sweetened beverages to determine how policies are clustered across multiple settings. In 2012 and 2013 we used existing literature, policy documents, government recommendations, and instruments to identify key policies. We then developed the PIC to examine the policy environments across 3 settings (communities, schools, and early care and education centers) in 8 communities participating in the Childhood Obesity Research Demonstration Project. Principal components analysis revealed 5 components related to calorie-dense food policies and 4 components related to sugar-sweetened beverage policies. Communities with higher youth and racial/ethnic minority populations tended to have fewer and weaker policy environments concerning calorie-dense foods and healthy foods and beverages. The PIC was a helpful tool to identify policies that promote healthy food environments across multiple settings and to measure and compare the overall policy environments across communities. There is need for improved coordination across settings, particularly in areas with greater concentration of youths and racial/ethnic minority populations. Policies to support healthy eating are not equally distributed across communities, and disparities continue to exist in nutrition policies.
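
    A sketch of the components-analysis step mentioned above: principal components analysis over binary policy-item scores shows how items group into components. The item matrix is random stand-in data, not the PIC survey results, and the matrix dimensions are arbitrary.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        items = rng.integers(0, 2, size=(40, 12)).astype(float)  # 40 sites x 12 policy items

        pca = PCA(n_components=5).fit(items)
        print("variance explained:", np.round(pca.explained_variance_ratio_, 2))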

  9. Disease management index of potential years of life lost as a tool for setting priorities in national disease control using OECD health data.

    Science.gov (United States)

    Jang, Sung-In; Nam, Jung-Mo; Choi, Jongwon; Park, Eun-Cheol

    2014-03-01

    Limited healthcare resources make it necessary to maximize efficiency in disease management at the country level by priority-setting according to disease burden. To make the best priority settings, it is necessary to measure health status, to have standards for judging it, and to consider disease management trends among nations. We used potential years of life lost (YPLL) for 17 International Classification of Diseases (ICD) categories from Organization for Economic Co-operation and Development (OECD) health data for 2012, YPLL for 37 disease diagnoses from OECD health data for 2009 across 22 countries, and disability-adjusted life years (DALY) from the World Health Organization (WHO). We set a range of -1 to 1 for each YPLL per disease in a nation (position value for relative comparison, PARC). Changes over 5 years were also accounted for in the disease management index (DMI). In terms of ICD categories, the DMI indicated specific areas for priority setting in different countries with regard to managing disease treatment and diagnosis. Our study suggests that the DMI is a realistic index that reflects trend changes over the past 5 years up to the present state, and that PARC is an easy index for identifying relative status. Moreover, unlike existing indices, DMI and PARC make it easy to conduct multiple comparisons among countries and diseases. DMI and PARC are therefore useful tools for policy implications and for future studies incorporating them and other existing indexes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
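
    The abstract does not give the PARC formula explicitly; a plausible reading, sketched below, is min-max rescaling of each disease's YPLL across countries onto [-1, 1], so the country with the lowest YPLL for a disease sits at -1 and the highest at +1. The YPLL values are toy numbers, not OECD data.

        # Toy YPLL values for one disease across countries (not OECD data).
        ypll = {"AUS": 310, "DEU": 280, "JPN": 240, "KOR": 350, "USA": 420}

        lo, hi = min(ypll.values()), max(ypll.values())
        parc = {c: 2 * (v - lo) / (hi - lo) - 1 for c, v in ypll.items()}  # scale to [-1, 1]
        for country, score in sorted(parc.items(), key=lambda kv: kv[1]):
            print(f"{country}: {score:+.2f}")   # lowest-YPLL country first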

  10. De-MetaST-BLAST: a tool for the validation of degenerate primer sets and data mining of publicly available metagenomes.

    Directory of Open Access Journals (Sweden)

    Christopher A Gulvik

    Full Text Available Development and use of primer sets to amplify nucleic acid sequences of interest is fundamental to studies spanning many life science disciplines. As such, the validation of primer sets is essential. Several computer programs have been created to aid in the initial selection of primer sequences that may or may not require multiple nucleotide combinations (i.e., degeneracies). Conversely, validation of primer specificity has remained largely unchanged for several decades, and there are currently few available programs that allow for an evaluation of primers containing degenerate nucleotide bases. To close this gap, we developed the program De-MetaST, which performs an in silico amplification using user-defined nucleotide sequence dataset(s) and primer sequences that may contain degenerate bases. The program returns an output file that contains the in silico amplicons. When De-MetaST is paired with NCBI's BLAST (De-MetaST-BLAST), the program also returns the top 10 nr NCBI database hits for each recovered in silico amplicon. While the original motivation for development of this search tool was degenerate primer validation using the wealth of nucleotide sequences available in environmental metagenome and metatranscriptome databases, this search tool has potential utility in many data mining applications.
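
    As a sketch of what in silico amplification with degenerate primers involves (this is not the De-MetaST code; the template and primers below are made up), IUPAC degeneracy codes can be expanded into a regular expression and matched against each sequence:

        # Expand IUPAC degeneracy codes to a regex and report amplicons lying
        # between the forward primer and the reverse complement of the reverse
        # primer on the template strand.
        import re

        IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "[AG]", "Y": "[CT]",
                 "S": "[CG]", "W": "[AT]", "K": "[GT]", "M": "[AC]", "B": "[CGT]",
                 "D": "[AGT]", "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]"}
        COMP = str.maketrans("ACGTRYSWKMBDHVN", "TGCAYRSWMKVHDBN")

        def to_regex(primer):
            return "".join(IUPAC[b] for b in primer.upper())

        def revcomp(seq):
            return seq.upper().translate(COMP)[::-1]

        def in_silico_pcr(template, fwd, rev):
            pattern = to_regex(fwd) + "[ACGT]*?" + to_regex(revcomp(rev))
            return [m.group(0) for m in re.finditer(pattern, template)]

        print(in_silico_pcr("GGACGTAAATTTCCCGGGACGTCC", "GGACGTR", "GGACGTC"))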

  11. Development of the Quality Improvement Minimum Quality Criteria Set (QI-MQCS): a tool for critical appraisal of quality improvement intervention publications.

    Science.gov (United States)

    Hempel, Susanne; Shekelle, Paul G; Liu, Jodi L; Sherwood Danz, Margie; Foy, Robbie; Lim, Yee-Wei; Motala, Aneesa; Rubenstein, Lisa V

    2015-12-01

    Valid, reliable critical appraisal tools advance quality improvement (QI) intervention impacts by helping stakeholders identify higher quality studies. QI approaches are diverse and differ from clinical interventions. Widely used critical appraisal instruments do not take unique QI features into account, and existing QI tools (eg, Standards for QI Reporting Excellence) are intended for publication guidance rather than critical appraisal. This study developed and psychometrically tested a critical appraisal instrument, the QI Minimum Quality Criteria Set (QI-MQCS), for assessing QI-specific features of QI publications. Approaches to developing the tool and ensuring validity included a literature review, expert panel input gathered in person and through online surveys, and application to empirical examples. We investigated psychometric properties in a set of diverse QI publications (N=54) by analysing reliability measures and item endorsement rates and by exploring sources of disagreement between reviewers. The QI-MQCS includes 16 content domains to evaluate QI intervention publications: Organisational Motivation, Intervention Rationale, Intervention Description, Organisational Characteristics, Implementation, Study Design, Comparator Description, Data Sources, Timing, Adherence/Fidelity, Health Outcomes, Organisational Readiness, Penetration/Reach, Sustainability, Spread and Limitations. Median inter-rater agreement for QI-MQCS items was κ = 0.57 (83% agreement). Item statistics indicated sufficient ability to differentiate between publications (median quality criteria met 67%). Internal consistency measures indicated coherence without excessive conceptual overlap (absolute mean interitem correlation=0.19). The critical appraisal instrument is accompanied by a user manual detailing What to consider, Where to look and How to rate. We developed a ready-to-use, valid and reliable critical appraisal instrument applicable to healthcare QI intervention publications, but recognise scope for

  12. To select the best tool for generating 3D maintenance data and to set the detailed process for obtaining the 3D maintenance data

    Science.gov (United States)

    Prashanth, B. N.; Roy, Kingshuk

    2017-07-01

    Three Dimensional (3D) maintenance data provides a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to page through huge paper manuals, or to keep returning to a computer, while doing maintenance on a machine, which makes the work fatiguing. A 3D animation, by contrast, makes maintenance work much simpler, since there is no language barrier. The research deals with the generation of 3D maintenance data for any given machine. The best tool for obtaining the 3D maintenance data is selected and analyzed, and, using the same tool, a detailed process for extracting the 3D maintenance data for any machine is set out. This project thus aims at selecting the best tool for obtaining 3D maintenance data and at defining the detailed process for obtaining it. 3D maintenance data reduces reliance on large volumes of manuals, which invite human error and make the operator's work fatiguing; it would therefore help in training and maintenance and would increase productivity. Compared with Cortona 3D and Deep Exploration, 3Dvia proves to be the better tool: it is good at data translation and has the best renderings of the three 3D maintenance software packages. 3Dvia is also very user friendly and offers various options for creating 3D animations, and its Interactive Electronic Technical Publication (IETP) integration is better than that of the other two packages. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.

  13. Hope in severe disease: a review of the literature on the construct and the tools for assessing hope in the psycho-oncologic setting.

    Science.gov (United States)

    Piccinelli, Claudia; Clerici, Carlo Alfredo; Veneroni, Laura; Ferrari, Andrea; Proserpio, Tullio

    2015-01-01

    Research on the topic of hope began long ago but, more recently, interest in this construct has focused mainly on the development of psychometric tools for its assessment. The present article proceeds in 2 steps: defining the construct of hope through a preliminary review of the literature, and analyzing the tools used to assess hope in the setting of oncologic medicine through a systematic review of the existing scientific literature. Our study was accordingly conducted in 2 stages. The first stage involved a nonsystematic preliminary review of the literature, the second a systematic search of all the medical journals contained in the Medline database as of 2012. The literature identified at the first stage was divided into several topical categories, i.e., theoretical, empirical, and clinical works on the construct of hope. In the second, systematic search, we identified the main psychometric tools used to measure hope in the field of clinical oncology and assessed their validity. A total of 22 articles were identified. Pooling the findings of our 2 lines of research showed that, despite its broad theoretical definitions, the construct of hope can be broken down into a few constituent elements when hope is studied using currently available psychometric tools. In particular, these constituent elements were coping, spiritual well-being, quality of life, distress, and depression. The factors contained in the construct of hope include temporality, future, expectancy, motivation, and interconnectedness. The review of the scientific literature does not reveal a clear definition of hope. Multidisciplinary studies are needed to bring different perspectives (medical, psychological, spiritual, theological) into dialogue for a better definition of the constituent elements of hope, in order to support hope with specific interventions.

  14. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand.

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-12-01

    Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Three different community settings were selected and randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected from among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. A significant reduction in the pupae per person index in the intervention clusters compared with the control clusters was used as a proxy for the impact of this programme. Our community-based dengue vector control programme demonstrated a significant reduction in the pupae per person index during entomological surveys conducted at two-month intervals from May 2010, for a total of six months, in the intervention and control clusters. The programme also raised awareness of eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. An eco-friendly dengue vector control programme was successfully implemented in

  15. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-01-01

    Background Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Methodology Three different community settings were selected and randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected from among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. A significant reduction in the pupae per person index in the intervention clusters compared with the control clusters was used as a proxy for the impact of this programme. Results Our community-based dengue vector control programme demonstrated a significant reduction in the pupae per person index during entomological surveys conducted at two-month intervals from May 2010, for a total of six months, in the intervention and control clusters. The programme also raised awareness of eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. Conclusion An eco-friendly dengue vector control

  16. Food marketing in recreational sport settings in Canada: a cross-sectional audit in different policy environments using the Food and beverage Marketing Assessment Tool for Settings (FoodMATS).

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Storey, Kate; Mâsse, Louise C; Kirk, Sara F L; Raine, Kim D

    2018-05-31

    Children's recreational sport settings typically sell energy-dense, low-nutrient products; however, it is unknown whether the same types of food and beverages are also marketed in these settings. Understanding food marketing in sports settings is important because the food industry often uses the promotion of physical activity to justify its products. This study aimed to document the 'exposure' and 'power' of food marketing present in public recreation facilities in Canada and assess differences between provinces with and without voluntary provincial nutrition guidelines for recreation facilities. Food marketing was measured in 51 sites using the Food and beverage Marketing Assessment Tool for Settings (FoodMATS). The frequency and repetition ('exposure') of food marketing and the presence of select marketing techniques, including child-targeted, sports-related, size, and healthfulness ('power'), were assessed. Differences in 'exposure' and 'power' characteristics between sites in three guideline provinces (n = 34) and a non-guideline province (n = 17) were assessed using Pearson's chi-squared tests of homogeneity and Mann-Whitney U tests. Ninety-eight percent of sites had food marketing present. The frequency of food marketing per site did not differ between guideline and non-guideline provinces (median = 29; p = 0.576). Sites from guideline provinces had a significantly lower proportion of food marketing occasions that were "Least Healthy" (47.9%) than sites from the non-guideline province (73.5%). The presence of child-targeted and sports-related food marketing techniques was significantly higher in sites from guideline provinces (9.5% and 10.9%, respectively) than in the non-guideline province (1.9% and 4.5%, respectively). Having voluntary provincial nutrition guidelines that recommend provision of healthier foods was not related to the frequency of food marketing in recreation facilities but was associated with less frequent marketing of unhealthy foods. Policy

  17. Development of a tool to measure person-centered maternity care in developing settings: validation in a rural and urban Kenyan population.

    Science.gov (United States)

    Afulani, Patience A; Diamond-Smith, Nadia; Golub, Ginger; Sudhinaraset, May

    2017-09-22

    Person-centered reproductive health care is recognized as critical to improving reproductive health outcomes. Yet little research exists on how to operationalize it. We extend the literature in this area by developing and validating a tool to measure person-centered maternity care. We describe the process of developing the tool and present the results of psychometric analyses to assess its validity and reliability in a rural and an urban setting in Kenya. We followed standard procedures for scale development. First, we reviewed the literature to define our construct and identify domains, and developed items to measure each domain. Next, we conducted expert reviews to assess content validity, and cognitive interviews with potential respondents to assess clarity, appropriateness, and relevance of the questions. The questions were then refined and administered in surveys, and the survey results were used to assess construct and criterion validity and reliability. The exploratory factor analysis yielded one dominant factor in both the rural and urban settings. Three factors with eigenvalues greater than one were identified for the rural sample and four factors for the urban sample. Thirty of the 38 items administered in the survey were retained based on the factor loadings and correlations between the items. Twenty-five items load very well onto a single factor in both the rural and urban samples, with five items loading well in either the rural or the urban sample, but not in both. These 30 items also load on three sub-scales that we created to measure dignified and respectful care, communication and autonomy, and supportive care. The Cronbach alpha for the main scale is greater than 0.8 in both samples, and those for the sub-scales are between 0.6 and 0.8. The main scale and sub-scales are correlated with global measures of satisfaction with maternity services, suggesting criterion validity. We present a 30-item scale with three sub-scales to measure person
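
    For readers unfamiliar with the reliability statistic quoted above, Cronbach's alpha has a standard closed form; a minimal sketch follows (with fabricated responses, not the Kenyan survey data):

        # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
        # the summed scale). Rows are respondents, columns are scale items.
        import numpy as np

        def cronbach_alpha(items):
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(1)
        responses = rng.integers(0, 4, size=(50, 30)).astype(float)  # fabricated
        print(round(cronbach_alpha(responses), 3))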

  18. Using the Lives Saved Tool (LiST) to Model mHealth Impact on Neonatal Survival in Resource-Limited Settings

    Science.gov (United States)

    Jo, Youngji; Labrique, Alain B.; Lefevre, Amnesty E.; Mehl, Garrett; Pfaff, Teresa; Walker, Neff; Friberg, Ingrid K.

    2014-01-01

    While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale-up efforts and investment in ways that help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST), an evidence-based modeling tool, to identify priority areas for maternal and neonatal health services by formulating six individual and combined intervention scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives. PMID:25014008

  19. Using the lives saved tool (LiST) to model mHealth impact on neonatal survival in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Youngji Jo

    Full Text Available While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale-up efforts and investment in ways that help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST), an evidence-based modeling tool, to identify priority areas for maternal and neonatal health services by formulating six individual and combined intervention scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives.

  20. Integrated Variable-Fidelity Tool Set for Modeling and Simulation of Aeroservothermoelasticity-Propulsion (ASTE-P) Effects for Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  1. PeptideNavigator: An interactive tool for exploring large and complex data sets generated during peptide-based drug design projects.

    Science.gov (United States)

    Diller, Kyle I; Bayden, Alexander S; Audie, Joseph; Diller, David J

    2018-01-01

    There is growing interest in peptide-based drug design and discovery. Due to their relatively large size, polymeric nature, and chemical complexity, the design of peptide-based drugs presents an interesting "big data" challenge. Here, we describe an interactive computational environment, PeptideNavigator, for naturally exploring the tremendous amount of information generated during a peptide drug design project. The purpose of PeptideNavigator is the presentation of large and complex experimental and computational data sets, particularly 3D data, so as to enable multidisciplinary scientists to make optimal decisions during a peptide drug discovery project. PeptideNavigator provides users with numerous viewing options, such as scatter plots, sequence views, and sequence frequency diagrams. These views allow for the collective visualization and exploration of many peptides and their properties, ultimately enabling the user to focus on a small number of peptides of interest. To drill down into the details of individual peptides, PeptideNavigator provides users with a Ramachandran plot viewer and a fully featured 3D visualization tool. Each view is linked, allowing the user to seamlessly navigate from collective views of large peptide data sets to the details of individual peptides with promising property profiles. Two case studies, based on MHC-1A activating peptides and MDM2 scaffold design, are presented to demonstrate the utility of PeptideNavigator in the context of disparate peptide-design projects. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Hypogeal geological survey in the "Grotta del Re Tiberio" natural cave (Apennines, Italy): a valid tool for reconstructing the structural setting

    Science.gov (United States)

    Ghiselli, Alice; Merazzi, Marzio; Strini, Andrea; Margutti, Roberto; Mercuriali, Michele

    2011-06-01

    As karst systems are natural windows to the underground, speleology, combined with geological surveys, can be a useful tool for understanding the geological evolution of karst areas. In order to enhance the reconstruction of the structural setting in a gypsum karst area (Vena del Gesso, Romagna Apennines), a detailed analysis has been carried out on hypogeal data. Structural features (faults, fractures, tectonic foliations, bedding) have been mapped in the "Grotta del Re Tiberio" cave, in the nearby gypsum quarry tunnels and in the open-pit benches. Five fracture systems and six fault systems have been identified. The fault systems have been further analyzed through stereographic projections and geometric-kinematic evaluations in order to reconstruct the relative chronology of these structures. This analysis led to the detection of two deformation phases. The results permitted linking of the hypogeal data with the surface data at both the local and regional scales. At the local scale, fracture data collected underground have been compared with previous authors' surface data from the quarry area. The two data sets show a very good correspondence, as every underground fracture system matches one of the surface fracture systems. Moreover, in the cave, a larger number of fractures belonging to each system could be mapped. At the regional scale, the two deformation phases detected can be integrated into the structural setting of the study area, thereby enhancing the tectonic interpretation of the area (e.g., structures belonging to a new deformation phase, not reported before, have been identified underground). The detailed structural hypogeal survey has thus provided very useful data, both by integrating the existing information and by revealing new data not detected at the surface. In particular, some small structures (e.g., displacement markers and short fractures) are better preserved in the hypogeal environment than on the surface where the outcropping

  3. Interactive Electronic Decision Trees for the Integrated Primary Care Management of Febrile Children in Low Resource Settings - Review of existing tools.

    Science.gov (United States)

    Keitel, Kristina; D'Acremont, Valérie

    2018-04-20

    The lack of effective, integrated diagnostic tools poses a major challenge to the primary care management of febrile childhood illnesses. These limitations are especially evident in low-resource settings and are often inappropriately compensated for by antimicrobial over-prescription. Interactive electronic decision trees (IEDTs) have the potential to close these gaps: guiding antibiotic use and better identifying serious disease. This narrative review summarizes existing IEDTs to provide an overview of their degree of validation and to identify gaps in current knowledge and prospects for future innovation. A structured literature review in PubMed and Embase was complemented by Google searches and contact with developers. Ten integrated IEDTs were identified: three (eIMCI, REC, and Bangladesh digital IMCI) based on Integrated Management of Childhood Illnesses (IMCI); four (SL eCCM, MEDSINC, e-iCCM, and D-Tree eCCM) on Integrated Community Case Management (iCCM); two (ALMANACH, MSFeCARE) with a modified IMCI content; and one (ePOCT) that integrates novel content with biomarker testing. The types of publications and evaluation studies varied greatly: the content and evidence base was published for two (ALMANACH and ePOCT), and ALMANACH and ePOCT were validated in efficacy studies. Other types of evaluations, such as compliance and acceptability, were available for D-Tree eCCM, eIMCI, and ALMANACH. Several evaluations are still ongoing. Future prospects include conducting effectiveness and impact studies using data gathered through larger studies, adapting the medical content to local epidemiology, improving the software and sensors, and assessing factors that influence compliance and scale-up. IEDTs are valuable tools that have the potential to improve management of febrile children in primary care and increase the rational use of diagnostics and antimicrobials. Next steps in the evidence pathway should be larger effectiveness and impact studies (including cost analysis) and

  4. Evaluating the Implementation and Feasibility of a Web-Based Tool to Support Timely Identification and Care for the Frail Population in Primary Healthcare Settings

    Directory of Open Access Journals (Sweden)

    Beverley Lawson

    2017-07-01

    Full Text Available Background Understanding and addressing the needs of frail persons is an emerging health priority for Nova Scotia and internationally. Primary healthcare (PHC) providers regularly encounter frail persons in their daily clinical work. However, routine identification and measurement of frailty is not standard practice and, in general, there is a lack of awareness about how to identify and respond to frailty. A web-based tool called the Frailty Portal was developed to aid in identifying, screening, and providing care for frail patients in PHC settings. In this study, we will assess the implementation feasibility and impact of the Frailty Portal to: (1) support increased awareness of frailty among providers and patients, (2) identify the degree of frailty within individual patients, and (3) develop and deliver actions to respond to frailty in community PHC practice. Methods This study will be approached using a convergent mixed-method design in which quantitative and qualitative data are collected concurrently, in this case over a 9-month period, analyzed separately, and then merged to summarize, interpret and produce a more comprehensive understanding of the initiative's feasibility and scalability. Methods will be informed by the 'Implementing the Frailty Portal in Community Primary Care Practice' logic model, and questions will be guided by domains and constructs from an implementation science framework, the Consolidated Framework for Implementation Research (CFIR). Discussion The 'Frailty Portal' aims to improve access to, and coordination of, primary care services for persons experiencing frailty. It also aims to increase primary care providers' ability to care for patients in the context of their frailty. Our goal is to help optimize care in the community by helping community providers gain the knowledge they may lack about frailty, both in general and in their practice, and support improved identification of frailty with the use of screening

  5. Combined Hydrologic (AGWA-KINEROS2) and Hydraulic (HEC2) Modeling for Post-Fire Runoff and Inundation Risk Assessment through a Set of Python Tools

    Science.gov (United States)

    Barlow, J. E.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.

    2016-12-01

    Wildfires in the Western United States can alter landscapes by removing vegetation and changing soil properties. These altered landscapes produce more runoff than pre-fire landscapes, which can lead to post-fire flooding that damages infrastructure and impairs natural resources. Resources, structures, historical artifacts, and other assets that could be impacted by increased runoff are considered values at risk. The Automated Geospatial Watershed Assessment tool (AGWA) allows users to quickly set up and execute the Kinematic Runoff and Erosion model (KINEROS2 or K2) in the ESRI ArcMap environment. The AGWA-K2 workflow leverages the visualization capabilities of GIS to facilitate rapid watershed assessments for post-fire planning efforts. High relative change in peak discharge, as simulated by K2, provides a visual and numeric indicator of those channels in the watershed that should be evaluated for more detailed analysis, especially if values at risk are within or near a given channel. Modeling inundation extent along a channel would provide more specific guidance about risk along that channel. HEC-2 and HEC-RAS can be used for hydraulic modeling efforts at the reach and river-system scale. These models have been used to delineate flood boundaries and, accordingly, flood risk. However, data collection and organization for hydraulic models can be time consuming, and therefore a combined hydrologic-hydraulic modeling approach is not often employed for rapid assessments. A simplified approach could streamline this process and provide managers with a simple workflow and tool to perform a quick risk assessment for a single reach. By focusing on a single reach highlighted by a large relative change in peak discharge, data collection efforts can be minimized and the hydraulic computations can be performed to supplement risk analysis. The incorporation of hydraulic analysis through a suite of Python tools (as outlined by HEC-2) with AGWA-K2 will allow more rapid
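
    The screening indicator described above reduces to a simple percent change in simulated peak discharge; a sketch of that check follows (reach IDs, discharges, and the flagging threshold are all invented for illustration, and this is not the AGWA code):

        # Flag reaches whose simulated peak discharge changed markedly from the
        # pre-fire to the post-fire K2 run (values in m^3/s, fabricated).
        def relative_change(pre, post):
            return 100.0 * (post - pre) / pre

        peaks = {"reach_01": (4.2, 9.8), "reach_02": (3.1, 3.6), "reach_03": (5.5, 21.0)}
        flagged = {r: round(relative_change(q0, q1), 1)
                   for r, (q0, q1) in peaks.items()
                   if relative_change(q0, q1) > 100.0}  # threshold is an assumption
        print(flagged)  # reaches where peak discharge more than doubled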

  6. Nurse manager and student nurse perceptions of the use of personal smartphones or tablets and the adjunct applications, as an educational tool in clinical settings.

    Science.gov (United States)

    McNally, George; Frey, Rosemary; Crossan, Michael

    2017-03-01

    Personally owned handheld referencing technology, such as smartphones or tablets and the adjunct applications (apps) that run on them, is becoming a part of everyday life for the New Zealand population. In common with the population at large, student nurses have embraced this technology since the advent of the Apple iPhone in 2010. Little is known internationally or in New Zealand about the ways student nurses may apply personally owned handheld referencing technology to their education. No research on the perceptions of New Zealand nurse managers toward personally owned handheld referencing technology could be located. Using a qualitative descriptive methodology, semi-structured interviews were conducted with New Zealand student nurses (n = 13) and nurse managers (n = 5) about their perceptions of the use of personally owned handheld referencing technology as an educational tool in clinical settings. A thematic analysis was conducted on the resulting text. Student nurses said they wanted to use their own handheld referencing technology to support clinical decisions. Nurse managers perceived the use of personally owned handheld referencing technology as unprofessional, and did not trust younger cohorts of student nurses to act ethically when using this technology. This research supports historical findings, from the student perspective, about the usefulness of older handheld referencing devices to augment clinical decisions. However, due to perceptions held by nurse managers regarding professional behaviour, safety, and the perceived institutional costs of managing personally owned handheld referencing technology, the practice may remain problematic in the studied setting. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Utility of the heteroduplex assay (HDA) as a simple and cost-effective tool for the identification of HIV type 1 dual infections in resource-limited settings.

    Science.gov (United States)

    Powell, Rebecca L R; Urbanski, Mateusz M; Burda, Sherri; Nanfack, Aubin; Kinge, Thompson; Nyambi, Phillipe N

    2008-01-01

    The predominance of unique recombinant forms (URFs) of HIV-1 in Cameroon suggests that dual infection, the concomitant or sequential infection with genetically distinct HIV-1 strains, occurs frequently in this region; yet, identifying dual infection among large HIV cohorts in local, resource-limited settings is uncommon, since this generally relies on labor-intensive and costly sequencing methods. Consequently, there is a need to develop an effective, cost-efficient method appropriate to the developing world to identify these infections. In the present study, the heteroduplex assay (HDA) was used to verify dual or single infection status, as shown by traditional sequence analysis, for 15 longitudinally sampled study subjects from Cameroon. Heteroduplex formation, indicative of a dual infection, was identified for all five study subjects shown by sequence analysis to be dually infected. Conversely, heteroduplex formation was not detectable for all 10 HDA reactions of the singly infected study subjects. These results suggest that the HDA is a simple yet powerful and inexpensive tool for the detection of both intersubtype and intrasubtype dual infections, and that the HDA harbors significant potential for reliable, high-throughput screening for dual infection. As these infections and the recombinants they generate facilitate leaps in HIV-1 evolution, and may present major challenges for treatment and vaccine design, this assay will be critical for monitoring the continuing pandemic in regions of the world where HIV-1 viral diversity is broad.

  8. MCM generator: a Java-based tool for generating medical metadata.

    Science.gov (United States)

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need for a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps toward this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers core to the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as image, movie, and sound files.
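
    The 15 Dublin Core elements mentioned above map directly onto HTML META tags; a small sketch of the kind of output such a generator produces (the real MCM generator is a Java tool backed by SAPHIRE for MeSH lookup, so this Python fragment and its sample values are illustrative only):

        # Emit Dublin Core META tags for whichever of the 15 elements are supplied.
        from html import escape

        DC_ELEMENTS = ["title", "creator", "subject", "description", "publisher",
                       "contributor", "date", "type", "format", "identifier",
                       "source", "language", "relation", "coverage", "rights"]

        def dc_meta_tags(values):
            return "\n".join(f'<meta name="DC.{name}" content="{escape(values[name])}">'
                             for name in DC_ELEMENTS if name in values)

        print(dc_meta_tags({"title": "Chest radiograph atlas",
                            "subject": "Radiography, Thoracic",  # a MeSH heading
                            "type": "image"}))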

  9. Reliability of Patient-Led Screening with the Malnutrition Screening Tool: Agreement between Patient and Health Care Professional Scores in the Cancer Care Ambulatory Setting.

    Science.gov (United States)

    Di Bella, Alexandra; Blake, Claire; Young, Adrienne; Pelecanos, Anita; Brown, Teresa

    2018-02-01

    The prevalence of malnutrition in patients with cancer is reported as high as 60% to 80%, and malnutrition is associated with lower survival, reduced response to treatment, and poorer functional status. The Malnutrition Screening Tool (MST) is a validated tool when administered by health care professionals; however, it has not been evaluated for patient-led screening. This study aims to assess the reliability of patient-led MST screening through assessment of inter-rater reliability between patient-led and dietitian-researcher-led screening and intra-rater reliability between an initial and a repeat patient screening. This cross-sectional study included 208 adults attending ambulatory cancer care services in a metropolitan teaching hospital in Queensland, Australia, in October 2016 (n=160 inter-rater reliability; n=48 intra-rater reliability measured in a separate sample). Primary outcome measures were MST risk categories (MST 0-1: not at risk, MST ≥2: at risk) as determined by screening completed by patients and a dietitian-researcher, patient test-retest screening, and patient acceptability. Percent and chance-corrected agreement (Cohen's kappa coefficient, κ) were used to determine agreement between patient-MST and dietitian-MST (inter-rater reliability) and MST completed by patient on admission to unit (patient-MSTA) and MST completed by patient 1 to 3 hours after completion of initial MST (patient-MSTB) (intra-rater reliability). High inter-rater reliability and intra-rater reliability were observed. Agreement between patient-MST and dietitian-MST was 96%, with "almost perfect" chance-adjusted agreement (κ=0.92, 95% CI 0.84 to 0.97). Agreement between repeated patient-MSTA and patient-MSTB was 94%, with "almost perfect" chance-adjusted agreement (κ=0.88, 95% CI 0.71 to 1.00). Based on dietitian-MST, 33% (n=53) of patients were identified as being at risk for malnutrition, and 40% of these reported not seeing a dietitian. Of 156 patients who provided
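
    The agreement statistics reported above are percent agreement and Cohen's kappa over the binary MST risk categories; a minimal sketch (ratings fabricated, not study data):

        # Cohen's kappa: (observed agreement - chance agreement) / (1 - chance).
        from collections import Counter

        def cohen_kappa(a, b):
            n = len(a)
            po = sum(x == y for x, y in zip(a, b)) / n               # observed
            ca, cb = Counter(a), Counter(b)
            pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2  # by chance
            return (po - pe) / (1 - pe)

        patient   = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]  # 1 = at risk (MST >= 2)
        dietitian = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
        print(round(cohen_kappa(patient, dietitian), 2))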

  10. Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    Science.gov (United States)

    DeLuccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long-term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within-frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat-based truth images to the fixed-grid coordinate system for each of the three GOES-R satellite locations: operational East, operational West, and initial checkout. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
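
    The "3-sigma" metric defined above is just a high percentile of accumulated error magnitudes; a sketch under that reading (the error samples are synthetic, not GOES-R data):

        # Estimate a 3-sigma (99.73rd percentile) INR error from 24 hours of
        # accumulated registration-error magnitudes.
        import numpy as np

        rng = np.random.default_rng(2)
        errors_urad = np.abs(rng.normal(0.0, 5.0, size=20_000))  # synthetic |errors|

        three_sigma = np.percentile(errors_urad, 99.73)
        print(f"3-sigma NAV error: {three_sigma:.2f} microradians")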

  11. Neonatal Early Warning Tools for recognising and responding to clinical deterioration in neonates cared for in the maternity setting: A retrospective case-control study.

    Science.gov (United States)

    Paliwoda, Michelle; New, Karen; Bogossian, Fiona

    2016-09-01

    All newborns are at risk of deterioration as a result of failing to make the transition to extrauterine life. Signs of deterioration can be subtle and easily missed. It has been postulated that the use of an Early Warning Tool may assist clinicians in recognising and responding to signs of deterioration earlier in neonates, thereby preventing serious adverse events. To examine whether observations from a Standard Observation Tool, applied to three neonatal Early Warning Tools, would hypothetically trigger an escalation of care more frequently than actual escalation of care using the Standard Observation Tool. A retrospective case-control study. A maternity unit in a tertiary public hospital in Australia. Neonates born in 2013 at greater than or equal to 34+0 weeks' gestation, admitted directly to the maternity ward from their birthing location and whose subsequent deterioration required admission to the neonatal unit, were identified as cases from databases of the study hospital. Each case was matched with three controls, inborn during the same period, who did not experience deterioration and neonatal unit admission. Clinical and physiological data recorded on a Standard Observation Tool, from the time of admission to the maternity ward, for cases and controls were charted onto each of three Early Warning Tools. The primary outcome was whether the tool 'triggered an escalation of care'. Descriptive statistics (n, %, mean, and SD) were employed. Cases (n=26) comprised late preterm, early term, and post-term neonates, matched by gestational age group with 3 controls each (n=78). Overall, the Standard Observation Tool triggered an escalation of care for 92.3% of cases, compared with the Early Warning Tools: New South Wales Health 80.8%, United Kingdom Newborn Early Warning Chart 57.7%, and the Australian Capital Territory Neonatal Early Warning Score 11.5%. Subgroup analysis by gestational age found differences between the tools in hypothetically triggering an escalation of

  12. Development and validation of an observation tool for the assessment of nursing pain management practices in intensive care unit in a standardized clinical simulation setting.

    Science.gov (United States)

    Gosselin, Emilie; Bourgault, Patricia; Lavoie, Stephan; Coleman, Robin-Marie; Méziat-Burdin, Anne

    2014-12-01

    Pain management in the intensive care unit is often inadequate, and there was no tool available to assess nursing pain management practices. The aim of this study was to develop and validate a measuring tool to assess nursing pain management in the intensive care unit during standardized clinical simulation. A literature review was performed to identify relevant components demonstrating optimal pain management in adult intensive care units and to integrate them into an observation tool. This tool was submitted to an expert panel and pretested. It was then used to assess pain management practice during 26 discrete standardized clinical simulation sessions with intensive care nurses. The Nursing Observation Tool for Pain Management (NOTPaM) contains 28 statements grouped into 8 categories, which are in turn grouped into 4 dimensions: subjective assessment, objective assessment, interventions, and reassessment. The tool's internal consistency was calculated at a Cronbach's alpha of 0.436 for the whole tool; the alpha varies from 0.328 to 0.518 for each dimension. To evaluate the inter-rater reliability, the intra-class correlation coefficient was used, calculated at 0.751. The NOTPaM can thus be used to assess nurses' pain management in a standardized clinical simulation; it is the first tool created for this purpose. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  13. Study of the photoproduction of the vector meson Φ(1020) and the hyperon Λ(1520) from the production threshold up to a photon energy of 2.65 GeV with SAPHIR

    International Nuclear Information System (INIS)

    Wiegers, B.

    2001-05-01

    The photoproduction of the vector meson φ(1020) and the hyperon Λ(1520) has been measured in the final state pK+K- from threshold up to a photon energy of 2.65 GeV, using the high duty-factor electron accelerator ELSA and the 4π detector system SAPHIR. The t-dependence of φ(1020) production shows an exponential behavior, as expected for diffractive production. s-channel helicity conservation can be seen in the decay angular distribution in the helicity frame. The decay angular distribution in the Gottfried-Jackson frame is not consistent with the exchange of a Pomeron in the t-channel. For the first time, differential cross sections of Λ(1520) photoproduction have been measured from threshold. The production angular distribution and the decay angular distribution in the Gottfried-Jackson frame indicate K* exchange in the t-channel. (orig.)
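
    Both channels above are reconstructed from the same pK+K- final state; written out explicitly (a clarifying note using the standard decay modes, not text from the record itself):

        \gamma p \to p\,\phi(1020), \qquad \phi(1020) \to K^{+}K^{-}
        \gamma p \to K^{+}\Lambda(1520), \qquad \Lambda(1520) \to p\,K^{-}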

  14. Cardio-Thoracic Ratio Is Stable, Reproducible and Has Potential as a Screening Tool for HIV-1 Related Cardiac Disorders in Resource Poor Settings.

    Directory of Open Access Journals (Sweden)

    Hanif Esmail

    Full Text Available Cardiovascular disorders are common in HIV-1 infected persons in Africa and presentation is often insidious. Development of screening algorithms for cardiovascular disorders appropriate to a resource-constrained setting could facilitate timely referral. The cardiothoracic ratio (CTR) on chest radiograph (CXR) has been suggested as a potential screening tool, but little is known about its reproducibility and stability. Our primary aim was to evaluate the stability and the inter-observer variability of CTR in HIV-1 infected outpatients. We further evaluated the prevalence of cardiomegaly (CTR ≥ 0.5) and its relationship with other risk factors in this population. HIV-1 infected participants were identified during screening for a tuberculosis vaccine trial in Khayelitsha, South Africa between August 2011 and April 2012. Participants had a digital posterior-anterior CXR performed as well as history, examination and baseline observations. CXRs were viewed using OsiriX software and CTR calculated using digital callipers. 450 HIV-1-infected adults were evaluated, median age 34 years (IQR 30-40), with a CD4 count of 566/mm3 (IQR 443-724); 70% were on antiretroviral therapy (ART). The prevalence of cardiomegaly was 12.7% (95% C.I. 9.6%-15.8%). CTR was calculated by a 2nd reader for 113 participants; measurements were highly correlated, r = 0.95 (95% C.I. 0.93-0.97), and agreement on cardiomegaly was substantial, κ = 0.78 (95% C.I. 0.61-0.95). CXRs were repeated in 51 participants at 4-12 weeks; CTR measurements between the 2 time points were highly correlated, r = 0.77 (95% C.I. 0.68-0.88), and agreement on cardiomegaly was excellent, κ = 0.92 (95% C.I. 0.77-1). Participants with cardiomegaly had a higher median BMI (31.3; IQR 27.4-37.4 versus 26.9; IQR 23.2-32.4; p<0.0001) and median systolic blood pressure (130; IQR 121-141 versus 125; IQR 117-135; p = 0.01). CTR is a robust measurement, stable over time, with substantial inter-observer agreement. A prospective study evaluating the utility of CXR to
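
    The screening measure itself is a one-line computation; as a trivial sketch (widths fabricated), the cardiothoracic ratio and the cardiomegaly cut-off used above:

        # CTR = maximal horizontal cardiac width / maximal internal thoracic
        # width on a PA chest radiograph; cardiomegaly flagged at CTR >= 0.5.
        def ctr(cardiac_width_mm, thoracic_width_mm):
            return cardiac_width_mm / thoracic_width_mm

        value = ctr(152.0, 289.0)  # fabricated calliper measurements in mm
        print(f"CTR = {value:.2f}, cardiomegaly: {value >= 0.5}")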

  15. Combining multi-criteria decision analysis and mini-health technology assessment: A funding decision-support tool for medical devices in a university hospital setting.

    Science.gov (United States)

    Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle

    2016-02-01

    At the hospital level, decisions about purchasing new and often expensive medical devices must take multiple criteria into account simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, a notable example of which is the Matrix4value model. Our objective was to develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals, known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria, with weights, for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.
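
    MCDA scores of the kind the IDA tool produces are typically weighted sums of criterion ratings; a hypothetical sketch follows (the criteria names, weights, and ratings below are invented and are not the published IDA criteria):

        # Weighted-sum MCDA score for one category (e.g. "value"); weights come
        # from pairwise-comparison elicitation and must sum to 1.
        VALUE_WEIGHTS = {"clinical_benefit": 0.40, "patient_safety": 0.25,
                         "evidence_quality": 0.20, "organisational_impact": 0.15}

        def weighted_score(ratings, weights):
            assert abs(sum(weights.values()) - 1.0) < 1e-9
            return sum(weights[c] * ratings[c] for c in weights)  # ratings 0-100

        device = {"clinical_benefit": 60, "patient_safety": 80,
                  "evidence_quality": 40, "organisational_impact": 70}
        print(weighted_score(device, VALUE_WEIGHTS))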

  16. How many research nurses for how many clinical trials in an oncology setting? Definition of the Nursing Time Required by Clinical Trial-Assessment Tool (NTRCT-AT).

    Science.gov (United States)

    Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa

    2017-02-01

    Few resources are available to quantify the workload associated with clinical trials, which is needed to guide staffing and budgetary planning. The aim of this study is to describe a tool that measures clinical trials nurses' workload, expressed as the time spent completing core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses on each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times for the specified activities with mathematical functions to return the total estimated time required of a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool that describes the activities of a clinical trials nurse and estimates the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for estimating clinical trials nursing workload internationally. © 2016 John Wiley & Sons Australia, Ltd.
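
    The tool's arithmetic is a sum of average activity times weighted by how often each activity occurs in a given protocol; a sketch with invented activities and minutes (the real instrument times 30 core activities):

        # Estimated nursing minutes for one trial = sum over core activities of
        # (average minutes per activity) x (expected occurrences in the protocol).
        AVG_MINUTES = {"consent_visit": 45, "randomisation": 20,
                       "blood_sampling": 15, "data_entry_per_visit": 25}

        def trial_nursing_minutes(occurrences):
            return sum(AVG_MINUTES[a] * n for a, n in occurrences.items())

        protocol = {"consent_visit": 1, "randomisation": 1,
                    "blood_sampling": 6, "data_entry_per_visit": 8}
        print(trial_nursing_minutes(protocol), "minutes per enrolled patient")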

  17. Development of a screening tool predicting the transition from acute to chronic low back pain for patients in a GP setting: Protocol of a multinational prospective cohort study

    Directory of Open Access Journals (Sweden)

    Bajracharya Suraj

    2008-12-01

    Full Text Available Abstract Background Low back pain (LBP) is by far the most prevalent and costly musculoskeletal problem in our society today. Following the recommendations of the Multinational Musculoskeletal Inception Cohort Study (MMICS) Statement, our study aims to define outcome assessment tools for patients with acute LBP and the time point at which chronic LBP becomes manifest, and to identify patient characteristics which increase the risk of chronicity. Methods Patients with acute LBP will be recruited from clinics of general practitioners (GPs) in New Zealand (NZ) and Switzerland (CH). They will be assessed by postal survey at baseline and at 3, 6, and 12 weeks and 6 months follow-up. The primary outcome will be disability as measured by the Oswestry Disability Index (ODI); key secondary endpoints will be general health as measured by the acute SF-12 and pain as measured on the Visual Analogue Scale (VAS). A subgroup analysis of different assessment instruments and baseline characteristics will be performed using multiple linear regression models. This study aims to examine: (1) which biomedical, psychological, social, and occupational outcome assessment tools are identifiers for the transition from acute to chronic LBP and at which time point this transition becomes manifest; (2) which psychosocial and occupational baseline characteristics, such as work status and period of work absenteeism, influence the course from acute to chronic LBP; and (3) differences in outcome assessment tools and baseline characteristics of patients in NZ compared with CH. Discussion This study will develop a screening tool for patients with acute LBP to be used in GP clinics to assess the risk of developing chronic LBP. In addition, biomedical, psychological, social, and occupational patient characteristics which influence the course from acute to chronic LBP will be identified. Furthermore, an appropriate time point for follow-ups will be given to detect this transition. The generalizability of our

  18. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces with different access privileges: the Research Group Space (a wiki for each individual research group), the Learning Resources Centre devoted to the library, the News Space, the Forum, and the Repositories. The Internet, and most notably the Web 2.0 movement, is introducing far-reaching changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  19. Single-item measures for depression and anxiety: Validation of the Screening Tool for Psychological Distress in an inpatient cardiology setting.

    Science.gov (United States)

    Young, Quincy-Robyn; Nguyen, Michelle; Roth, Susan; Broadberry, Ann; Mackay, Martha H

    2015-12-01

    Depression and anxiety are common among patients with cardiovascular disease (CVD) and confer significant cardiac risk, contributing to CVD morbidity and mortality. Unfortunately, owing to the lack of screening tools that address the specific needs of hospitalized patients, few cardiac inpatient programs offer routine screening for these forms of psychological distress, despite recommendations to do so. The purpose of this study was to validate single-item measures for depression and anxiety among cardiac inpatients. Consecutive inpatients were recruited from the cardiology and cardiac surgery step-down units at a university-affiliated, quaternary-care hospital. Subjects completed a questionnaire that included: (a) demographics, (b) single-item measures for depression and anxiety (from the Screening Tool for Psychological Distress (STOP-D)), and (c) the Hospital Anxiety and Depression Scale (HADS). One hundred and five participants with a wide variety of cardiac diagnoses were recruited; mean age was 66 years, and 28% were women. Both STOP-D items were highly correlated with their corresponding validated measures and demonstrated robust receiver-operator characteristic curves. Severity scores on both items correlated well with established severity cut-off scores on the corresponding subscales of the HADS. The STOP-D is a self-administered, self-report measure using two independent items that provide severity scores for depression and anxiety. The tool performs very well compared with other previously validated measures. Requiring no additional scoring and being free, the STOP-D offers a simple and valid method for identifying hospitalized cardiac patients who are experiencing psychological distress. This crucial first step triggers initiation of appropriate monitoring and intervention, thus reducing the likelihood of the adverse cardiac outcomes associated with psychological distress. © The European Society of Cardiology 2014.

  20. Predictive Validity of the STarT Back Tool for Risk of Persistent Disabling Back Pain in a U.S. Primary Care Setting.

    Science.gov (United States)

    Suri, Pradeep; Delaney, Kristin; Rundell, Sean D; Cherkin, Daniel C

    2018-04-03

    To examine the predictive validity of the Subgrouping for Targeted Treatment (STarT Back) tool for classifying people with back pain into categories of low, medium, and high risk of persistent disabling back pain in U.S. primary care. Secondary analysis of data from participants receiving usual care in a randomized clinical trial. Primary care clinics. Adults (N = 1109) ≥18 years of age with back pain. Those with specific causes of back pain (pregnancy, disc herniation, vertebral fracture, spinal stenosis) and work-related injuries were not included. Not applicable. The original 9-item version of the STarT Back tool, administered at baseline, stratified patients by their risk (low, medium, high) of persistent disabling back pain (STarT Back risk group). Persistent disabling back pain was defined as Roland-Morris Disability Questionnaire scores of ≥7 at 6-month follow-up. The STarT Back risk group was a significant predictor of persistent disabling back pain. The STarT Back risk groups successfully separated people with back pain into distinct categories of risk for persistent disabling back pain at 6-month follow-up in U.S. primary care. These results were very similar to those in the original STarT Back validation study. This validation study is a necessary first step toward identifying whether the entire STarT Back approach, including matched/targeted treatment, can be effectively used for primary care in the United States. Published by Elsevier Inc.

  1. The use and effectiveness of electronic clinical decision support tools in the ambulatory/primary care setting: a systematic review of the literature

    Directory of Open Access Journals (Sweden)

    Cathy Bryan

    2008-07-01

    Conclusion: Although there is evidence that CDSS has the potential to produce statistically significant improvements in outcomes, there is much variability among the types and methods of CDSS implementation and resulting effectiveness. As CDSS will likely continue to be at the forefront of the march toward effective standards-based care, more work needs to be done to determine effective implementation strategies for the use of CDSS across multiple settings and patient populations.

  2. Can the ICF osteoarthritis core set represent a future clinical tool in measuring functioning in persons with osteoarthritis undergoing hip and knee joint replacement?

    Science.gov (United States)

    Alviar, Maria Jenelyn; Olver, John; Pallant, Julie F; Brand, Caroline; de Steiger, Richard; Pirpiris, Marinis; Bucknill, Andrew; Khan, Fary

    2012-11-01

    To determine the dimensionality, reliability, model fit, adequacy of the qualifier levels, response patterns across different factors, and targeting of the International Classification of Functioning, Disability and Health (ICF) osteoarthritis core set categories in people with osteoarthritis undergoing hip and knee arthroplasty. The osteoarthritis core set was rated in 316 persons with osteoarthritis who were either in the pre-operative stage or within one year post-operative. Rasch analyses were performed using the RUMM 2030 program. Twelve of the 13 body functions categories and 13 of the 19 activity and participation categories had good model fit. The qualifiers displayed disordered thresholds necessitating rescoring. There was uneven spread of ICF categories across the full range of the patients' scores, indicating off-targeting. Subtest analysis of the reduced ICF categories of body functions and activity and participation showed that the two components could be integrated to form one measure. The results suggest that it is possible to measure functioning using a unidimensional construct based on ICF osteoarthritis core set categories of body functions and activity and participation in this population. However, omission of some categories and reduction in qualifier levels are necessary. Further studies are needed to determine whether better targeting is achieved, particularly during the pre-operative and sub-acute care periods.
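
    For readers unfamiliar with the method, Rasch analysis fits item responses to a unidimensional latent-trait model. As a reminder of the standard formulation (not taken from the paper itself), the dichotomous model and its polytomous extension for ordered qualifier levels are:

        % Dichotomous Rasch model: person ability \theta_n, item difficulty \delta_i
        P(X_{ni} = 1) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}

        % Polytomous (partial credit) form for qualifier levels k = 0, ..., m_i,
        % with item thresholds \tau_{ij} and the convention \tau_{i0} = 0.
        % "Disordered thresholds" means the \tau_{ij} do not increase
        % monotonically in j, which is what motivates rescoring.
        P(X_{ni} = k) = \frac{\exp \sum_{j=0}^{k} (\theta_n - \delta_i - \tau_{ij})}
                             {\sum_{l=0}^{m_i} \exp \sum_{j=0}^{l} (\theta_n - \delta_i - \tau_{ij})}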

  3. The use and impact of quality of life assessment tools in clinical care settings for cancer patients, with a particular emphasis on brain cancer: insights from a systematic review and stakeholder consultations.

    Science.gov (United States)

    King, Sarah; Exley, Josephine; Parks, Sarah; Ball, Sarah; Bienkowska-Gibbs, Teresa; MacLure, Calum; Harte, Emma; Stewart, Katherine; Larkin, Jody; Bottomley, Andrew; Marjanovic, Sonja

    2016-09-01

    Patient-reported data are playing an increasing role in health care. In oncology, data from quality of life (QoL) assessment tools may be particularly important for those with limited survival prospects, where treatments aim to prolong survival while maintaining or improving QoL. This paper examines the use and impact of QoL measures on the health care of cancer patients within a clinical setting, particularly those with brain cancer. It also examines facilitators and challenges, and provides implications for policy and practice. We conducted a systematic literature review, 15 expert interviews and a consultation at an international summit. The systematic review found no relevant intervention studies specifically in brain cancer patients, and after expanding our search to include other cancers, 15 relevant studies were identified. The evidence on the effectiveness of using QoL tools was inconsistent for patient management, but somewhat more consistent in favour of improving patient-physician communication. Interviews identified unharnessed potential and growing interest in QoL tool use and associated challenges to address. Our findings suggest that the use of QoL tools in cancer patients may improve patient-physician communication and have the potential to improve care, but the tools are not currently widely used in clinical practice (whether in brain cancer or other cancer contexts), although they are used in clinical trials. There is a need for further research and stakeholder engagement on how QoL tools can achieve the most impact across cancer and patient contexts. There is also a need for policy, health professional, research and patient communities to strengthen information exchange and debate, support awareness raising and provide training on tool design, use and interpretation.

  4. Effect of an interactive E-learning tool for delirium on patient and nursing outcomes in a geriatric hospital setting: findings of a before-after study.

    Science.gov (United States)

    Detroyer, Elke; Dobbels, Fabienne; Teodorczuk, Andrew; Deschodt, Mieke; Depaifve, Yves; Joosten, Etienne; Milisen, Koen

    2018-01-19

    Education of healthcare workers is a core element of multicomponent delirium strategies to improve delirium care and, consequently, patient outcomes. However, traditional educational strategies are notoriously difficult to implement. E-learning is hypothesised to be easier and more cost-effective, but research evaluating the effectiveness of delirium education through e-learning is scarce at present. The aim is to determine the effect of a nursing e-learning tool for delirium on: (1) in-hospital prevalence, duration and severity of delirium or mortality in hospitalized geriatric patients, and (2) geriatric nurses' knowledge and recognition regarding delirium. A before-after study in a sample of patients enrolled pre-intervention (non-intervention cohort (NIC); n = 81) and post-intervention (intervention cohort (IC); n = 79), and nurses (n = 17) of a geriatric ward (university hospital). The intervention included an information session about using the e-learning tool, which consisted of 11 e-modules incorporating development of knowledge and skills in the prevention, detection and management of delirium, and the completion of a delirium e-learning tool during a three-month period. Key patient outcomes included in-hospital prevalence and duration of delirium (Confusion Assessment Method), delirium severity (Delirium Index) and mortality (in-hospital; 12 months post-admission); key nurse outcomes included delirium knowledge (Delirium Knowledge Questionnaire) and recognition (case vignettes). Logistic regression and linear mixed models were used to analyse patient data; Wilcoxon Signed Rank tests, McNemar's or paired t-tests for nursing data. No significant difference was found between the IC and NIC for in-hospital prevalence (21.5% versus 25.9%; p = 0.51) and duration of delirium (mean 4.2 ± SD 4.8 days versus 4.9 ± SD 4.8 days; p = 0.38). A trend towards a statistically significant lower delirium severity (IC versus NIC: difference estimate

  5. The Electronic Patient Reported Outcome Tool: Testing Usability and Feasibility of a Mobile App and Portal to Support Care for Patients With Complex Chronic Disease and Disability in Primary Care Settings

    Science.gov (United States)

    Gill, Ashlinder; Khan, Anum Irfan; Hans, Parminder Kaur; Kuluski, Kerry; Cott, Cheryl

    2016-01-01

    Background People experiencing complex chronic disease and disability (CCDD) face some of the greatest challenges of any patient population. Primary care providers find it difficult to manage multiple discordant conditions and symptoms and often complex social challenges experienced by these patients. The electronic Patient Reported Outcome (ePRO) tool is designed to overcome some of these challenges by supporting goal-oriented primary care delivery. Using the tool, patients and providers collaboratively develop health care goals on a portal linked to a mobile device to help patients and providers track progress between visits. Objectives This study tested the usability and feasibility of adopting the ePRO tool into a single interdisciplinary primary health care practice in Toronto, Canada. The Fit between Individuals, Task, and Technology (FITT) framework was used to guide our assessment and explore whether the ePRO tool is: (1) feasible for adoption in interdisciplinary primary health care practices and (2) usable from both the patient and provider perspectives. This usability pilot is part of a broader user-centered design development strategy. Methods A 4-week pilot study was conducted in which patients and providers used the ePRO tool to develop health-related goals, which patients then monitored using a mobile device. Patients and providers collaboratively set goals using the system during an initial visit and had at least 1 follow-up visit at the end of the pilot to discuss progress. Focus groups and interviews were conducted with patients and providers to capture usability and feasibility measures. Data from the ePRO system were extracted to provide information regarding tool usage. Results Six providers and 11 patients participated in the study; 3 patients dropped out mainly owing to health issues. The remaining 8 patients completed 210 monitoring protocols, equal to over 1300 questions, with patients often answering questions daily. Providers and patients

  6. How to sell a condom? The impact of demand creation tools on male and female condom sales in resource limited settings.

    Science.gov (United States)

    Terris-Prestholt, Fern; Windmeijer, Frank

    2016-07-01

    Despite condoms being cheap and effective in preventing HIV, there remains an 8 billion shortfall in condom use in risky sex acts. Social marketing organisations apply private sector marketing approaches to sell public health products. This paper investigates the impact of marketing tools, including promotion and pricing, on demand for male and female condoms in 52 countries between 1997 and 2009. A static model differentiates drivers of demand between products, while a dynamic panel data estimator estimates their short- and long-run impacts. Products are not equally affected: female condoms are not affected by advertising, but are highly affected by interpersonal communication and HIV prevalence. Price and promotion have significant short- and long-run effects, with female condoms far more sensitive to price than male condoms. The design of optimal distribution strategies for new and existing HIV prevention technologies must consider both product and target population characteristics. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
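
    The "dynamic panel data estimator" referred to models current sales as a function of their own lag, which is what separates short- from long-run effects. In generic form (an illustrative specification, not necessarily the paper's exact model):

        % Dynamic panel model of condom sales y_{it} in country i and year t
        y_{it} = \alpha\, y_{i,t-1} + \beta' x_{it} + \mu_i + \varepsilon_{it}
        % x_{it}: marketing inputs such as price, promotion and interpersonal
        % communication; \mu_i: country fixed effect. The short-run effect of
        % regressor x_k is \beta_k; its long-run effect is \beta_k / (1 - \alpha).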

  7. Development of the Human Factors Skills for Healthcare Instrument: a valid and reliable tool for assessing interprofessional learning across healthcare practice settings.

    Science.gov (United States)

    Reedy, Gabriel B; Lavelle, Mary; Simpson, Thomas; Anderson, Janet E

    2017-10-01

    A central feature of clinical simulation training is human factors skills, providing staff with the social and cognitive skills to cope with demanding clinical situations. Although these skills are critical to safe patient care, assessing their learning is challenging. This study aimed to develop, pilot and evaluate a valid and reliable structured instrument to assess human factors skills, which can be used pre- and post-simulation training, and is relevant across a range of healthcare professions. Through consultation with a multi-professional expert group, we developed and piloted a 39-item survey with 272 healthcare professionals attending training courses across two large simulation centres in London, one specialising in acute care and one in mental health, both serving healthcare professionals working across acute and community settings. Following psychometric evaluation, the final 12-item instrument was evaluated with a second sample of 711 trainees. Exploratory factor analysis revealed a 12-item, one-factor solution with good internal consistency (α=0.92). The instrument had discriminant validity, with newly qualified trainees scoring significantly lower than experienced trainees (t(98)=4.88). The Human Factors Skills for Healthcare Instrument provides a reliable and valid method of assessing trainees' human factors skills self-efficacy across acute and mental health settings. This instrument has the potential to improve the assessment and evaluation of human factors skills learning in both uniprofessional and interprofessional clinical simulation training.
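
    The internal-consistency figure (α = 0.92) is Cronbach's alpha. A minimal sketch of its computation from an item-response matrix, using simulated responses rather than the study's data:

        # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)
        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: (n_respondents, k_items) matrix of item scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars / total_var)

        rng = np.random.default_rng(1)
        trait = rng.normal(size=(711, 1))                  # shared latent skill
        responses = trait + rng.normal(scale=0.7, size=(711, 12))
        print(f"alpha = {cronbach_alpha(responses):.2f}")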

  8. Health system context and implementation of evidence-based practices - development and validation of the Context Assessment for Community Health (COACH) tool for low- and middle-income settings.

    Science.gov (United States)

    Bergström, Anna; Skeen, Sarah; Duc, Duong M; Blandon, Elmer Zelaya; Estabrooks, Carole; Gustavsson, Petter; Hoa, Dinh Thi Phuong; Källestål, Carina; Målqvist, Mats; Nga, Nguyen Thu; Persson, Lars-Åke; Pervin, Jesmin; Peterson, Stefan; Rahman, Anisur; Selling, Katarina; Squires, Janet E; Tomlinson, Mark; Waiswa, Peter; Wallin, Lars

    2015-08-15

    The gap between what is known and what is practiced results in health service users not benefitting from advances in healthcare, and in unnecessary costs. A supportive context is considered a key element for successful implementation of evidence-based practices (EBP). There were no tools available for the systematic mapping of aspects of organizational context influencing the implementation of EBPs in low- and middle-income countries (LMICs). Thus, this project aimed to develop and psychometrically validate a tool for this purpose. The development of the Context Assessment for Community Health (COACH) tool was premised on the context dimension in the Promoting Action on Research Implementation in Health Services framework, and is a derivative product of the Alberta Context Tool. Its development was undertaken in Bangladesh, Vietnam, Uganda, South Africa and Nicaragua in six phases: (1) defining dimensions and draft tool development, (2) content validity amongst in-country expert panels, (3) content validity amongst international experts, (4) response process validity, (5) translation and (6) evaluation of psychometric properties amongst 690 health workers in the five countries. The tool was validated for use amongst physicians, nurse/midwives and community health workers. The six phases of development resulted in a good fit between the theoretical dimensions of the COACH tool and its psychometric properties. The tool has 49 items measuring eight aspects of context: Resources, Community engagement, Commitment to work, Informal payment, Leadership, Work culture, Monitoring services for action and Sources of knowledge. Aspects of organizational context that were identified as influencing the implementation of EBPs in high-income settings were also found to be relevant in LMICs. However, there were additional aspects of context of relevance in LMICs, specifically Resources, Community engagement, Commitment to work and Informal payment. Use of the COACH tool will allow

  9. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    Science.gov (United States)

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

    There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  10. Exploiting biospectroscopy as a novel screening tool for cervical cancer: towards a framework to validate its accuracy in a routine clinical setting.

    LENUS (Irish Health Repository)

    Purandare, Nikhil C

    2013-11-01

    Biospectroscopy is an emerging field that harnesses the platform of physical sciences with computational analysis in order to shed novel insights on biological questions. An area where this approach seems to have potential is in screening or diagnostic clinical settings, where there is an urgent need for new approaches to interrogate large numbers of samples in an objective fashion with acceptable levels of sensitivity and specificity. This review outlines the benefits of biospectroscopy in screening for precancer lesions of the cervix due to its ability to separate different grades of dysplasia. It evaluates the feasibility of introducing this technique into cervical screening programs on the basis of its ability to identify biomarkers of progression within derived spectra ('biochemical-cell fingerprints').

  11. Rationale and study protocol for a multi-component Health Information Technology (HIT) screening tool for depression and post-traumatic stress disorder in the primary care setting.

    Science.gov (United States)

    Biegler, Kelly; Mollica, Richard; Sim, Susan Elliott; Nicholas, Elisa; Chandler, Maria; Ngo-Metzger, Quyen; Paigne, Kittya; Paigne, Sompia; Nguyen, Danh V; Sorkin, Dara H

    2016-09-01

    The prevalence rate of depression in primary care is high. Primary care providers serve as the initial point of contact for the majority of patients with depression, yet, approximately 50% of cases remain unrecognized. The under-diagnosis of depression may be further exacerbated in limited English-language proficient (LEP) populations. Language barriers may result in less discussion of patients' mental health needs and fewer referrals to mental health services, particularly given competing priorities of other medical conditions and providers' time pressures. Recent advances in Health Information Technology (HIT) may facilitate novel ways to screen for depression and other mental health disorders in LEP populations. The purpose of this paper is to describe the rationale and protocol of a clustered randomized controlled trial that will test the effectiveness of an HIT intervention that provides a multi-component approach to delivering culturally competent, mental health care in the primary care setting. The HIT intervention has four components: 1) web-based provider training, 2) multimedia electronic screening of depression and PTSD in the patients' primary language, 3) Computer generated risk assessment scores delivered directly to the provider, and 4) clinical decision support. The outcomes of the study include assessing the potential of the HIT intervention to improve screening rates, clinical detection, provider initiation of treatment, and patient outcomes for depression and post-traumatic stress disorder (PTSD) among LEP Cambodian refugees who experienced war atrocities and trauma during the Khmer Rouge. This technology has the potential to be adapted to any LEP population in order to facilitate mental health screening and treatment in the primary care setting. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Cogena, a novel tool for co-expressed gene-set enrichment analysis, applied to drug repositioning and drug mode of action discovery.

    Science.gov (United States)

    Jia, Zhilong; Liu, Ying; Guan, Naiyang; Bo, Xiaochen; Luo, Zhigang; Barnes, Michael R

    2016-05-27

    Drug repositioning, finding new indications for existing drugs, has gained much recent attention as a potentially efficient and economical strategy for accelerating new therapies into the clinic. Although improvement in the sensitivity of computational drug repositioning methods has identified numerous credible repositioning opportunities, few have been progressed. Arguably the "black box" nature of drug action in a new indication is one of the main blocks to progression, highlighting the need for methods that inform on the broader target mechanism in the disease context. We demonstrate that the analysis of co-expressed genes may be a critical first step towards illumination of both disease pathology and mode of drug action. We achieve this using a novel framework, co-expressed gene-set enrichment analysis (cogena) for co-expression analysis of gene expression signatures and gene set enrichment analysis of co-expressed genes. The cogena framework enables simultaneous, pathway driven, disease and drug repositioning analysis. Cogena can be used to illuminate coordinated changes within disease transcriptomes and identify drugs acting mechanistically within this framework. We illustrate this using a psoriatic skin transcriptome, as an exemplar, and recover two widely used Psoriasis drugs (Methotrexate and Ciclosporin) with distinct modes of action. Cogena out-performs the results of Connectivity Map and NFFinder webservers in similar disease transcriptome analyses. Furthermore, we investigated the literature support for the other top-ranked compounds to treat psoriasis and showed how the outputs of cogena analysis can contribute new insight to support the progression of drugs into the clinic. We have made cogena freely available within Bioconductor or https://github.com/zhilongjia/cogena . In conclusion, by targeting co-expressed genes within disease transcriptomes, cogena offers novel biological insight, which can be effectively harnessed for drug discovery and
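
    cogena itself is an R/Bioconductor package, but its two core steps can be sketched in Python to make the pipeline concrete: cluster a differential-expression signature into co-expressed gene groups, then test each cluster for over-representation of a gene set (for example, a drug signature) with a hypergeometric test. All data below are simulated and the details are illustrative, not the package's implementation:

        # Schematic analogue of co-expressed gene-set enrichment analysis:
        # (1) cluster genes by expression profile, (2) hypergeometric
        # enrichment of a gene set within each cluster.
        import numpy as np
        from scipy.stats import hypergeom
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        n_genes, n_samples = 500, 20
        genes = [f"g{i}" for i in range(n_genes)]
        expr = rng.normal(size=(n_genes, n_samples))   # genes x samples signature

        labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(expr)

        gene_set = set(rng.choice(genes, size=40, replace=False))  # e.g. drug signature
        for c in range(5):
            cluster = {g for g, l in zip(genes, labels) if l == c}
            k = len(cluster & gene_set)
            # P(overlap >= k) when |cluster| genes are drawn at random from n_genes
            p = hypergeom.sf(k - 1, n_genes, len(gene_set), len(cluster))
            print(f"cluster {c}: size={len(cluster)}, overlap={k}, p={p:.3g}")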

  13. Rationale and Study Protocol for a Multi-component Health Information Technology (HIT) Screening Tool for Depression and Post-traumatic Stress Disorder in the Primary Care Setting

    Science.gov (United States)

    Biegler, Kelly; Mollica, Richard; Sim, Susan Elliott; Nicholas, Elisa; Chandler, Maria; Ngo-Metzger, Quyen; Paigne, Kittya; Paigne, Sompia; Nguyen, Danh V.; Sorkin, Dara H.

    2016-01-01

    The prevalence rate of depression in primary care is high. Primary care providers serve as the initial point of contact for the majority of patients with depression, yet, approximately 50% of cases remain unrecognized. The under-diagnosis of depression may be further exacerbated in limited English-language proficient (LEP) populations. Language barriers may result in less discussion of patients’ mental health needs and fewer referrals to mental health services, particularly given competing priorities of other medical conditions and providers’ time pressures. Recent advances in Health Information Technology (HIT) may facilitate novel ways to screen for depression in LEP populations. The purpose of this paper is to describe the rationale and protocol of a clustered-randomized controlled trial that will test the effectiveness of an HIT intervention that provides a multi-component approach to delivering culturally competent, mental health care in the primary care setting. The HIT intervention has four components: 1) web-based provider training, 2) multimedia electronic screening of depression and PTSD in the patients’ primary language, 3) Computer generated risk assessment scores delivered directly to the provider, and 4) clinical decision support. The outcomes of the study include assessing the potential of the HIT intervention to improve screening rates, clinical detection, provider initiation of treatment, and patient outcomes for depression and PTSD among LEP Cambodian refugees who experienced war atrocities and trauma during the Khmer Rouge. This technology has the potential to be adapted to any LEP population in order to facilitate mental health screening and treatment in the primary care setting. PMID:27394385

  14. Measurement of the reactions γp→K+Λ and γp→K+Σ0 for photon energies up to 2.6 GeV with the SAPHIR detector at ELSA

    International Nuclear Information System (INIS)

    Glander, K.H.

    2003-02-01

    The reactions γp→K+Λ and γp→K+Σ0 were measured in the energy range from threshold up to a photon energy of 2.6 GeV. The data were taken with the SAPHIR detector at the electron stretcher facility ELSA. Results on cross sections and hyperon polarizations are presented as a function of kaon production angle and photon energy. The total cross section for Λ production shows a strong threshold enhancement, whereas the Σ0 data have a maximum at about Eγ = 1.45 GeV. Cross sections together with their angular decompositions into Legendre polynomials suggest contributions from resonance production for both reactions. The K+Λ differential cross section is enhanced for backward-produced kaons at Eγ ∼ 1.45 GeV. This might be interpreted as a contribution of a so-called missing resonance D13(1895). In general, the induced polarization of Λ has negative values in the kaon forward direction and positive values in the backward direction. The magnitude varies with energy. The polarization of Σ0 follows a similar angular and energy dependence as that of Λ, but with opposite sign. (orig.)

  15. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    Energy Technology Data Exchange (ETDEWEB)

    Cormier, Dallas [San Diego Gas & Electric, CA (United States); Edra, Sherwin [San Diego Gas & Electric, CA (United States); Espinoza, Michael [San Diego Gas & Electric, CA (United States); Daye, Tony [Green Power Labs, San Diego, CA (United States); Kostylev, Vladimir [Green Power Labs, San Diego, CA (United States); Pavlovski, Alexandre [Green Power Labs, San Diego, CA (United States); Jelen, Deborah [Electricore, Inc., Valencia, CA (United States)

    2014-12-29

    This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with the emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate operational cost savings for Energy Marketing for day-ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for day-ahead), Demand Response, long-term planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  16. Cardiovascular Disease Population Risk Tool (CVDPoRT): predictive algorithm for assessing CVD risk in the community setting. A study protocol.

    Science.gov (United States)

    Taljaard, Monica; Tuna, Meltem; Bennett, Carol; Perez, Richard; Rosella, Laura; Tu, Jack V; Sanmartin, Claudia; Hennessy, Deirdre; Tanuseputro, Peter; Lebenbaum, Michael; Manuel, Douglas G

    2014-10-23

    Recent publications have called for substantial improvements in the design, conduct, analysis and reporting of prediction models. Publication of study protocols, with prespecification of key aspects of the analysis plan, can help to improve transparency, increase quality and protect against increased type I error. Valid population-based risk algorithms are essential for population health planning and policy decision-making. The purpose of this study is to develop, evaluate and apply cardiovascular disease (CVD) risk algorithms for the population setting. The Ontario sample of the Canadian Community Health Survey (2001, 2003, 2005; 77,251 respondents) will be used to assess risk factors focusing on health behaviours (physical activity, diet, smoking and alcohol use). Incident CVD outcomes will be assessed through linkage to administrative healthcare databases (619,886 person-years of follow-up until 31 December 2011). Sociodemographic factors (age, sex, immigrant status, education) and mediating factors such as presence of diabetes and hypertension will be included as predictors. Algorithms will be developed using competing risks survival analysis. The analysis plan adheres to published recommendations for the development of valid prediction models to limit the risk of overfitting and improve the quality of predictions. Key considerations are fully prespecifying the predictor variables; appropriate handling of missing data; use of flexible functions for continuous predictors; and avoiding data-driven variable selection procedures. The 2007 and 2009 surveys (approximately 50,000 respondents) will be used for validation. Calibration will be assessed overall and in predefined subgroups of importance to clinicians and policymakers. This study has been approved by the Ottawa Health Science Network Research Ethics Board. The findings will be disseminated through professional and scientific conferences, and in peer-reviewed journals. The algorithm will be accessible
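
    The protocol's two headline performance criteria, discrimination and calibration, are easy to make concrete. A minimal sketch with simulated risks follows; the study itself uses competing-risks survival models, which this binary simplification deliberately ignores.

        # Discrimination (c-statistic) and calibration (observed vs predicted
        # risk by decile) for a simulated risk algorithm.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        pred = rng.uniform(0.0, 0.4, 50_000)          # predicted CVD risk
        outcome = rng.binomial(1, pred)               # simulated observed events

        print(f"c-statistic = {roc_auc_score(outcome, pred):.3f}")

        edges = np.quantile(pred, np.linspace(0, 1, 11))
        for lo, hi in zip(edges[:-1], edges[1:]):
            m = (pred >= lo) & (pred <= hi)           # boundary overlap is negligible
            print(f"predicted {pred[m].mean():.3f}  observed {outcome[m].mean():.3f}")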

  17. HEDIS Limited Data Set

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Healthcare Effectiveness Data and Information Set (HEDIS) is a tool used by more than 90 percent of America's health plans to measure performance on important...

  18. A systematic review and meta-analysis of the criterion validity of nutrition assessment tools for diagnosing protein-energy malnutrition in the older community setting (the MACRo study).

    Science.gov (United States)

    Marshall, Skye; Craven, Dana; Kelly, Jaimon; Isenring, Elizabeth

    2017-10-12

    Malnutrition is a significant barrier to healthy and independent ageing in older adults who live in their own homes, and accurate diagnosis is a key step in managing the condition. However, there has not been sufficient systematic review or pooling of existing data regarding malnutrition diagnosis in the geriatric community setting. The current paper was conducted as part of the MACRo (Malnutrition in the Ageing Community Review) Study and seeks to determine the criterion (concurrent and predictive) validity and reliability of nutrition assessment tools in making a diagnosis of protein-energy malnutrition in the general older adult community. A systematic literature review was undertaken using six electronic databases in September 2016. Studies in any language were included which measured malnutrition via a nutrition assessment tool in adults ≥65 years living in their own homes. Data relating to the predictive validity of tools were analysed via meta-analyses. GRADE was used to evaluate the body of evidence. There were 6412 records identified, of which 104 potentially eligible records were screened via full text. Eight papers were included; two which evaluated the concurrent validity of the Mini Nutritional Assessment (MNA) and Subjective Global Assessment (SGA) and six which evaluated the predictive validity of the MNA. The quality of the body of evidence for the concurrent validity of both the MNA and SGA was very low. The quality of the body of evidence for the predictive validity of the MNA in detecting risk of death was moderate (RR: 1.92 [95% CI: 1.55-2.39]; P < 0.00001; n = 2013 participants; n = 4 studies; I² = 0%). The quality of the body of evidence for the predictive validity of the MNA in detecting risk of poor physical function was very low (SMD: 1.02 [95% CI: 0.24-1.80]; P = 0.01; n = 4046 participants; n = 3 studies; I² = 89%). Due to the small number of studies identified and no evaluation of the predictive validity of tools other than
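
    Pooled estimates such as the RR of 1.92 [1.55-2.39] with I² = 0% come from inverse-variance meta-analysis on the log scale. A minimal sketch of that machinery, with hypothetical study-level inputs rather than the included studies' values:

        # Fixed-effect inverse-variance pooling of risk ratios, plus Cochran's
        # Q and the I^2 heterogeneity statistic.
        import numpy as np

        rr = np.array([1.8, 2.1, 1.7, 2.0])           # per-study risk ratios
        upper = np.array([2.6, 3.2, 2.9, 3.1])        # per-study upper 95% limits

        log_rr = np.log(rr)
        se = (np.log(upper) - log_rr) / 1.96          # SE recovered from CI width
        w = 1.0 / se**2                               # inverse-variance weights
        pooled = np.sum(w * log_rr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))

        q = np.sum(w * (log_rr - pooled) ** 2)        # Cochran's Q
        i2 = max(0.0, (q - (len(rr) - 1)) / q) * 100 if q > 0 else 0.0
        lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
        print(f"pooled RR = {np.exp(pooled):.2f} [{lo:.2f}, {hi:.2f}], I^2 = {i2:.0f}%")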

  19. Counting SET-free sets

    OpenAIRE

    Harman, Nate

    2016-01-01

    We consider the following counting problem related to the card game SET: How many $k$-element SET-free sets are there in an $n$-dimensional SET deck? Through a series of algebraic reformulations and reinterpretations, we show the answer to this question satisfies two polynomiality conditions.
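
    The objects being counted can be checked by brute force for tiny cases: a card is a vector in F_3^n, a SET is any three cards summing to zero coordinatewise mod 3, and a SET-free set is a k-subset of the deck containing no SET. A small sketch (illustrative only; the paper's approach is algebraic, not enumerative):

        # Brute-force count of k-element SET-free subsets of the 3^n-card deck.
        from itertools import combinations, product

        def count_set_free(n: int, k: int) -> int:
            deck = list(product(range(3), repeat=n))
            def is_set(a, b, c):
                return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))
            return sum(
                1
                for hand in combinations(deck, k)
                if not any(is_set(*t) for t in combinations(hand, 3))
            )

        # n=2, k=3: C(9,3)=84 triples minus the 12 lines of AG(2,3) gives 72
        print(count_set_free(2, 3))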

  20. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples
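
    One of the named examples is easy to make concrete. Under one common convention (an assumption here, not the abstract's definition), the two-dimensional Thue-Morse set consists of the points (m, n) whose combined binary digit sum is even; membership is decided by a finite automaton reading the base-2 digits of the coordinates, which is what makes the set 2-automatic. A small sketch:

        # Membership test for a two-dimensional Thue-Morse set (one common
        # convention): (m, n) belongs iff the total binary digit sum is even.
        def digit_sum(x: int, base: int = 2) -> int:
            s = 0
            while x:
                s, x = s + x % base, x // base
            return s

        def in_thue_morse_2d(m: int, n: int) -> bool:
            return (digit_sum(m) + digit_sum(n)) % 2 == 0

        for m in range(8):                  # print a small window of the set
            print("".join("#" if in_thue_morse_2d(m, n) else "." for n in range(16)))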

  1. Children and Young People-Mental Health Safety Assessment Tool (CYP-MH SAT) study: Protocol for the development and psychometric evaluation of an assessment tool to identify immediate risk of self-harm and suicide in children and young people (10–19 years) in acute paediatric hospital settings

    Science.gov (United States)

    Walker, Gemma M; Carter, Tim; Aubeeluck, Aimee; Witchell, Miranda; Coad, Jane

    2018-01-01

    Introduction Currently, no standardised, evidence-based assessment tool for assessing immediate self-harm and suicide in acute paediatric inpatient settings exists. Aim The aim of this study is to develop and test the psychometric properties of an assessment tool that identifies immediate risk of self-harm and suicide in children and young people (10–19 years) in acute paediatric hospital settings. Methods and analysis Development phase: This phase involved a scoping review of the literature to identify and extract items from previously published suicide and self-harm risk assessment scales. Using a modified electronic Delphi approach, these items will then be rated according to their relevance for assessment of immediate suicide or self-harm risk by expert professionals. Inclusion of items will be determined by 65%–70% consensus between raters. Subsequently, a panel of expert members will convene to determine the face validity, appropriate phrasing, item order and response format for the finalised items. Psychometric testing phase: The finalised items will be tested for validity and reliability through a multicentre, psychometric evaluation. Psychometric testing will be undertaken to determine the following: internal consistency, inter-rater reliability, convergent, divergent validity and concurrent validity. Ethics and dissemination Ethical approval was provided by the National Health Service East Midlands—Derby Research Ethics Committee (17/EM/0347) and full governance clearance received by the Health Research Authority and local participating sites. Findings from this study will be disseminated to professionals and the public via peer-reviewed journal publications, popular social media and conference presentations. PMID:29654046

  2. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in 't Veld, M. M.A.; Boogaard, S.A.A. van den

    2007-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation - meetings - by means of a set of

  3. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in't Veld, M.A.A.; Boogaard, S.A.A. van den

    2008-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation - meetings - by means of a set

  4. Algorithm for predicting death among older adults in the home care setting: study protocol for the Risk Evaluation for Support: Predictions for Elder-life in the Community Tool (RESPECT).

    Science.gov (United States)

    Hsu, Amy T; Manuel, Douglas G; Taljaard, Monica; Chalifoux, Mathieu; Bennett, Carol; Costa, Andrew P; Bronskill, Susan; Kobewka, Daniel; Tanuseputro, Peter

    2016-12-01

    Older adults living in the community often have multiple, chronic conditions and functional impairments. A challenge for healthcare providers working in the community is the lack of a predictive tool that can be applied to the broad spectrum of mortality risks observed and may be used to inform care planning. To predict survival time for older adults in the home care setting. The final mortality risk algorithm will be implemented as a web-based calculator that can be used by older adults needing care and by their caregivers. Open cohort study using the Resident Assessment Instrument for Home Care (RAI-HC) data in Ontario, Canada, from 1 January 2007 to 31 December 2013. The derivation cohort will consist of ∼437 000 older adults who had an RAI-HC assessment between 1 January 2007 and 31 December 2012. A split sample validation cohort will include ∼122 000 older adults with an RAI-HC assessment between 1 January and 31 December 2013. Predicted survival from the time of an RAI-HC assessment. All deaths (n≈245 000) will be ascertained through linkage to a population-based registry that is maintained by the Ministry of Health in Ontario. Proportional hazards regression will be estimated after assessment of assumptions. Predictors will include sociodemographic factors, social support, health conditions, functional status, cognition, symptoms of decline and prior healthcare use. Model performance will be evaluated for 6-month and 12-month predicted risks, including measures of calibration (eg, calibration plots) and discrimination (eg, c-statistics). The final algorithm will use combined development and validation data. Research ethics approval has been granted by the Sunnybrook Health Sciences Centre Review Board. Findings will be disseminated through presentations at conferences and in peer-reviewed journals. NCT02779309, Pre-results. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to
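
    The modelling plan (proportional hazards regression, then discrimination and calibration checks) can be sketched with the lifelines library. Everything below is simulated and illustrative; the variables are hypothetical stand-ins, not the RAI-HC predictors or the study's data:

        # Cox proportional hazards fit plus a c-statistic, on simulated
        # home-care data.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.utils import concordance_index

        rng = np.random.default_rng(4)
        n = 5000
        df = pd.DataFrame({
            "age": rng.normal(82, 7, n),
            "adl_impairment": rng.integers(0, 7, n),   # functional status proxy
        })
        hazard = np.exp(0.05 * (df["age"] - 82) + 0.3 * df["adl_impairment"])
        df["time"] = rng.exponential(36.0 / hazard)    # months until death
        df["event"] = (df["time"] < 24).astype(int)
        df.loc[df["event"] == 0, "time"] = 24.0        # administrative censoring

        cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
        # Higher predicted hazard should mean shorter survival, hence the minus
        c = concordance_index(df["time"], -cph.predict_partial_hazard(df), df["event"])
        print(cph.summary[["coef", "exp(coef)"]])
        print(f"c-statistic = {c:.3f}")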

  5. Soft sets combined with interval valued intuitionistic fuzzy sets of type-2 and rough sets

    Directory of Open Access Journals (Sweden)

    Anjan Mukherjee

    2015-03-01

    Fuzzy set theory, rough set theory and soft set theory are all mathematical tools for dealing with uncertainties. The concept of type-2 fuzzy sets was introduced by Zadeh in 1975, and was extended to interval valued intuitionistic fuzzy sets of type-2 by the authors. This paper is devoted to the discussion of combinations of interval valued intuitionistic fuzzy sets of type-2, soft sets and rough sets. Three different types of new hybrid models, namely interval valued intuitionistic fuzzy soft sets of type-2, soft rough interval valued intuitionistic fuzzy sets of type-2 and soft interval valued intuitionistic fuzzy rough sets of type-2, are proposed and their properties are derived.

  6. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2010-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  7. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2011-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  8. Breastfeeding assessment tools

    International Nuclear Information System (INIS)

    Bizouerne, Cécile; Kerac, Marko; Macgrath, Marie

    2014-01-01

    Breastfeeding plays a major role in reducing the global burden of child mortality and under-nutrition. Whilst many programmes aim to support breastfeeding and prevent feeding problems occurring, interventions are also needed once they have developed. In this situation, accurate assessment of a problem is critical to inform prognosis and enable tailored, appropriate treatment. The presentation will present a review which aims to identify breastfeeding assessment tools/checklists for use in assessing malnourished infants in resource-poor settings. The literature review identified 24 breastfeeding assessment tools, and 41 validation studies. Evidence underpinning most of the tools was mainly low quality, and conducted in high-income countries and hospital settings. The presentation will describe the main findings of the literature review and propose recommendations for improving existing tools in order to appropriately assess malnourished infants and enable early, appropriate intervention and treatment of malnutrition. (author)

  9. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    Science.gov (United States)

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Infection prevention and control measures and tools for the prevention of entry of carbapenem-resistant Enterobacteriaceae into healthcare settings: guidance from the European Centre for Disease Prevention and Control

    Directory of Open Access Journals (Sweden)

    A. P. Magiorakos

    2017-11-01

    Background Infections with carbapenem-resistant Enterobacteriaceae (CRE) are increasingly being reported from patients in healthcare settings. They are associated with high patient morbidity, attributable mortality and hospital costs. Patients who are “at-risk” may be carriers of these multidrug-resistant Enterobacteriaceae (MDR-E). The purpose of this guidance is to raise awareness and identify the “at-risk” patient when admitted to a healthcare setting and to outline effective infection prevention and control measures to halt the entry and spread of CRE. Methods The guidance was created by a group of experts who were functioning independently of their organisations, during two meetings hosted by the European Centre for Disease Prevention and Control. A list of epidemiological risk factors placing patients “at-risk” for carriage with CRE was created by the experts. The conclusions of a systematic review on the prevention of spread of CRE, with the addition of expert opinion, were used to construct lists of core and supplemental infection prevention and control measures to be implemented for “at-risk” patients upon admission to healthcare settings. Results Individuals with the following profile are “at-risk” for carriage of CRE: (a) a history of an overnight stay in a healthcare setting in the last 12 months, (b) dialysis dependence or cancer chemotherapy in the last 12 months, (c) known previous carriage of CRE in the last 12 months and (d) epidemiological linkage to a known carrier of CRE. Core infection prevention and control measures that should be considered for all patients in healthcare settings were compiled. Preliminary supplemental measures to be implemented for “at-risk” patients on admission are: pre-emptive isolation, active screening for CRE, and contact precautions. Patients who are confirmed positive for CRE will need additional supplemental measures. Conclusions Strengthening the microbiological

  12. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools

  13. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasedena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  14. A Tool for Creating Regionally Calibrated High-Resolution Land Cover Data Sets for the West African Sahel: Using Machine Learning to Scale Up Hand-Classified Maps in a Data-Sparse Environment

    Science.gov (United States)

    Van Gordon, M.; Van Gordon, S.; Min, A.; Sullivan, J.; Weiner, Z.; Tappan, G. G.

    2017-12-01

    Using support vector machine (SVM) learning and high-accuracy hand-classified maps, we have developed a publicly available land cover classification tool for the West African Sahel. Our classifier produces high-resolution and regionally calibrated land cover maps for the Sahel, representing a significant contribution to the data available for this region. Global land cover products are unreliable for the Sahel, and accurate land cover data for the region are sparse. To address this gap, the U.S. Geological Survey and the Regional Center for Agriculture, Hydrology and Meteorology (AGRHYMET) in Niger produced high-quality land cover maps for the region via hand-classification of Landsat images. This method produces highly accurate maps, but the time and labor required constrain the spatial and temporal resolution of the data products. By using these hand-classified maps alongside SVM techniques, we successfully increase the resolution of the land cover maps by 1-2 orders of magnitude, from 2km-decadal resolution to 30m-annual resolution. These high-resolution regionally calibrated land cover datasets, along with the classifier we developed to produce them, lay the foundation for major advances in studies of land surface processes in the region. These datasets will provide more accurate inputs for food security modeling, hydrologic modeling, analyses of land cover change and climate change adaptation efforts. The land cover classification tool we have developed will be publicly available for use in creating additional West Africa land cover datasets with future remote sensing data and can be adapted for use in other parts of the world.
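
    The core scale-up step, training a support vector machine on pixels from the hand-classified maps and then predicting land cover everywhere else, looks roughly like the sketch below. The six-band spectra, class list and parameters are placeholders, not the USGS/AGRHYMET specification:

        # Train an SVM land-cover classifier on labelled pixels (simulated
        # 6-band spectra), then score held-out pixels.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)
        classes = ["savanna", "cropland", "bare_soil", "water"]
        X = np.vstack([rng.normal(loc=i, scale=0.8, size=(500, 6))
                       for i in range(len(classes))])   # 6 spectral bands per pixel
        y = np.repeat(classes, 500)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
        model.fit(X_tr, y_tr)
        print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
        # model.predict(...) would then be applied to every 30 m pixel of a scene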

  15. Hydraulic release oil tool

    International Nuclear Information System (INIS)

    Mims, M.G.; Mueller, M.D.; Ehlinger, J.C.

    1992-01-01

    This patent describes a hydraulic release tool. It comprises a setting assembly; a coupling member for coupling to drill string or petroleum production components, the coupling member having a plurality of sockets for receiving the dogs in the extended position and attaching the coupling member to the setting assembly, whereby the setting assembly couples to the coupling member by engagement of the dogs in the sockets and releases from and disengages the coupling member on movement of the piston from its setting position to its release position in response to a pressure in the body bore exceeding the predetermined pressure; and a relief port from outside the body into its bore, and means to prevent communication between the relief port and the bore of the body axially of the piston when the piston is in the setting position and to establish such communication upon movement of the piston from the setting position to the release position, reducing the pressure in the body bore axially of the piston, whereby the reduction of the pressure signals that the tool has released the coupling member

  16. Camelias: a tool for the management of a complex set of γ spectrometry stations for the on-line monitoring of a reactor severe accident simulation in the framework of the Phebus FP programme

    International Nuclear Information System (INIS)

    Pantera, L.; Cornu, B.

    2003-01-01

    During a PHEBUS PF experiment, the on-line fission product release measurement is carried out by a set of seven γ stations supervised by a computer network. These stations generate about 20000 spectra for each experiment. These spectra are obtained under complex experimental conditions (high count rate, constrained nuclear environment and a large diversity of configurations). The objective is to perform a quantitative γ-spectrometry analysis. The possibly complex use of self-attenuation models as well as the large volume of data to be treated led us to consider a flexible and generic method to manage the whole set of stations. The γ station calibration is carried out on-site. For each case, the developed methodology strictly uses the same approach for data collection during the calibration phase, for calculations and for the software environment: this is the CAMELIAS application. The on-site calibration methodology is based on the use of a linear reference source, allowing the 3D calculations to be reduced to 2D calculations to obtain the reference efficiency function and the collimation function. Additional calculations then allow computing the efficiency function associated with the actual source. To centralize and guarantee the coherence of the large volume of data collected during the calibration phase, we use a relational database management system. The computation of the self-attenuation coefficients associated with the actual source is performed by considering a simple model of self-attenuation along a straight line. The method, which can be generalized, is also applied to the delayed analysis of samplings and experimental devices of all the programmes managed by the laboratory. (authors)

  17. Authoring Tools

    Science.gov (United States)

    Treviranus, Jutta

    Authoring tools that are accessible and that enable authors to produce accessible Web content play a critical role in web accessibility. Widespread use of authoring tools that comply to the W3C Authoring Tool Accessibility Guidelines (ATAG) would ensure that even authors who are neither knowledgeable about nor particularly motivated to produce accessible content do so by default. The principles and techniques of ATAG are discussed. Some examples of accessible authoring tools are described including authoring tool content management components such as TinyMCE. Considerations for creating an accessible collaborative environment are also covered. As part of providing accessible content, the debate between system-based personal optimization and one universally accessible site configuration is presented. The issues and potential solutions to address the accessibility crisis presented by the advent of rich internet applications are outlined. This challenge must be met to ensure that a large segment of the population is able to participate in the move toward the web as a two-way communication mechanism.

  18. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  19. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
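
    The similarity measure described, the peak of the normalized cross-correlation between two waveform time-series, can be sketched as follows; the synthetic traces and the clustering threshold are invented, and this is not Dendro Tool's own code.

      import numpy as np

      def max_normalized_xcorr(a, b):
          """Peak normalized cross-correlation of two equal-length traces."""
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          return np.max(np.correlate(a, b, mode="full"))

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 1.0, 500)
      ev1 = np.sin(2 * np.pi * 12 * t) * np.exp(-3 * t)     # synthetic event
      ev2 = np.roll(ev1, 25) + rng.normal(0, 0.05, t.size)  # shifted, noisy copy
      print(max_normalized_xcorr(ev1, ev2))  # close to 1 -> likely same source

      # Pairs whose peak correlation exceeds a threshold (say 0.8) can be
      # grouped into clusters of similar events.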

  20. Computability and Representations of the Zero Set

    NARCIS (Netherlands)

    P.J. Collins (Pieter)

    2008-01-01

    In this note we give a new representation for closed sets under which the robust zero set of a function is computable. We call this representation the component cover representation. The computation of the zero set is based on topological index theory, the most powerful tool for finding...

  1. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements serves primarily two purposes: (i) to improve the hardenability and (ii) to provide harder and thermally more stable carbides than cementite. Assuming proper heat treatment, the properties of a tool steel depend on which alloying elements are added and their respective concentrations.

  2. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  3. Design tools

    Science.gov (United States)

    Anton TenWolde; Mark T. Bomberg

    2009-01-01

    Overall, despite the lack of exact input data, the use of design tools, including models, is much superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can be made only for buildings in the same climate, with similar occupancy, and similar envelope construction. This chapter...

  4. The histogramming tool hparse

    International Nuclear Information System (INIS)

    Nikulin, V.; Shabratova, G.

    2005-01-01

    A general-purpose package aimed at simplifying histogramming in data analysis is described. The proposed dedicated language for writing histogramming scripts provides an effective and flexible tool for defining a complicated histogram set. The script is more transparent and much easier to maintain than the corresponding C++ code. In TTree analysis it can be a good complement to the TTreeViewer class: the TTreeViewer is used to choose the required histogram/cut set, while hparse generates code for the systematic analysis

  5. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  6. Long Term Care Minimum Data Set (MDS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Long-Term Care Minimum Data Set (MDS) is a standardized, primary screening and assessment tool of health status that forms the foundation of the comprehensive...

  7. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  8. UpSet: Visualization of Intersecting Sets

    Science.gov (United States)

    Lex, Alexander; Gehlenborg, Nils; Strobelt, Hendrik; Vuillemot, Romain; Pfister, Hanspeter

    2016-01-01

    Understanding relationships between sets is an important analysis task that has received widespread attention in the visualization community. The major challenge in this context is the combinatorial explosion of the number of set intersections if the number of sets exceeds a trivial threshold. In this paper we introduce UpSet, a novel visualization technique for the quantitative analysis of sets, their intersections, and aggregates of intersections. UpSet focuses on creating task-driven aggregates, communicating the size and properties of aggregates and intersections, and on the duality between the visualization of the elements in a dataset and their set membership. UpSet visualizes set intersections in a matrix layout and introduces aggregates based on groupings and queries. The matrix layout enables the effective representation of associated data, such as the number of elements in the aggregates and intersections, as well as additional summary statistics derived from subset or element attributes. Sorting according to various measures enables a task-driven analysis of relevant intersections and aggregates. The elements represented in the sets and their associated attributes are visualized in a separate view. Queries based on containment in specific intersections or aggregates, or driven by attribute filters, are propagated between both views. We also introduce several advanced visual encodings and interaction methods to overcome the problems of varying scales and to address scalability. UpSet is web-based and open source. We demonstrate its general utility in multiple use cases from various domains. PMID:26356912
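
    The aggregation at the heart of an UpSet-style matrix, counting elements by their exact set-membership pattern, can be sketched in a few lines; the three sets are hypothetical.

      from itertools import chain

      sets = {
          "A": {"e1", "e2", "e3", "e5"},
          "B": {"e2", "e3", "e4"},
          "C": {"e3", "e5", "e6"},
      }

      universe = set(chain.from_iterable(sets.values()))
      patterns = {}
      for element in universe:
          membership = frozenset(n for n, s in sets.items() if element in s)
          patterns.setdefault(membership, set()).add(element)

      # One row per membership pattern, largest first: the matrix rows.
      for membership, elems in sorted(patterns.items(), key=lambda p: -len(p[1])):
          print(sorted(membership), len(elems))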

  9. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  10. Set-Pi: Set Membership pi-Calculus

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Mödersheim, Sebastian Alexander; Nielson, Flemming

    2015-01-01

    Communication protocols often rely on stateful mechanisms to ensure certain security properties. For example, counters and timestamps can be used to ensure authentication, or the security of communication can depend on whether a particular key is registered to a server or has been revoked. ProVerif, like other state-of-the-art tools for protocol analysis, achieves good performance by converting a formal protocol specification into a set of Horn clauses that represent a monotonically growing set of facts that a Dolev-Yao attacker can derive from the system. Since this set of facts is not state... The authors demonstrate the method with three examples: a simple authentication protocol based on counters, a key registration protocol, and a model of the Yubikey security device.

  11. Study of emergency setting for urban facility using microsimulation tool

    Science.gov (United States)

    Campisi, Tiziana; Canale, Antonino; Tesoriere, Giovanni

    2017-11-01

    Today public transport is growing not only in terms of passenger capacity but also in efficiency, and it has become one of the preferred alternatives to automobile travel. This is evident, for example, in the operation and management of airport terminals, and the same holds for roadway bus transport stations. As a result, many railway stations experience high levels of pedestrian congestion, especially during the morning and afternoon peak periods. Traditional design and evaluation procedures for pedestrian transit facilities aim to maintain a desirable Pedestrian Level-Of-Service (PLOS) for individual pedestrian areas or sub-precincts. More generally, transit facilities and their sub-precincts interact with one another, so pedestrian circulation might be better assessed from a broader systems perspective. Microsimulation packages that can model pedestrians (e.g. VISSIM-VISWALK) can be employed to assess these interactions. This research outlines a procedure for the potential implementation of pedestrian flow analysis in a bus/rail transit station using micro-simulation. Base model data requirements are identified, which include static data (facility layout and locations of temporary equipment) and dynamic data (pedestrian demand and public transport services). Possible model calibration criteria are also identified. A VISSIM micro-simulation base model is developed for one of the main airport terminals in Sicily (Italy) to investigate proposed station operational and infrastructure changes. This case study provides a good example of the potential implementation of micro-simulation models in the analysis of pedestrian circulation.

  12. Minimal tool set for a prokaryotic circadian clock.

    Science.gov (United States)

    Schmelling, Nicolas M; Lehmann, Robert; Chaudhury, Paushali; Beck, Christian; Albers, Sonja-Verena; Axmann, Ilka M; Wiegard, Anika

    2017-07-21

    Circadian clocks are found in organisms of almost all domains, including photosynthetic Cyanobacteria, whereby large diversity exists within the protein components involved. In the model cyanobacterium Synechococcus elongatus PCC 7942, circadian rhythms are driven by a unique KaiABC protein clock, which is embedded in a network of input and output factors. Proteins homologous to the KaiABC clock have been observed in Bacteria and Archaea, where evidence for circadian behavior in these domains is accumulating. However, the interaction and function of non-cyanobacterial Kai proteins as well as homologous input and output components remain mainly unclear. Using universal BLAST analyses, we identified putative KaiC-based timing systems in organisms outside as well as variations within Cyanobacteria. A systematic analysis of publicly available microarray data elucidated interesting variations in circadian gene expression between different cyanobacterial strains, which might be correlated to the diversity of genome-encoded clock components. Based on statistical analyses of co-occurrences of the clock components homologous to Synechococcus elongatus PCC 7942, we propose putative networks of reduced and fully functional clock systems. Further, we studied KaiC sequence conservation to determine functionally important regions of diverged KaiC homologs. Biochemical characterization of exemplary cyanobacterial KaiC proteins as well as homologs from two thermophilic Archaea demonstrated that kinase activity is always present. However, a KaiA-mediated phosphorylation is only detectable in KaiC1 orthologs. Our analysis of 11,264 genomes clearly demonstrates that components of the Synechococcus elongatus PCC 7942 circadian clock are present in Bacteria and Archaea. However, all components are less abundant in organisms other than Cyanobacteria, and KaiA, Pex, LdpA, and CdpA are only present in the latter. Thus, only reduced KaiBC-based or even simpler, solely KaiC-based timing systems might exist outside of the cyanobacterial phylum, which might be capable of driving diurnal oscillations.

  13. Fuzzy sets, rough sets, multisets and clustering

    CERN Document Server

    Dahlbom, Anders; Narukawa, Yasuo

    2017-01-01

    This book is dedicated to Prof. Sadaaki Miyamoto and presents cutting-edge papers in some of the areas to which he contributed. Bringing together contributions by leading researchers in the field, it concretely addresses clustering, multisets, rough sets and fuzzy sets, as well as their applications in areas such as decision-making. The book is divided into four parts, the first of which focuses on clustering and classification. The second part puts the spotlight on multisets, bags, fuzzy bags and other fuzzy extensions, while the third deals with rough sets. Rounding out the coverage, the last part explores fuzzy sets and decision-making.

  14. Tools for Supporting Distributed Agile Project Planning

    Science.gov (United States)

    Wang, Xin; Maurer, Frank; Morgan, Robert; Oliveira, Josyleuda

    Agile project planning plays an important part in agile software development. In distributed settings, project planning is severely impacted by the lack of face-to-face communication and the inability to share paper index cards amongst all meeting participants. To address these issues, several distributed agile planning tools were developed. The tools vary in features, functions and running platforms. In this chapter, we first summarize the requirements for distributed agile planning. Then we give an overview of existing agile planning tools. We also evaluate existing tools based on tool requirements. Finally, we present some practical advice for both designers and users of distributed agile planning tools.

  15. SETS reference manual

    International Nuclear Information System (INIS)

    Worrell, R.B.

    1985-05-01

    The Set Equation Transformation System (SETS) is used to achieve the symbolic manipulation of Boolean equations. Symbolic manipulation involves changing equations from their original forms into more useful forms - particularly by applying Boolean identities. The SETS program is an interpreter which reads, interprets, and executes SETS user programs. The user writes a SETS user program specifying the processing to be achieved and submits it, along with the required data, for execution by SETS. Because of the general nature of SETS, i.e., the capability to manipulate Boolean equations regardless of their origin, the program has been used for many different kinds of analysis
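
    SETS itself is a dedicated interpreter, but the kind of symbolic manipulation it performs, reducing Boolean equations by applying identities such as absorption and idempotence, can be illustrated with SymPy's logic module; the example equation is invented.

      from sympy import symbols
      from sympy.logic.boolalg import And, Or, simplify_logic

      A, B, C = symbols("A B C")

      # A small fault-tree-style top equation in sum-of-products form
      top = Or(And(A, B), And(A, B, C), And(A, C))

      # Absorption removes the redundant term A & B & C:
      print(simplify_logic(top, form="dnf"))  # -> (A & B) | (A & C)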

  16. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  17. Spectral dimension in causal set quantum gravity

    International Nuclear Information System (INIS)

    Eichhorn, Astrid; Mizera, Sebastian

    2014-01-01

    We evaluate the spectral dimension in causal set quantum gravity by simulating random walks on causal sets. In contrast to other approaches to quantum gravity, we find an increasing spectral dimension at small scales. This observation can be connected to the nonlocality of causal set theory that is deeply rooted in its fundamentally Lorentzian nature. Based on its large-scale behaviour, we conjecture that the spectral dimension can serve as a tool to distinguish causal sets that approximate manifolds from those that do not. As a new tool to probe quantum spacetime in different quantum gravity approaches, we introduce a novel dimensional estimator, the causal spectral dimension, based on the meeting probability of two random walkers, which respect the causal structure of the quantum spacetime. We discuss a causal-set example, where the spectral dimension and the causal spectral dimension differ, due to the existence of a preferred foliation. (paper)
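
    The random-walk estimate of the spectral dimension rests on the return-probability scaling P_return(t) ~ t^(-d_s/2). The sketch below runs the walk on a 2D lattice for simplicity (on a causal set the steps would follow the causal links instead) and recovers d_s ≈ 2.

      import numpy as np

      rng = np.random.default_rng(1)
      steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

      def return_probability(t, walkers=100_000):
          moves = steps[rng.integers(0, 4, size=(walkers, t))]
          return np.mean(~np.any(moves.sum(axis=1), axis=1))  # back at origin

      t1, t2 = 10, 20
      p1, p2 = return_probability(t1), return_probability(t2)
      d_s = -2 * (np.log(p2) - np.log(p1)) / (np.log(t2) - np.log(t1))
      print(f"estimated spectral dimension: {d_s:.2f}")  # ~2 on the 2D lattice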

  18. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
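
    The frequency response measure in NERC BAL-003-1 is expressed in MW per 0.1 Hz, computed from pre-event (Value A) and settling (Value B) readings. A minimal sketch with invented event numbers, only assumed to mirror the standard's definition:

      def frm_mw_per_0p1hz(nia_a_mw, nia_b_mw, f_a_hz, f_b_hz):
          """Change in Net Actual Interchange over change in frequency,
          scaled to MW per 0.1 Hz (negative by convention for a balancing
          authority that increases output as frequency falls)."""
          return (nia_b_mw - nia_a_mw) / ((f_b_hz - f_a_hz) / 0.1)

      # e.g. an external 450 MW generation loss: this BA's interchange rises
      # 45 MW while frequency settles from 60.000 Hz to 59.964 Hz
      print(frm_mw_per_0p1hz(1000.0, 1045.0, 60.000, 59.964))  # -> -125.0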

  19. Sets in Coq, Coq in Sets

    Directory of Open Access Journals (Sweden)

    Bruno Barras

    2010-01-01

    Full Text Available This work is about formalizing models of various type theories of the Calculus of Constructions family. Here we focus on set theoretical models. The long-term goal is to build a formal set theoretical model of the Calculus of Inductive Constructions, so we can be sure that Coq is consistent with the language used by most mathematicians. One aspect of this work is to axiomatize several set theories: ZF, possibly with inaccessible cardinals, and HF, the theory of hereditarily finite sets. On top of these theories we have developed a piece of the usual set theoretical construction of functions, ordinals and fixpoint theory. We then proved sound several models of the Calculus of Constructions, its extension with an infinite hierarchy of universes, and its extension with the inductive type of natural numbers where recursion follows the type-based termination approach. The other aspect is to try and discharge (most of) these assumptions. The goal here is rather to compare the theoretical strengths of all these formalisms. As already noticed by Werner, the replacement axiom of ZF in its general form seems to require a type-theoretical axiom of choice (TTAC).

  20. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  1. Tools and data services registry

    DEFF Research Database (Denmark)

    Ison, Jon; Rapacki, Kristoffer; Ménager, Hervé

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a...

  2. Contingency diagrams as teaching tools

    OpenAIRE

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  3. Non-commutative tools for topological insulators

    International Nuclear Information System (INIS)

    Prodan, Emil

    2010-01-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.

  4. Invariant sets for Windows

    CERN Document Server

    Morozov, Albert D; Dragunov, Timothy N; Malysheva, Olga V

    1999-01-01

    This book deals with the visualization and exploration of invariant sets (fractals, strange attractors, resonance structures, patterns etc.) for various kinds of nonlinear dynamical systems. The authors have created a special Windows 95 application called WInSet, which allows one to visualize the invariant sets. A WInSet installation disk is enclosed with the book.The book consists of two parts. Part I contains a description of WInSet and a list of the built-in invariant sets which can be plotted using the program. This part is intended for a wide audience with interests ranging from dynamical

  5. Hierarchical sets: analyzing pangenome structure through scalable set visualizations

    Science.gov (United States)

    2017-01-01

    Motivation: The increase in available microbial genome sequences has resulted in an increase in the size of the pangenomes being analyzed. Current pangenome visualizations are not intended for the pangenome sizes possible today, and new approaches are necessary in order to convert the increase in available information into an increase in knowledge. As the pangenome data structure is essentially a collection of sets, we explore the potential of scalable set visualization as a tool for pangenome analysis. Results: We present a new hierarchical clustering algorithm based on set arithmetic that optimizes the intersection sizes along the branches. The intersection and union sizes along the hierarchy are visualized using a composite dendrogram and icicle plot, which, in a pangenome context, shows the evolution of pangenome and core size along the evolutionary hierarchy. Outlying elements, i.e. elements whose presence patterns do not correspond with the hierarchy, can be visualized using hierarchical edge bundles. When applied to pangenome data this plot shows putative horizontal gene transfers between the genomes and can highlight relationships between genomes that are not represented by the hierarchy. We illustrate the utility of hierarchical sets by applying it to a pangenome based on 113 Escherichia and Shigella genomes and find that it provides a powerful addition to pangenome analysis. Availability and Implementation: The described clustering algorithm and visualizations are implemented in the hierarchicalSets R package available from CRAN (https://cran.r-project.org/web/packages/hierarchicalSets). Contact: thomasp85@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28130242

  6. The GEDI Performance Tool

    Science.gov (United States)

    Hancock, S.; Armston, J.; Tang, H.; Patterson, P. L.; Healey, S. P.; Marselis, S.; Duncanson, L.; Hofton, M. A.; Kellner, J. R.; Luthcke, S. B.; Sun, X.; Blair, J. B.; Dubayah, R.

    2017-12-01

    NASA's Global Ecosystem Dynamics Investigation will mount a multi-track, full-waveform lidar on the International Space Station (ISS) that is optimised for the measurement of forest canopy height and structure. GEDI will use ten laser tracks, two 10 mJ "power beams" and eight 5 mJ "coverage beams" to produce global (51.5°S to 51.5°N) maps of above ground biomass (AGB), canopy height, vegetation structure and other biophysical parameters. The mission has a requirement to generate a 1 km AGB map with 80% of pixels with ≤ 20% standard error or 20 Mg·ha⁻¹, whichever is greater. To assess performance and compare to mission requirements, an end-to-end simulator has been developed. The simulator brings together tools to propagate the effects of measurement and sampling error on GEDI data products. The simulator allows us to evaluate the impact of instrument performance, ISS orbits, processing algorithms and losses of data that may occur due to clouds, snow, leaf-off conditions, and areas with an insufficient signal-to-noise ratio (SNR). By evaluating the consequences of operational decisions on GEDI data products, this tool provides a quantitative framework for decision-making and mission planning. Here we demonstrate the performance tool by using it to evaluate the trade-off between measurement and sampling error on the 1 km AGB data product. Results demonstrate that the use of coverage beams during the day (lowest GEDI SNR case) over very dense forests (>95% canopy cover) will result in some measurement bias. Omitting these low SNR cases increased the sampling error. Through this, an SNR threshold for a given expected canopy cover can be set. The other applications of the performance tool are also discussed, such as assessing the impact of decisions made in the AGB modelling and signal processing stages on the accuracy of final data products.

  7. Hypergraphs combinatorics of finite sets

    CERN Document Server

    Berge, C

    1989-01-01

    Graph Theory has proved to be an extremely useful tool for solving combinatorial problems in such diverse areas as Geometry, Algebra, Number Theory, Topology, Operations Research and Optimization. It is natural to attempt to generalise the concept of a graph, in order to attack additional combinatorial problems. The idea of looking at a family of sets from this standpoint took shape around 1960. In regarding each set as a "generalised edge" and in calling the family itself a "hypergraph", the initial idea was to try to extend certain classical results of Graph Theory such as the theorems of Turán and König. It was noticed that this generalisation often led to simplification; moreover, one single statement, sometimes remarkably simple, could unify several theorems on graphs. This book presents what seems to be the most significant work on hypergraphs.

  8. Value Set Authority Center

    Data.gov (United States)

    U.S. Department of Health & Human Services — The VSAC provides downloadable access to all official versions of vocabulary value sets contained in the 2014 Clinical Quality Measures (CQMs). Each value set...

  9. Settings for Suicide Prevention

    Science.gov (United States)


  10. Tool path in torus tool CNC machining

    Directory of Open Access Journals (Sweden)

    XU Ying

    2016-10-01

    Full Text Available This paper is about the tool path in torus tool CNC machining. The mathematical model of the torus tool is established. The tool path planning algorithm is determined through calculation of the cutter location, boundary discretization, calculation of adjacent tool paths and so on; according to the conversion formula, the cutter contact points are converted to cutter location points, and these points are fitted to a tool path. Lastly, the path planning algorithm is implemented in Matlab. The cutter location points for the torus tool are calculated by Matlab and fitted to a tool path. Using UG software, another tool path is simulated for the same free-surface data. Comparing the two tool paths shows that using the torus tool is more efficient.
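
    A common textbook form of the cutter-contact to cutter-location conversion for a torus mill, sketched in Python rather than Matlab and not necessarily the paper's exact formula: with cutter radius R, corner radius r, a vertical tool axis, and a unit surface normal n pointing toward the tool, the corner-arc center is CC + r·n, and the tool tip follows by stepping back along the horizontal part of n and down the axis.

      import numpy as np

      def cc_to_cl(cc, n, R, r, axis=np.array([0.0, 0.0, 1.0])):
          n = n / np.linalg.norm(n)
          tube_center = cc + r * n              # center of the corner-radius arc
          h = n - np.dot(n, axis) * axis        # horizontal component of normal
          h = h / np.linalg.norm(h)             # assumes n not parallel to axis
          return tube_center - (R - r) * h - r * axis   # tool-tip (CL) point

      cc = np.array([10.0, 5.0, 2.0])           # hypothetical contact point
      n = np.array([0.3, 0.0, 0.954])           # hypothetical surface normal
      print(cc_to_cl(cc, n, R=5.0, r=1.0))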

  11. Alternate superior Julia sets

    International Nuclear Information System (INIS)

    Yadav, Anju; Rani, Mamta

    2015-01-01

    Alternate Julia sets have been studied in Picard iterative procedures. The purpose of this paper is to study the quadratic and cubic maps using superior iterates to obtain Julia sets with different alternate structures. Analytically, graphically and computationally it has been shown that alternate superior Julia sets can be connected, disconnected and totally disconnected, and also fattier than the corresponding alternate Julia sets. A few examples have been studied by applying different type of alternate structures
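
    A minimal escape-time sketch of the construction studied here, a superior (Mann) iteration that alternates between two quadratic maps z² + c1 and z² + c2; the parameters beta, c1 and c2 are invented, and the code only illustrates the iteration scheme.

      import numpy as np

      def escape_time(z, c1, c2, beta=0.7, max_iter=100, bailout=4.0):
          for k in range(max_iter):
              c = c1 if k % 2 == 0 else c2               # alternate the maps
              z = (1.0 - beta) * z + beta * (z * z + c)  # superior iterate
              if abs(z) > bailout:
                  return k                               # escaped the set
          return max_iter                                # retained (approx.)

      ys, xs = np.mgrid[-1.5:1.5:200j, -1.5:1.5:200j]
      grid = xs + 1j * ys
      img = np.vectorize(lambda z: escape_time(z, -0.7 + 0.3j, 0.28 + 0.01j))(grid)
      # 'img' can be rendered with matplotlib imshow to visualize the set.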

  12. Performative Tools and Collaborative Learning

    DEFF Research Database (Denmark)

    Minder, Bettina; Lassen, Astrid Heidemann

    The use of performative tools can support collaborative learning across knowledge domains (i.e. science and practice), because they create new spaces for dialog. However, so far innovation literature provides few answers to the important discussion of how to describe the effects and requirements of performative tools used in transdisciplinary events for collaborative learning. The results of this single case study add to extant knowledge and learning literature by providing the reader with a rich description of characteristics and learning functions of performative tools in transdisciplinary events and a description of how they interrelate with the specific setting of such an event. Furthermore, they complement previous findings by relating performative tools to collaborative learning for knowledge-intensive ideas.

  13. Sets, Planets, and Comets

    Science.gov (United States)

    Baker, Mark; Beltran, Jane; Buell, Jason; Conrey, Brian; Davis, Tom; Donaldson, Brianna; Detorre-Ozeki, Jeanne; Dibble, Leila; Freeman, Tom; Hammie, Robert; Montgomery, Julie; Pickford, Avery; Wong, Justine

    2013-01-01

    Sets in the game "Set" are lines in a certain four-dimensional space. Here we introduce planes into the game, leading to interesting mathematical questions, some of which we solve, and to a wonderful variation on the game "Set," in which every tableau of nine cards must contain at least one configuration for a player to pick up.
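
    The "lines in a four-dimensional space" view makes Sets easy to test in code: encode each card as a vector in (Z_3)^4, and three cards are collinear, i.e. form a Set, exactly when their coordinatewise sum is 0 mod 3. A small sketch over a hypothetical tableau of nine cards:

      from itertools import combinations

      def is_set(a, b, c):
          return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))

      # nine hypothetical cards, each of the four attributes coded 0/1/2
      tableau = [(0, 0, 0, 0), (1, 1, 1, 1), (2, 2, 2, 2),
                 (0, 1, 2, 0), (1, 2, 0, 1), (2, 0, 1, 2),
                 (0, 0, 1, 2), (1, 0, 2, 2), (2, 1, 0, 1)]

      trios = [t for t in combinations(tableau, 3) if is_set(*t)]
      print(len(trios), "Sets in this tableau")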

  14. Axiomatic set theory

    CERN Document Server

    Suppes, Patrick

    1972-01-01

    This clear and well-developed approach to axiomatic set theory is geared toward upper-level undergraduates and graduate students. It examines the basic paradoxes and history of set theory and advanced topics such as relations and functions, equipollence, finite sets and cardinal numbers, rational and real numbers, and other subjects. 1960 edition.

  15. Paired fuzzy sets

    DEFF Research Database (Denmark)

    Rodríguez, J. Tinguaro; Franco de los Ríos, Camilo; Gómez, Daniel

    2015-01-01

    In this paper we want to stress the relevance of paired fuzzy sets, as already proposed in previous works of the authors, as a family of fuzzy sets that offers a unifying view for different models based upon the opposition of two fuzzy sets, simply allowing the existence of different types...

  16. A strategy to improve priority setting in developing countries.

    Science.gov (United States)

    Kapiriri, Lydia; Martin, Douglas K

    2007-09-01

    Because the demand for health services outstrips the available resources, priority setting is one of the most difficult issues faced by health policy makers, particularly those in developing countries. Priority setting in developing countries is fraught with uncertainty due to lack of credible information, weak priority setting institutions, and unclear priority setting processes. Efforts to improve priority setting in these contexts have focused on providing information and tools. In this paper we argue that priority setting is a value laden and political process, and although important, the available information and tools are not sufficient to address the priority setting challenges in developing countries. Additional complementary efforts are required. Hence, a strategy to improve priority setting in developing countries should also include: (i) capturing current priority setting practices, (ii) improving the legitimacy and capacity of institutions that set priorities, and (iii) developing fair priority setting processes.

  17. Elements of set theory

    CERN Document Server

    Enderton, Herbert B

    1977-01-01

    This is an introductory undergraduate textbook in set theory. In mathematics these days, essentially everything is a set. Some knowledge of set theory is a necessary part of the background everyone needs for further study of mathematics. It is also possible to study set theory for its own interest--it is a subject with intriguing results about simple objects. This book starts with material that nobody can do without. There is no end to what can be learned of set theory, but here is a beginning.

  18. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

    Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessments to be made without the expenses associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in over 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once out of the reactor. This system retrieves samples from the tool; dries, weighs and places them in labelled vials, which are then directed into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  19. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    ...automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this, ...conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.

  20. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    ...vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning of a new sustainable settlement. The use of design tools is discussed in relation to innovation and stakeholder participation, and it is stressed that the usefulness of design tools is context dependent.

  1. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  2. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  3. Poster Abstract: Towards NILM for Industrial Settings

    DEFF Research Database (Denmark)

    Holmegaard, Emil; Kjærgaard, Mikkel Baun

    2015-01-01

    Industry consumes a large share of the worldwide electricity consumption. Disaggregated information about electricity consumption enables better decision-making and feedback tools to optimize electricity consumption. In industrial settings electricity loads consist of a variety of equipment, which... consumption for six months, at an industrial site. In this poster abstract we provide initial results for how industrial equipment challenges NILM algorithms. These results thereby open up for evaluating the use of NILM in industrial settings.

  4. Applications of Soft Sets in K-Algebras

    Directory of Open Access Journals (Sweden)

    N. O. Alshehri

    2013-01-01

    Full Text Available In 1999, Molodtsov introduced the concept of soft set theory as a general mathematical tool for dealing with uncertainty and vagueness. In this paper, we apply the concept of soft sets to K-algebras and investigate some properties of Abelian soft K-algebras. We also introduce the concept of soft intersection K-algebras and investigate some of their properties.

  5. Ceramic-bonded abrasive grinding tools

    Science.gov (United States)

    Holcombe, Jr., Cressie E.; Gorin, Andrew H.; Seals, Roland D.

    1994-01-01

    Abrasive grains such as boron carbide, silicon carbide, alumina, diamond, cubic boron nitride, and mullite are combined with a cement primarily comprised of zinc oxide and a reactive liquid setting agent and solidified into abrasive grinding tools. Such grinding tools are particularly suitable for grinding and polishing stone, such as marble and granite.

  6. Ceramic-bonded abrasive grinding tools

    Science.gov (United States)

    Holcombe, C.E. Jr.; Gorin, A.H.; Seals, R.D.

    1994-11-22

    Abrasive grains such as boron carbide, silicon carbide, alumina, diamond, cubic boron nitride, and mullite are combined with a cement primarily comprised of zinc oxide and a reactive liquid setting agent and solidified into abrasive grinding tools. Such grinding tools are particularly suitable for grinding and polishing stone, such as marble and granite.

  7. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  8. Compilation Tool Chains and Intermediate Representations

    DEFF Research Database (Denmark)

    Mottin, Julien; Pacull, François; Keryell, Ronan

    2014-01-01

    In SMECY, we believe that an efficient tool chain could only be defined when the type of parallelism required by an application domain and the hardware architecture is fixed. Furthermore, we believe that once a set of tools is available, it is possible with reasonable effort to change hardware ar...

  9. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is to both provide an industrial-strength tool set for VDM and also to provide an environment...

  10. Motif enrichment tool.

    Science.gov (United States)

    Blatti, Charles; Sinha, Saurabh

    2014-07-01

    The Motif Enrichment Tool (MET) provides an online interface that enables users to find major transcriptional regulators of their gene sets of interest. MET searches the appropriate regulatory region around each gene and identifies which transcription factor DNA-binding specificities (motifs) are statistically overrepresented. Motif enrichment analysis is currently available for many metazoan species including human, mouse, fruit fly, planaria and flowering plants. MET also leverages high-throughput experimental data such as ChIP-seq and DNase-seq from ENCODE and ModENCODE to identify the regulatory targets of a transcription factor with greater precision. The results from MET are produced in real time and are linked to a genome browser for easy follow-up analysis. Use of the web tool is free and open to all, and there is no login requirement. ADDRESS: http://veda.cs.uiuc.edu/MET/. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
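
    The overrepresentation statistic behind motif enrichment is commonly a hypergeometric (one-sided Fisher) test; the sketch below uses invented counts and is only assumed to resemble MET's internal calculation.

      from scipy.stats import hypergeom

      background_genes = 20000   # genes with scanned regulatory regions
      background_hits = 1500     # background genes containing the motif
      set_size = 300             # genes in the set of interest
      set_hits = 45              # genes in the set containing the motif

      # P(X >= set_hits) when drawing set_size genes from the background
      p_value = hypergeom.sf(set_hits - 1, background_genes,
                             background_hits, set_size)
      print(f"enrichment p-value: {p_value:.3g}")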

  11. SIRTF Tools for DIRT

    Science.gov (United States)

    Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.

    2004-07-01

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS {http://dustem.astro.umd.edu}) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF. The models are based on the dust radiation transfer code of Wolfire & Cassinelli (1986) which accounts for multiple grain sizes and compositions. The model outputs are averaged over the instrument bands using the same weighting (νFν = constant) as the SIRTF data pipeline which allows the SIRTF data products to be compared directly with the model database. This work was supported in part by a NASA AISRP grant NAG 5-10751 and the SIRTF Legacy Science Program provided by NASA through an award issued by JPL under NASA contract 1407.

  12. The power tool

    International Nuclear Information System (INIS)

    HAYFIELD, J.P.

    1999-01-01

    POWER Tool--Planning, Optimization, Waste Estimating and Resourcing tool, a hand-held field estimating unit and relational database software tool for optimizing disassembly and final waste form of contaminated systems and equipment

  13. A course on Borel sets

    CERN Document Server

    Srivastava, S M

    1998-01-01

    The roots of Borel sets go back to the work of Baire [8]. He was trying to come to grips with the abstract notion of a function introduced by Dirichlet and Riemann. According to them, a function was to be an arbitrary correspondence between objects without giving any method or procedure by which the correspondence could be established. Since all the specific functions that one studied were determined by simple analytic expressions, Baire delineated those functions that can be constructed starting from continuous functions and iterating the operation of pointwise limit on a sequence of functions. These functions are now known as Baire functions. Lebesgue [65] and Borel [19] continued this work. In [19], Borel sets were defined for the first time. In his paper, Lebesgue made a systematic study of Baire functions and introduced many tools and techniques that are used even today. Among other results, he showed that Borel functions coincide with Baire functions. The study of Borel sets got an impetus from...

  14. User manual for storage simulation construction set

    International Nuclear Information System (INIS)

    Sehgal, Anil; Volz, Richard A.

    1999-01-01

    The Storage Simulation Construction Set (SSCS) is a tool for composing storage system models using Telegrip. It is an application written in C++ and Motif. With this system, models of a storage system can be composed rapidly and accurately. The aspects of the SSCS are described within this report

  15. The neutron porosity tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1988-01-01

    The report contains a review of available information on neutron porosity tools with the emphasis on dual thermal-neutron-detector porosity tools and epithermal-neutron-detector porosity tools. The general principle of such tools is discussed and theoretical models are very briefly reviewed. Available data on tool designs are summarized with special regard to the source-detector distance. Tool operational data, porosity determination and correction of measurements are briefly discussed. (author) 15 refs

  16. Preset pivotal tool holder

    Science.gov (United States)

    Asmanes, Charles

    1979-01-01

    A tool fixture is provided for precise pre-alignment of a radiused-edge cutting tool in a tool holder relative to a fixed reference pivot point established on said holder, about which the tool holder may be selectively pivoted relative to the fixture base member to change the contact point of the tool cutting edge with a workpiece while maintaining precisely the same tool cutting radius relative to the reference pivot point.

  17. LHCb online infrastructure monitoring tools

    International Nuclear Information System (INIS)

    Granado Cardoso, L.; Gaspar, C.; Haen, C.; Neufeld, N.; Varela, F.; Galli, D.

    2012-01-01

    The Online System of the LHCb experiment at CERN is composed of a very large number of PCs: around 1500 in a CPU farm for performing the High Level Trigger; around 170 for the control system, running the SCADA system - PVSS; and several others for performing data monitoring, reconstruction, storage, and infrastructure tasks, like databases, etc. Some PCs run Linux, some run Windows but all of them need to be remotely controlled and monitored to make sure they are correctly running and to be able, for example, to reboot them whenever necessary. A set of tools was developed in order to centrally monitor the status of all PCs and PVSS Projects needed to run the experiment: a Farm Monitoring and Control (FMC) tool, which provides the lower level access to the PCs, and a System Overview Tool (developed within the Joint Controls Project - JCOP), which provides a centralized interface to the FMC tool and adds PVSS project monitoring and control. The implementation of these tools has provided a reliable and efficient way to manage the system, both during normal operations as well as during shutdowns, upgrades or maintenance operations. This paper will present the particular implementation of this tool in the LHCb experiment and the benefits of its usage in a large scale heterogeneous system

  18. Haar meager sets revisited

    Czech Academy of Sciences Publication Activity Database

    Doležal, Martin; Rmoutil, M.; Vejnar, B.; Vlasák, V.

    2016-01-01

    Roč. 440, č. 2 (2016), s. 922-939 ISSN 0022-247X Institutional support: RVO:67985840 Keywords : Haar meager set * Haar null set * Polish group Subject RIV: BA - General Mathematics Impact factor: 1.064, year: 2016 http://www.sciencedirect.com/science/article/pii/S0022247X1600305X

  19. Setting goals in psychotherapy

    DEFF Research Database (Denmark)

    Emiliussen, Jakob; Wagoner, Brady

    2013-01-01

    The present study is concerned with the ethical dilemmas of setting goals in therapy. The main questions that it aims to answer are: who is to set the goals for therapy and who is to decide when they have been reached? The study is based on four semi-structured, phenomenological interviews...

  20. Pseudo-set framing.

    Science.gov (United States)

    Barasz, Kate; John, Leslie K; Keenan, Elizabeth A; Norton, Michael I

    2017-10-01

    Pseudo-set framing-arbitrarily grouping items or tasks together as part of an apparent "set"-motivates people to reach perceived completion points. Pseudo-set framing changes gambling choices (Study 1), effort (Studies 2 and 3), giving behavior (Field Data and Study 4), and purchase decisions (Study 5). These effects persist in the absence of any reward, when a cost must be incurred, and after participants are explicitly informed of the arbitrariness of the set. Drawing on Gestalt psychology, we develop a conceptual account that predicts what will-and will not-act as a pseudo-set, and defines the psychological process through which these pseudo-sets affect behavior: over and above typical reference points, pseudo-set framing alters perceptions of (in)completeness, making intermediate progress seem less complete. In turn, these feelings of incompleteness motivate people to persist until the pseudo-set has been fulfilled. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Descriptive set theory

    CERN Document Server

    Moschovakis, YN

    1987-01-01

    Now available in paperback, this monograph is a self-contained exposition of the main results and methods of descriptive set theory. It develops all the necessary background material from logic and recursion theory, and treats both classical descriptive set theory and the effective theory developed by logicians.

  2. Possibility Fuzzy Soft Set

    Directory of Open Access Journals (Sweden)

    Shawkat Alkhazaleh

    2011-01-01

    Full Text Available We introduce the concept of possibility fuzzy soft set and its operation and study some of its properties. We give applications of this theory in solving a decision-making problem. We also introduce a similarity measure of two possibility fuzzy soft sets and discuss their application in a medical diagnosis problem.
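    The similarity measure referred to above admits many concrete forms; as a hedged illustration, here is a minimal Python sketch of one generic distance-based similarity between two fuzzy membership assignments. It is not necessarily the measure defined in the paper, and the universe and membership grades below are hypothetical.

```python
# Minimal sketch: a generic similarity measure between two fuzzy
# membership assignments over the same universe. This is an
# illustrative distance-based measure, not necessarily the one
# defined in the paper above.

def fuzzy_similarity(mu_a, mu_b):
    """Return a similarity in [0, 1]: 1 minus the mean absolute
    difference of membership grades over the universe."""
    assert mu_a.keys() == mu_b.keys(), "sets must share a universe"
    diffs = [abs(mu_a[x] - mu_b[x]) for x in mu_a]
    return 1.0 - sum(diffs) / len(diffs)

# Two fuzzy sets over the hypothetical universe {u1, u2, u3}
a = {"u1": 0.2, "u2": 0.9, "u3": 0.5}
b = {"u1": 0.3, "u2": 0.8, "u3": 0.5}
print(fuzzy_similarity(a, b))  # -> 0.933...
```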

  3. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  4. Large Data Set Mining

    NARCIS (Netherlands)

    Leemans, I.B.; Broomhall, Susan

    2017-01-01

    Digital emotion research has yet to make history. Until now large data set mining has not been a very active field of research in early modern emotion studies. This is indeed surprising since first, the early modern field has such rich, copyright-free, digitized data sets and second, emotion studies

  5. "Ready, Set, FLOW!"

    Science.gov (United States)

    Stroud, Wesley

    2018-01-01

    All educators want their classrooms to be inviting areas that support investigations. However, a common mistake is to fill learning spaces with items or objects that are set up by the teacher or are simply "for show." This type of setting, although it may create a comfortable space for students, fails to stimulate investigations and…

  6. Economic communication model set

    Science.gov (United States)

    Zvereva, Olga M.; Berg, Dmitry B.

    2017-06-01

    This paper details findings from research targeted at the investigation of economic communications using agent-based models. The agent-based model set was engineered to simulate economic communications. Money in the form of internal and external currencies was introduced into the models to support exchanges in communications. Every model, being based on the general concept, has its own peculiarities in algorithm and input data set, since it was engineered to solve a specific problem. Several data sets of different origins were used in the experiments: theoretical sets were estimated on the basis of the static Leontief equilibrium equation, and the real set was constructed on the basis of statistical data. During the simulation experiments, the communication process was observed in dynamics and system macroparameters were estimated. This research confirmed that the combination of an agent-based and a mathematical model can cause a synergetic effect.
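    For reference, the static Leontief equilibrium mentioned above is standard input-output economics; a sketch in the conventional notation follows (the symbols are the usual textbook ones and are not taken from the paper).

```latex
% Static Leontief input-output equilibrium, standard notation:
% x = vector of gross outputs, A = matrix of technical
% coefficients, d = vector of final demand.
\[
  x = A x + d
  \qquad\Longrightarrow\qquad
  x = (I - A)^{-1} d ,
\]
% where $(I - A)^{-1}$ is the Leontief inverse, well defined
% when the spectral radius of $A$ is less than one.
```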

  7. Theory of random sets

    CERN Document Server

    Molchanov, Ilya

    2017-01-01

    This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...

  8. Tools for Understanding Identity

    Energy Technology Data Exchange (ETDEWEB)

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael; Hodges, Duncan; Kim, Dee DH; Love, Oriana J.; Nurse, Jason R.; Pike, William A.; Scholtz, Jean

    2013-12-28

    Identity attribution and enrichment is critical to many aspects of law enforcement and intelligence gathering; this identity typically spans a number of domains in the natural world, such as biographic information (factual information, e.g. names, addresses), biometric information (e.g. fingerprints) and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface exposed. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use case would be the inference of real names from usernames; the 'path' created by inferring new elements of identity is highlighted in the 'critical information' panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively, the entire realizable 'capability' can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper: the first is a working prototype; the second is expected to reach prototype by July 2013. Understanding the Model: the tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take...
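    As a hedged illustration of the directed-graph model described above, here is a minimal Python sketch in which identity elements are nodes, inferences are directed edges, and attribution exercises are paths. All element names are hypothetical, and multi-antecedent inferences are simplified to single-antecedent edges.

```python
# Minimal sketch of the identity-inference model described above:
# identity elements are nodes, inferences are directed edges from
# an antecedent element to a concluded element. All names below are
# hypothetical illustrations, not data from the paper.
from collections import defaultdict, deque

edges = defaultdict(list)          # antecedent -> conclusions
edges["forum_username"] += ["real_name"]
edges["real_name"] += ["home_address"]
edges["home_address"] += ["telephone_number"]

def attribution_paths(start, goal):
    """Enumerate inference paths (breadth-first) from one identity
    element to another; each path is one attribution exercise."""
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            yield path
            continue
        for nxt in edges[path[-1]]:
            if nxt not in path:    # avoid revisiting elements
                queue.append(path + [nxt])

print(list(attribution_paths("forum_username", "telephone_number")))
```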

  9. Method for automation of tool preproduction

    Science.gov (United States)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency, tool availability and quality of the obtained surfaces with minimum means and resources spent on it. It takes the staff engaged in tool preparation much time to make a correct selection of the appropriate tool among the set of variants. Program software has been developed to solve this problem, which helps to create, systematize and carry out a comparative analysis of tool designs in order to identify the rational variant under given production conditions. The literature indicates that systematization and selection of the rational tool design have been carried out in accordance with the developed modeling technology and comparative design analysis. The software application makes it possible to reduce the design period by 80-85% and to obtain a significant annual saving.

  10. Construct Maps as a Foundation for Standard Setting

    Science.gov (United States)

    Wyse, Adam E.

    2013-01-01

    Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…

  11. The tools of mathematical reasoning

    CERN Document Server

    Lakins, Tamara J

    2016-01-01

    This accessible textbook gives beginning undergraduate mathematics students a first exposure to introductory logic, proofs, sets, functions, number theory, relations, finite and infinite sets, and the foundations of analysis. The book provides students with a quick path to writing proofs and a practical collection of tools that they can use in later mathematics courses such as abstract algebra and analysis. The importance of the logical structure of a mathematical statement as a framework for finding a proof of that statement, and the proper use of variables, is an early and consistent theme used throughout the book.

  12. Wilmar Planning Tool, VBA documentation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

    This is a documentation of the VBA (Visual Basic for Applications) in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell various control parameters are set, and then a macro in the Input Database is run that writes input files for the Joint market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  13. Severe accident management guidelines tool

    International Nuclear Information System (INIS)

    Gutierrez Varela, Javier; Tanarro Onrubia, Augustin; Martinez Fanegas, Rafael

    2014-01-01

    Severe accidents are addressed by means of a great number of documents such as guidelines, calculation aids and diagnostic trees. The response methodology often requires the use of several documents at the same time, while Technical Support Centre members need to assess the appropriate set of equipment within the adequate mitigation strategies. In order to facilitate the response, TECNATOM has developed SAMG TOOL, initially named GGAS TOOL, an easy-to-use computer program that clearly improves and accelerates severe accident management. The software is designed with powerful features that allow the users to focus on the decision-making process. Consequently, SAMG TOOL significantly improves severe accident training, ensuring a better response in a real situation. The software is already installed in several Spanish Nuclear Power Plants, and trainees claim that the methodology can be followed more easily with it, especially because guidelines, calculation aids, equipment information and strategy availability can be accessed immediately (authors)

  14. Wilmar Planning Tool, VBA documentation

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

    This is a documentation of the VBA (Visual Basic for Applications) in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell various control parameters are set, and then a macro in the Input Database is run that writes input files for the Joint market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  15. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  16. Set theory essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Set Theory includes elementary logic, sets, relations, functions, denumerable and non-denumerable sets, cardinal numbers, Cantor's theorem, axiom of choice, and order relations.

  17. Setting Places at the Table

    Directory of Open Access Journals (Sweden)

    James R. Briscoe

    2016-12-01

    Full Text Available A recent survey by the National Endowment for the Arts found that only 2% of Americans listen to “Classical” music with regularity, and fewer practice or play art or historic music even once in a year. The rotating kaleidoscope of new technologies, repertories, interpretation, and cultural values can become not an ultimate bewilderment, a nail in the coffin of art and historic music, but a powerful tool for revitalizing how it engages persons of all age groups and how it can broaden its understanding. The table of musical places we set can respond to the narrative we carefully conceive for any condition at hand, for the student or scholar or layperson we address, for an intentional kaleidoscope of presentations. Such an attitude might let the other 98% discover art and historic music and see their lives mirrored and bettered.

  18. Geometrical setting of solid mechanics

    International Nuclear Information System (INIS)

    Fiala, Zdenek

    2011-01-01

    Highlights: Solid mechanics within the Riemannian symmetric manifold GL(3,R)/O(3,R); generalized logarithmic strain; consistent linearization; incremental principle of virtual power; time-discrete approximation. Abstract: The starting point in the geometrical setting of solid mechanics is to represent the deformation process of a solid body as a trajectory in a convenient space with Riemannian geometry, and then to use the corresponding tools for its analysis. Based on virtual power of internal stresses, we show that such a configuration space is the (globally) symmetric space of symmetric positive-definite real matrices. From this unifying point of view, we shall analyse the logarithmic strain, the stress rate, as well as linearization and intrinsic integration of the corresponding evolution equation.
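    For context, the logarithmic (Hencky) strain the abstract refers to is conventionally defined from the deformation gradient; a sketch in standard continuum-mechanics notation follows (the paper's own generalization may differ in detail).

```latex
% Logarithmic (Hencky) strain in standard notation:
% F = deformation gradient, C = F^T F the right Cauchy-Green
% tensor (symmetric positive-definite), U = C^{1/2} the right
% stretch tensor.
\[
  E_{\mathrm{log}}
  = \ln U
  = \tfrac{1}{2}\,\ln\!\left(F^{\mathsf{T}} F\right),
\]
% defined via the matrix logarithm on the symmetric
% positive-definite tensor $C$, i.e. on a point of the
% configuration space $GL(3,\mathbb{R})/O(3,\mathbb{R})$
% discussed in the abstract.
```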

  19. Lebesgue Sets Immeasurable Existence

    Directory of Open Access Journals (Sweden)

    Diana Marginean Petrovai

    2012-12-01

    Full Text Available It is well known that the notions of measure and integral emerged early, in close connection with practical problems of measuring geometric figures. The notion of measure was outlined in the early 20th century through the research of H. Lebesgue, founder of the modern theory of measure and integral. A technique of integration of functions was developed concurrently. Gradually a specific area was formed, today called measure and integral theory. Essential contributions to building this theory were made by a large number of mathematicians: C. Carathéodory, J. Radon, O. Nikodym, S. Bochner, J. Pettis, P. Halmos and many others. In the following we present several abstract sets and classes of sets. There exist sets which are not Lebesgue measurable, and sets which are Lebesgue measurable but not Borel measurable. Hence B ⊂ L ⊂ P(X).

  20. Leadership set-up

    DEFF Research Database (Denmark)

    Thude, Bettina Ravnborg; Stenager, Egon; von Plessen, Christian

    2018-01-01

    Findings: The study found that the leadership set-up did not have any clear influence on interdisciplinary cooperation, as all wards had a high degree of interdisciplinary cooperation independent of which leadership set-up they had. Instead, the authors found a relation between leadership set-up and leader ... could influence legitimacy. Originality/value: The study shows that leadership set-up is not the predominant factor that creates interdisciplinary cooperation; rather, leader legitimacy should also be considered. Additionally, the study shows that leader legitimacy can be difficult to establish and that it cannot be taken for granted. This is something chief executive officers should bear in mind when they plan and implement new leadership structures. Therefore, it would also be useful to look more closely at how to achieve legitimacy in cases where the leader is from a different profession to the staff.

  1. General Paleoclimatology Data Sets

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Data of past climate and environment derived from unusual proxy evidence. Parameter keywords describe what was measured in this data set. Additional summary...

  2. Dutch Risk Assessment tools

    NARCIS (Netherlands)

    Venema, A.

    2015-01-01

    ‘Risico-Inventarisatie- en Evaluatie-instrumenten’ is the name for the Dutch risk assessment (RA) tools. An RA tool can be used to perform a risk assessment, including an evaluation of the identified risks. These tools were among the first online risk assessment tools developed in Europe. The

  3. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments and analysis of engineering design and design problem solving methods, and the developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  4. Set theory and physics

    Energy Technology Data Exchange (ETDEWEB)

    Svozil, K. [Univ. of Technology, Vienna (Austria)

    1995-11-01

    Inasmuch as physical theories are formalizable, set theory provides a framework for theoretical physics. Four speculations about the relevance of set theoretical modeling for physics are presented: the role of transcendental set theory (i) in chaos theory, (ii) for paradoxical decompositions of solid three-dimensional objects, (iii) in the theory of effective computability (Church-Turing thesis) related to the possible "solution of supertasks," and (iv) for weak solutions. Several approaches to set theory and their advantages and disadvantages for physical applications are discussed: Cantorian "naive" (i.e., nonaxiomatic) set theory, constructivism, and operationalism. In the author's opinion, an attitude of "suspended attention" (a term borrowed from psychoanalysis) seems most promising for progress. Physical and set theoretical entities must be operationalized wherever possible. At the same time, physicists should be open to "bizarre" or "mindboggling" new formalisms, which need not be operationalizable or testable at the time of their creation, but which may successfully lead to novel fields of phenomenology and technology.

  5. Setting conservation priorities.

    Science.gov (United States)

    Wilson, Kerrie A; Carwardine, Josie; Possingham, Hugh P

    2009-04-01

    A generic framework for setting conservation priorities based on the principles of classic decision theory is provided. This framework encapsulates the key elements of any problem, including the objective, the constraints, and knowledge of the system. Within the context of this framework the broad array of approaches for setting conservation priorities are reviewed. While some approaches prioritize assets or locations for conservation investment, it is concluded here that prioritization is incomplete without consideration of the conservation actions required to conserve the assets at particular locations. The challenges associated with prioritizing investments through time in the face of threats (and also spatially and temporally heterogeneous costs) can be aided by proper problem definition. Using the authors' general framework for setting conservation priorities, multiple criteria can be rationally integrated and where, how, and when to invest conservation resources can be scheduled. Trade-offs are unavoidable in priority setting when there are multiple considerations, and budgets are almost always finite. The authors discuss how trade-offs, risks, uncertainty, feedbacks, and learning can be explicitly evaluated within their generic framework for setting conservation priorities. Finally, they suggest ways that current priority-setting approaches may be improved.

  6. New QC 7 tools

    International Nuclear Information System (INIS)

    1982-03-01

    This book covers the new QC 7 tools, including TQC and how the new QC 7 tools support it; the QC way of thinking; what the new QC 7 tools are, such as the KJ method, the PDPC method, the arrow diagram method and the matrix diagram method; the application of the new QC 7 tools, including their fields of application and their application to policy management; the methods of the new QC 7 tools, including related regulations, the KJ method, matrix data analysis and the PDPC method; and the education and introduction of the new QC 7 tools.

  7. Automated Experiments on Ad Privacy Settings

    Directory of Open Access Journals (Sweden)

    Datta Amit

    2015-04-01

    Full Text Available To partly address people’s concerns over web tracking, Google has created the Ad Settings webpage to provide information about and some choice over the profiles Google creates on users. We present AdFisher, an automated tool that explores how user behaviors, Google’s ads, and Ad Settings interact. AdFisher can run browser-based experiments and analyze data using machine learning and significance tests. Our tool uses a rigorous experimental design and statistical analysis to ensure the statistical soundness of our results. We use AdFisher to find that the Ad Settings was opaque about some features of a user’s profile, that it does provide some choice on ads, and that these choices can lead to seemingly discriminatory ads. In particular, we found that visiting webpages associated with substance abuse changed the ads shown but not the settings page. We also found that setting the gender to female resulted in getting fewer instances of an ad related to high paying jobs than setting it to male. We cannot determine who caused these findings due to our limited visibility into the ad ecosystem, which includes Google, advertisers, websites, and users. Nevertheless, these results can form the starting point for deeper investigations by either the companies themselves or by regulatory bodies.
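    AdFisher's analysis rests on randomized group assignment and statistical significance testing; as a generic sketch of that kind of test (not AdFisher's implementation; the ad counts below are hypothetical), here is a simple permutation test in Python.

```python
# Generic permutation test of the kind AdFisher's analysis relies on
# (a sketch, not AdFisher's code): is the difference in mean ad counts
# between two randomly assigned browser groups statistically significant?
import random

def permutation_test(group_a, group_b, trials=10000, seed=0):
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / trials      # permutation p-value

# Hypothetical counts of a job-related ad shown to two groups
print(permutation_test([1, 0, 2, 1, 0], [4, 5, 3, 6, 4]))
```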

  8. Quantum mechanics over sets

    Science.gov (United States)

    Ellerman, David

    2014-03-01

    In models of QM over finite fields (e.g., Schumacher's "modal quantum theory" MQT), one finite field stands out, Z2, since Z2 vectors represent sets. QM (finite-dimensional) mathematics can be transported to sets, resulting in quantum mechanics over sets, or QM/sets. This gives a full probability calculus (unlike MQT with only zero-one modalities) that leads to a fulsome theory of QM/sets including "logical" models of the double-slit experiment, Bell's Theorem, QIT, and QC. In QC over Z2 (where gates are non-singular matrices as in MQT), a simple quantum algorithm (one gate plus one function evaluation) solves the Parity SAT problem (finding the parity of the sum of all values of an n-ary Boolean function). Classically, the Parity SAT problem requires 2^n function evaluations, in contrast to the one function evaluation required in the quantum algorithm. This is quantum speedup, but with all the calculations over Z2, just like classical computing. This shows definitively that the source of quantum speedup is not in the greater power of computing over the complex numbers, and confirms the idea that the source is in superposition.
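    To make the classical cost concrete, here is a brute-force Python sketch of the Parity SAT problem as stated above: the parity of the sum of all values of an n-ary Boolean function, which classically requires evaluating the function on all 2^n inputs (the example function is hypothetical).

```python
# Brute-force classical Parity SAT: parity of the sum of all values
# of an n-ary Boolean function f. This requires 2**n evaluations of
# f, in contrast to the single evaluation in the quantum algorithm
# over Z2 described above.
from itertools import product

def parity_sat(f, n):
    total = 0
    for bits in product((0, 1), repeat=n):   # all 2**n inputs
        total ^= f(*bits)                    # XOR accumulates the parity
    return total

# Example: the 3-ary majority function; it evaluates to 1 on 4 of the
# 8 inputs, so the parity of the sum is 0.
maj = lambda a, b, c: int(a + b + c >= 2)
print(parity_sat(maj, 3))   # -> 0
```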

  9. The Model Confidence Set

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.

    The paper introduces the model confidence set (MCS) and applies it to the selection of models. A MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS ..., beyond the comparison of models. We apply the MCS procedure to two empirical problems. First, we revisit the inflation forecasting problem posed by Stock and Watson (1999), and compute the MCS for their set of inflation forecasts. Second, we compare a number of Taylor rule regressions and determine the MCS of the best in terms of in-sample likelihood criteria.

  10. Revitalizing the setting approach

    DEFF Research Database (Denmark)

    Bloch, Paul; Toft, Ulla; Reinbach, Helene Christine

    2014-01-01

    Background: The concept of health promotion rests on aspirations aiming at enabling people to increase control over and improve their health. Health promotion action is facilitated in settings such as schools, homes and work places. As a contribution to the promotion of healthy lifestyles, we have further developed the setting approach in an effort to harmonise it with contemporary realities (and complexities) of health promotion and public health action. The paper introduces a modified concept, the supersetting approach, which builds on the optimised use of diverse and valuable resources embedded in local community settings and on the strengths of social interaction and local ownership as drivers of change processes. Interventions based on a supersetting approach are first and foremost characterised by being integrated, but also participatory, empowering, context-sensitive and knowledge-based. The supersetting approach is based on ecological and whole-systems thinking, and stipulates important principles and values of integration, participation, empowerment, context and knowledge-based development.

  11. Set theory and logic

    CERN Document Server

    Stoll, Robert R

    1979-01-01

    Set Theory and Logic is the result of a course of lectures for advanced undergraduates, developed at Oberlin College for the purpose of introducing students to the conceptual foundations of mathematics. Mathematics, specifically the real number system, is approached as a unity whose operations can be logically ordered through axioms. One of the most complex and essential of modern mathematical innovations, the theory of sets (crucial to quantum mechanics and other sciences), is introduced in a most careful manner, aiming for the maximum in clarity and stimulation for further study in

  12. Nonmeasurable sets and functions

    CERN Document Server

    Kharazishvili, Alexander

    2004-01-01

    The book is devoted to various constructions of sets which are nonmeasurable with respect to invariant (more generally, quasi-invariant) measures. Our starting point is the classical Vitali theorem stating the existence of subsets of the real line which are not measurable in the Lebesgue sense. This theorem stimulated the development of the following interesting topics in mathematics: 1. Paradoxical decompositions of sets in finite-dimensional Euclidean spaces; 2. The theory of non-real-valued-measurable cardinals; 3. The theory of invariant (quasi-invariant) extensions of invariant (quasi-invaria

  13. Why quasi-sets?

    Directory of Open Access Journals (Sweden)

    Décio Krause

    2002-11-01

    Full Text Available Quasi-set theory was developed to deal with collections of indistinguishable objects. In standard mathematics, there are no such kinds of entities, for indistinguishability (agreement with respect to all properties) entails numerical identity. The main motivation underlying such a theory is of course quantum physics, for collections of indistinguishable ('identical' in the physicists' jargon) particles cannot be regarded as 'sets' of standard set theories, which are collections of distinguishable objects. In this paper, a rationale for the development of such a theory is presented, motivated by Heinz Post's claim that indistinguishability of quantum entities should be attributed 'right at the start'.

  14. Combinatorics of finite sets

    CERN Document Server

    Anderson, Ian

    2011-01-01

    Coherent treatment provides comprehensive view of basic methods and results of the combinatorial study of finite set systems. The Clements-Lindstrom extension of the Kruskal-Katona theorem to multisets is explored, as is the Greene-Kleitman result concerning k-saturated chain partitions of general partially ordered sets. Connections with Dilworth's theorem, the marriage problem, and probability are also discussed. Each chapter ends with a helpful series of exercises and outline solutions appear at the end. ""An excellent text for a topics course in discrete mathematics."" - Bulletin of the Ame

  15. The Vicinity of Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    Program documentation plays a vital role in almost all programming processes. Program documentation flows between separate tools of a modularized environment, and in between the components of an integrated development environment as well. In this paper we discuss the flow of program documentation between program development tools. In the central part of the paper we introduce a mapping of documentation flow between program development tools. In addition we discuss a set of locally developed tools which is related to program documentation. The use of test cases as examples in an interface documentation tool is a noteworthy and valuable contribution to the documentation flow. As an additional contribution we identify several circular relationships which illustrate feedback of documentation to the program editor from other tools in the development environment.

  16. Nanocomposites for Machining Tools

    Directory of Open Access Journals (Sweden)

    Daria Sidorenko

    2017-10-01

    Full Text Available Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the elaboration of composite materials for machining tools can reduce the financial and time costs for development of new tools, with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance.

  17. Tool grinding machine

    Science.gov (United States)

    Dial, Sr., Charles E.

    1980-01-01

    The present invention relates to an improved tool grinding mechanism for grinding single-point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface. This displacement is monitored so that any variation in the grinding of the cutting surface, such as that caused by crystal orientation or tool thickness, may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  18. Improved tool grinding machine

    Science.gov (United States)

    Dial, C.E. Sr.

    The present invention relates to an improved tool grinding mechanism for grinding single-point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface. This displacement is monitored so that any variation in the grinding of the cutting surface, such as that caused by crystal orientation or tool thicknesses, may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  19. Separate tools or tool kits: an exploratory study of engineers' preferences

    NARCIS (Netherlands)

    Vliegen, Ingrid; Kleingeld, P.A.M.; van Houtum, Geert-Jan

    2010-01-01

    This paper describes an exploratory study of aspects that should be taken into consideration when optimizing tool kits, i.e. cases containing sets of tools that are used by service engineers for corrective maintenance. The study was carried out among service engineers of an Original Equipment

  20. Prices and Price Setting

    NARCIS (Netherlands)

    R.P. Faber (Riemer)

    2010-01-01

    This thesis studies price data and tries to unravel the underlying economic processes of why firms have chosen these prices. It focuses on three aspects of price setting. First, it studies whether the existence of a suggested price has a coordinating effect on the prices of firms.

  1. Cobham recursive set functions

    Czech Academy of Sciences Publication Activity Database

    Beckmann, A.; Buss, S.; Friedman, S.-D.; Müller, M.; Thapen, Neil

    2016-01-01

    Roč. 167, č. 3 (2016), s. 335-369 ISSN 0168-0072 R&D Projects: GA ČR GBP202/12/G061 Institutional support: RVO:67985840 Keywords : set function * polynomial time * Cobham recursion Subject RIV: BA - General Mathematics Impact factor: 0.647, year: 2016 http://www.sciencedirect.com/science/article/pii/S0168007215001293

  2. SET-Routes programme

    CERN Multimedia

    Marietta Schupp, EMBL Photolab

    2008-01-01

    Dr Sabine Hentze, specialist in human genetics, giving an Insight Lecture entitled "Human Genetics – Diagnostics, Indications and Ethical Issues" on 23 September 2008 at EMBL Heidelberg. Activities in a school in Budapest during a visit of Angela Bekesi, an ambassador for the SET-Routes programme.

  3. The Crystal Set

    Science.gov (United States)

    Greenslade, Thomas B., Jr.

    2014-01-01

    In past issues of this journal, the late H. R. Crane wrote a long series of articles under the running title of "How Things Work." In them, Dick dealt with many questions that physics teachers asked themselves, but did not have the time to answer. This article is my attempt to work through the physics of the crystal set, which I thought…

  4. State-set branching

    DEFF Research Database (Denmark)

    Jensen, Rune Møller; Veloso, Manuela M.; Bryant, Randal E.

    2008-01-01

    In this article, we present a framework called state-set branching that combines symbolic search based on reduced ordered Binary Decision Diagrams (BDDs) with best-first search, such as A* and greedy best-first search. The framework relies on an extension of these algorithms from expanding a sing...

  5. Generalized rough sets

    International Nuclear Information System (INIS)

    Rady, E.A.; Kozae, A.M.; Abd El-Monsef, M.M.E.

    2004-01-01

    The process of analyzing data under uncertainty is a main goal for many real-life problems. Statistical analysis for such data is an area of interest for research. The aim of this paper is to introduce a new method concerning the generalization and modification of the rough set theory introduced earlier by Pawlak [Int. J. Comput. Inform. Sci. 11 (1982) 341].

  6. Therapists in Oncology Settings

    Science.gov (United States)

    Hendrick, Susan S.

    2013-01-01

    This article describes the author's experiences of working with cancer patients/survivors both individually and in support groups for many years, across several settings. It also documents current best-practice guidelines for the psychosocial treatment of cancer patients/survivors and their families. The author's view of the important qualities…

  7. Autocatalytic sets in a partitioned biochemical network.

    Science.gov (United States)

    Smith, Joshua I; Steel, Mike; Hordijk, Wim

    2014-01-01

    In previous work, RAF theory has been developed as a tool for making theoretical progress on the origin of life question, providing insight into the structure and occurrence of self-sustaining and collectively autocatalytic sets within catalytic polymer networks. We present here an extension in which there are two "independent" polymer sets, where catalysis occurs within and between the sets, but there are no reactions combining polymers from both sets. Such an extension reflects the interaction between nucleic acids and peptides observed in modern cells and proposed forms of early life. We present theoretical work and simulations which suggest that the occurrence of autocatalytic sets is robust to the partitioned structure of the network. We also show that autocatalytic sets remain likely even when the molecules in the system are not polymers, and a low level of inhibition is present. Finally, we present a kinetic extension which assigns a rate to each reaction in the system, and show that identifying autocatalytic sets within such a system is an NP-complete problem. Recent experimental work has challenged the necessity of an RNA world by suggesting that peptide-nucleic acid interactions occurred early in chemical evolution. The present work indicates that such a peptide-RNA world could support the spontaneous development of autocatalytic sets and is thus a feasible alternative worthy of investigation.
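    For readers unfamiliar with RAF theory, the classic detection procedure it builds on can be sketched compactly: iteratively discard reactions that are uncatalyzed or unsupported by the closure of the food set, until a fixed point. The Python sketch below is a minimal illustration under that description; the toy reaction network is hypothetical.

```python
# Minimal sketch of the classic RAF detection procedure referenced
# above: repeatedly discard reactions that are uncatalyzed or whose
# reactants are unreachable from the food set, until a fixed point.

def closure(food, reactions):
    """Molecules reachable from the food set using the given reactions."""
    mols = set(food)
    changed = True
    while changed:
        changed = False
        for reactants, products, _ in reactions:
            if set(reactants) <= mols and not set(products) <= mols:
                mols |= set(products)
                changed = True
    return mols

def max_raf(food, reactions):
    """Largest subset of reactions that is reflexively autocatalytic
    and food-generated (maxRAF); may be empty."""
    current = list(reactions)
    while True:
        mols = closure(food, current)
        kept = [r for r in current
                if set(r[0]) <= mols                 # reactants available
                and any(c in mols for c in r[2])]    # catalyzed
        if len(kept) == len(current):
            return kept
        current = kept

# (reactants, products, catalysts) -- a hypothetical toy network in
# which the two reactions catalyze each other from food {f1, f2}
rxns = [(("f1", "f2"), ("p1",), ("p2",)),
        (("f1", "p1"), ("p2",), ("p1",))]
print(max_raf({"f1", "f2"}, rxns))   # both reactions form a RAF
```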

  8. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
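    A simplified sketch of the EVT idea mentioned above: fit a Weibull distribution to the low tail of positive-class scores near the decision boundary and treat its CDF at a test score as an estimate of the probability of class inclusion. This is in the spirit of the W-SVM, not its actual implementation (the real algorithm involves distance transforms and normalization), and the scores below are synthetic.

```python
# Simplified sketch of EVT-based score calibration in the spirit of
# the W-SVM described above: fit a Weibull to the scores nearest the
# decision boundary and use its CDF as an inclusion probability.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
pos_scores = rng.normal(2.0, 0.5, 500)     # hypothetical classifier scores

tail = np.sort(pos_scores)[:50]            # extremes near the boundary
shift = tail.min() - 1e-6                  # make the tail strictly positive
c, loc, scale = weibull_min.fit(tail - shift, floc=0.0)

def p_inclusion(score):
    """Estimated probability that `score` belongs to the positive class."""
    return float(weibull_min.cdf(score - shift, c, loc=loc, scale=scale))

print(p_inclusion(2.5), p_inclusion(0.5))  # high vs. low inclusion
```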

  9. Formula student suspension setup and laptime simulation tool

    NARCIS (Netherlands)

    van den Heuvel, E.; Besselink, I.J.M.; Nijmeijer, H.

    2013-01-01

    In motorsports, time is usually limited. With the use of dedicated tools for measuring wheel alignment, camber, ride heights etc., setting up the car can be done quickly and consistently. With the setup sequence and tools described in this report, progress has been made in the time it takes to set up the car.

  10. Developing a verification tool for calculations dissemination through COBAYA

    International Nuclear Information System (INIS)

    Sabater Alcaraz, A.; Rucabado Rucabado, G.; Cuervo Gomez, D.; Garcia Herranz, N.

    2014-01-01

    The development of a software tool that automates the comparison of results with previous versions of the code and with results obtained using higher-accuracy models is crucial for implementing new functionalities in the code. The work presented here has been the generation of this tool and of the set of reference cases that make up the aforementioned matrix. (Author)

  11. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools, based on the Analytic Hierarchy Process (AHP), was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
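    As background on the method named above: the core AHP computation derives priority weights from a reciprocal pairwise-comparison matrix via its principal eigenvector and checks consistency. The Python sketch below illustrates that core step only (not the study's tool set); the comparison values are hypothetical.

```python
# Core AHP computation, sketched: priority weights from a reciprocal
# pairwise-comparison matrix via the principal eigenvector, plus
# Saaty's consistency ratio. Not the study's tool set itself.
import numpy as np

# Hypothetical pairwise comparisons of three criteria
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)             # principal eigenpair
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
print(w, ci / ri)                       # weights and consistency ratio
```

A consistency ratio below about 0.1 is conventionally taken as acceptable.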

  12. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cǎtǎlin NAE

    2010-03-01

    Full Text Available This paper is dedicated to a very specific type of analysis tool (VLM - Vortex Lattice Method) to be integrated in an IDS - Integrated Design System - tailored for the needs of the small aircraft industry. The major interest is to have the possibility to simulate, at very low computational cost, a preliminary set of aerodynamic characteristics: basic global characteristics (Lift, Drag, Pitching Moment) and aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This work enables fast investigations of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM method, the reliability of the analysis may be increased, so that a first (iteration zero) aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is basic state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility to integrate it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (IP, EU FP6).

  13. Tools of online Marketing

    OpenAIRE

    Hossain, M. S.; Rahman, M. F.

    2017-01-01

    Abstract Online marketing is a crucial issue in the modern marketing era, but no previous research had identified the tools of internet marketing before this study; it was the first study in the field of online marketing tools. This research was descriptive in nature, and it attempted to identify the major tools of internet marketing from the concepts of traditional marketing tools. The worldwide network known as the Internet can exchange information between use...

  14. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  15. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstation.

  16. Nanocomposites for Machining Tools

    DEFF Research Database (Denmark)

    Sidorenko, Daria; Loginov, Pavel; Mishnaevsky, Leon

    2017-01-01

    Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials...

  17. Hesitant fuzzy sets theory

    CERN Document Server

    Xu, Zeshui

    2014-01-01

    This book provides the readers with a thorough and systematic introduction to hesitant fuzzy theory. It presents the most recent research results and advanced methods in the field. These include: hesitant fuzzy aggregation techniques, hesitant fuzzy preference relations, hesitant fuzzy measures, hesitant fuzzy clustering algorithms and hesitant fuzzy multi-attribute decision making methods. Since its introduction by Torra and Narukawa in 2009, hesitant fuzzy sets have become more and more popular and have been used for a wide range of applications, from decision-making problems to cluster analysis, from medical diagnosis to personnel appraisal and information retrieval. This book offers a comprehensive report on the state-of-the-art in hesitant fuzzy sets theory and applications, aiming at becoming a reference guide for both researchers and practitioners in the area of fuzzy mathematics and other applied research fields (e.g. operations research, information science, management science and engineering) chara...

  18. Frame scaling function sets and frame wavelet sets in Rd

    International Nuclear Information System (INIS)

    Liu Zhanwei; Hu Guoen; Wu Guochang

    2009-01-01

    In this paper, we classify frame wavelet sets and frame scaling function sets in higher dimensions. Firstly, we obtain a necessary condition for a set to be a frame wavelet set. Then, we present a necessary and sufficient condition for a set to be a frame scaling function set. We give a property of frame scaling function sets, too. Corresponding examples are given to illustrate our theory in each section.

  19. Pickering tool management system

    International Nuclear Information System (INIS)

    Wong, E.H.; Green, A.H.

    1997-01-01

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Today's legal requirements specify that all employers must have a process in place to ensure that tools are maintained in a safe condition; this is specified in the Ontario Health and Safety Act. The Pickering Tool Management System has been chosen as the process at Pickering N.D. to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers.

  20. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  1. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  2. setsApp for Cytoscape: Set operations for Cytoscape Nodes and Edges [v2; ref status: indexed, http://f1000r.es/5lz

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2015-08-01

    Full Text Available setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. Automatic set partitioning and layout functions are also provided. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
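    The operations setsApp exposes are ordinary set algebra; as a minimal sketch, the same operations on plain Python sets of hypothetical node identifiers (this illustrates the semantics only, not setsApp's command interface):

```python
# The set algebra that setsApp exposes on Cytoscape nodes and edges,
# shown on plain Python sets of hypothetical node identifiers.
genes_up = {"TP53", "BRCA1", "EGFR"}
genes_bound = {"EGFR", "MYC", "TP53"}

print(genes_up | genes_bound)   # union
print(genes_up & genes_bound)   # intersection
print(genes_up - genes_bound)   # difference
```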

  3. Wilmar Planning Tool, user guide

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

    This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell various scenario variables and control parameters are set, and export of model data from the input database, activation of the models, as well as import of model results to the output database are triggered from the shell. (au)

  4. Wilmar Planning Tool, user guide

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

    This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell various scenario variables and control parameters are set, and export of model data from the input database, activation of the models, as well as import of model results to the output database are triggered from the shell. (au)

  5. Setting Goals for Achievement in Physical Education Settings

    Science.gov (United States)

    Baghurst, Timothy; Tapps, Tyler; Kensinger, Weston

    2015-01-01

    Goal setting has been shown to improve student performance, motivation, and task completion in academic settings. Although goal setting is utilized by many education professionals to help students set realistic and proper goals, physical educators may not be using goal setting effectively. Without incorporating all three types of goals and…

  6. Validation of the Australian Midwifery Standards Assessment Tool (AMSAT): A tool to assess midwifery competence.

    Science.gov (United States)

    Sweet, Linda; Bazargan, Maryam; McKellar, Lois; Gray, Joanne; Henderson, Amanda

    2018-02-01

    There is no current validated clinical assessment tool to measure the attainment of midwifery student competence in the midwifery practice setting. The lack of a valid assessment tool has led to a proliferation of tools and inconsistency in assessment of, and feedback on, student learning. This research aimed to develop and validate a tool to assess competence of midwifery students in practice-based settings. A mixed-methods approach was used and the study implemented in two phases. Phase one involved the development of the AMSAT tool with qualitative feedback from midwifery academics, midwife assessors of students, and midwifery students. In phase two the newly developed AMSAT tool was piloted across a range of midwifery practice settings, and ANOVA was used to compare scores across year levels, with feedback being obtained from assessors. Analysis of 150 AMSAT forms indicates the AMSAT is: reliable (Cronbach alpha greater than 0.9); valid (data extraction loaded predominantly onto one factor); and sensitive (scores indicating level of proficiency increased across the three years). Feedback evaluation forms (n=83) suggest acceptance of this tool for the purpose of both assessing and providing feedback on midwifery students' practice performance and competence. The AMSAT is a valid, reliable and acceptable midwifery assessment tool that enables consistent assessment of midwifery student competence. This assists benchmarking across midwifery education programs. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  7. Social Set Visualizer

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Hussain, Abid; Vatrapu, Ravi

    2015-01-01

    This paper presents a state-of-the-art visual analytics dashboard, Social Set Visualizer (SoSeVi), of approximately 90 million Facebook actions from 11 different companies that have been mentioned in the traditional media in relation to garment factory accidents in Bangladesh. The enterprise ... cutting-edge open source visual analytics libraries from D3.js and creation of new visualizations (actor mobility across time, conversational comets etc). Evaluation of the dashboard, consisting of technical testing, usability testing, and domain-specific testing with CSR students, yielded positive results.

  8. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels

    2015-01-01

    This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA) ...

  9. Markov set-chains

    CERN Document Server

    Hartfiel, Darald J

    1998-01-01

    In this study extending classical Markov chain theory to handle fluctuating transition matrices, the author develops a theory of Markov set-chains and provides numerous examples showing how that theory can be applied. Chapters are concluded with a discussion of related research. Readers who can benefit from this monograph are those interested in, or involved with, systems whose data is imprecise or that fluctuate with time. A background equivalent to a course in linear algebra and one in probability theory should be sufficient.

  10. Studies of prehistoric flint tools by PIXE

    International Nuclear Information System (INIS)

    Smit, Z.

    2002-01-01

    The trace elements preserved on the sharp edges of stone tools may provide some information about the worked material, which in turn may serve for the reconstruction of the user's way of life. Since the amount of the deposited worked material is minute, it can only be detected by sensitive fluorescence techniques, induced by electrons in electron microscopes or by light ions from particle accelerators (PIXE). The trace element deposition was studied by PIXE for a set of experimental tools used for working bone and wood, and for a set of archaeological artefacts dating from the late Paleolithic to the Neolithic period. (author)

  11. Tools and data services registry

    DEFF Research Database (Denmark)

    Ison, Jon; Rapacki, Kristoffer; Ménager, Hervé

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a spectrum of scientific disciplines. The corpus of documentation of these resources is fragmented across the Web, with much redundancy, and has lacked a common standard of information. The outcome is that scientists must often struggle to find, understand, compare and use the best resources for the task…

  12. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that use model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
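
    The core idea, checking a temporal-logic property against behaviours of the modelled document, can be illustrated with a toy safety check. The sketch below is an invented Python mini-example (not the FormaLex language or its tool chain): it checks the LTL-style property G !(obliged & forbidden) over a finite trace, flagging a clause that both obliges and forbids the same action.

```python
# Toy illustration of LTL-style coherence checking over finite traces.
# The propositions and the trace are invented; FormaLex itself uses an
# LTL-based language together with an off-the-shelf model checker.

def globally_coherent(trace):
    """Check G !(obliged_report & forbidden_report) over a finite trace,
    where each step is the set of atomic propositions holding there."""
    return all(not ({"obliged_report", "forbidden_report"} <= state)
               for state in trace)

# A trace in which one clause obliges an action another clause forbids.
trace = [
    {"contract_active"},
    {"contract_active", "obliged_report"},
    {"contract_active", "obliged_report", "forbidden_report"},
]
print(globally_coherent(trace))   # False -> a normative incoherence exists
```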

  13. PhysarumSoft: An update based on rough set theory

    Science.gov (United States)

    Schumann, Andrew; Pancerz, Krzysztof

    2017-07-01

    PhysarumSoft is a software tool consisting of two modules developed for programming Physarum machines and simulating Physarum games, respectively. The paper briefly discusses what has been added since the last version, released in 2015. New elements in both modules are based on rough set theory. Rough sets are used to model the behaviour of Physarum machines and to describe strategy games.
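
    As a toy illustration of the rough set machinery the update builds on (and not of PhysarumSoft's actual API), the following Python sketch computes the lower and upper approximations of a target set from the equivalence classes induced by an attribute; the universe, attribute and target set are invented for the example.

```python
# Rough set approximations over a toy universe (invented example data).

def partition_by(universe, attribute):
    """Group objects into equivalence classes by an attribute function."""
    classes = {}
    for obj in universe:
        classes.setdefault(attribute(obj), set()).add(obj)
    return list(classes.values())

def lower_approximation(target, classes):
    # Union of the equivalence classes fully contained in the target set.
    return set().union(*(c for c in classes if c <= target))

def upper_approximation(target, classes):
    # Union of the equivalence classes that intersect the target set.
    return set().union(*(c for c in classes if c & target))

universe = {1, 2, 3, 4, 5, 6}
classes = partition_by(universe, lambda x: x % 3)   # {1,4}, {2,5}, {3,6}
target = {1, 2, 4}
print(lower_approximation(target, classes))   # {1, 4}
print(upper_approximation(target, classes))   # {1, 2, 4, 5}
```

    The gap between the two approximations (here {2, 5}) is the boundary region, which is what makes the description "rough".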

  14. Set-Up and Punchline as Figure and Ground

    DEFF Research Database (Denmark)

    Keisalo, Marianna Päivikki

    …the two that cannot be resolved by appeal to either set-up or punchline, but traps thought between them in an 'epistemological problem', as comedian Louis CK put it. For comedians, set-ups and punchlines are basic tools, practical and concrete ways to create and organize material. They are also familiar…

  15. Development of the status of W and T for the realization of a long-term safety demonstration for the final repository using the examples VSG and Konrad. Report on the Working package 2. Review and development of safety-related assessments of disposal facilities of wastes with negligible heat generation; development and provision of the necessary set of tools using the example of the final repository Konrad

    International Nuclear Information System (INIS)

    Larue, Juergen; Fischer-Appelt, Klaus; Hartwig-Thurat, Eva

    2015-09-01

    In the research project on the "Review and development of safety-related assessments of disposal facilities with negligible heat generation; development and provision of the necessary set of tools, using the example of the Konrad disposal facility" (3612R03410), the state of the art in science and technology of the safety-related assessments and sets of tools for building a safety case was examined. The reports pertaining to the two work packages described the further development of the methodology for accident analyses (WP 1) and of building a safety case (WP 2); also, comparisons were drawn on a national and international scale with the methods applied in the licensing procedure of the Konrad disposal facility. A safety case as well as its underlying analyses and methods always has to be brought up to date with the development of the state of the art in science and technology. In Germany, two safety cases regarding the long-term safety of disposal facilities have been prepared. These are the licensing documentation for the Konrad disposal facility in the year 1990 and the research project regarding the preliminary safety case for the Gorleben site (Vorlaeufige Sicherheitsanalyse Gorleben - VSG) in the year 2013, both reflecting the state of development of building a safety case at the respective time. Comparing the two above-mentioned examples of safety cases and taking recent international recommendations and national regulations into account, this report on Work Package 2 presents the development of the international state of the art in science and technology. This has been done by summarising the essential differences and similarities of each element of the safety case for the Konrad disposal facility on the one hand and the VSG and the international status on the other hand.

  16. Analysis of successive data sets

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Breeuwer, Marcel; Haselhoff, Eltjo Hans

    2008-01-01

    The invention relates to the analysis of successive data sets. A local intensity variation is formed from such successive data sets, that is, from data values in successive data sets at corresponding positions in each of the data sets. A region of interest is localized in the individual data sets on

  17. Analysis of successive data sets

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Breeuwer, Marcel; Haselhoff, Eltjo Hans

    2002-01-01

    The invention relates to the analysis of successive data sets. A local intensity variation is formed from such successive data sets, that is, from data values in successive data sets at corresponding positions in each of the data sets. A region of interest is localized in the individual data sets on

  18. Goal setting: an integral component of effective diabetes care.

    Science.gov (United States)

    Miller, Carla K; Bauman, Jennifer

    2014-08-01

    Goal setting is a widely used behavior change tool in diabetes education and training. Prior research found that specific, relatively difficult but attainable goals set within a specific timeframe improved performance in sports and at the workplace. However, the impact of goal setting in diabetes self-care has not received extensive attention. This review examined the mechanisms underlying behavioral change according to goal setting theory and evaluated the impact of goal setting in diabetes intervention studies. Eight studies were identified, which incorporated goal setting as the primary strategy to promote behavioral change in individual, group-based, and primary care settings among patients with type 2 diabetes. Improvements in diabetes-related self-efficacy, dietary intake, physical activity, and A1c were observed in some but not all studies. More systematic research is needed to determine the conditions and behaviors for which goal setting is most effective. Initial recommendations for using goal setting in diabetes patient encounters are offered.

  19. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  20. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made of the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  1. ANALYSIS OF FORMING TREAD WHEEL SETS

    Directory of Open Access Journals (Sweden)

    Igor IVANOV

    2017-09-01

    This paper shows the results of a theoretical study of profile high-speed grinding (PHSG) for forming the tread of wheel sets during repair, instead of turning and mold-milling. Significant disadvantages of those methods are the low adaptability of the tool and the inhomogeneous structure of the wheel material, which lead to understated treatment regimens and difficulties in restoring wheel sets with thermal and mechanical defects. This study carried out modeling and analysis of the emerging cutting forces. The proposed algorithms describe the random occurrence of the components of the cutting forces when restoring the profile of wheel sets with an inhomogeneous material structure. To identify the statistical features of randomly generated structures, fractal dimension and the method of random additions were used. The multifractal spectrum formed is decomposed into monofractals by wavelet transform. The proposed method creates the preconditions for controlling the parameters of the treatment process.
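
    The generic technique behind such a characterization of inhomogeneous structure can be sketched with a box-counting estimate of fractal dimension. The Python example below is illustrative only (synthetic data, not the authors' algorithm, which additionally uses random additions and a wavelet decomposition of the multifractal spectrum):

```python
# Box-counting estimate of the fractal dimension of a binary 2D structure.
import numpy as np

def box_count_dimension(image, sizes=(2, 4, 8, 16, 32)):
    counts = []
    n = image.shape[0]
    for s in sizes:
        m = (n // s) * s                     # crop so boxes tile evenly
        blocks = image[:m, :m].reshape(m // s, s, m // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())   # occupied boxes
    # Slope of log N(s) versus log(1/s) estimates the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(2)
img = rng.random((128, 128)) < 0.3           # toy "inhomogeneous structure"
print(round(box_count_dimension(img), 2))    # close to 2 for dense random fill
```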

  2. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    …are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which, in a systematic way, makes an XML language available as named functions in Scheme. Finally, the Scheme Elucidator is able to integrate SchemeDoc resources as part of an internal documentation resource.

  3. Volume Sculpting Using the Level-Set Method

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2002-01-01

    In this paper, we propose the use of the Level-Set Method as the underlying technology of a volume sculpting system. The main motivation is that this leads to a very generic technique for deformation of volumetric solids. In addition, our method preserves a distance field volume representation. A scaling window is used to adapt the Level-Set Method to local deformations and to allow the user to control the intensity of the tool. Level-Set based tools have been implemented in an interactive sculpting system, and we show sculptures created using the system.
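
    A minimal numerical sketch of the underlying idea is given below, assuming the usual level-set formulation in which the surface is the zero level set of phi and moving it with speed F along the normal solves d(phi)/dt + F*|grad(phi)| = 0. This naive forward-Euler, central-difference discretization in 2D (with an invented grid, tool window and speed) is for illustration only, not the paper's implementation:

```python
import numpy as np

# The sculpture's surface is the zero level set of phi (signed distance).
n = 128
y, x = np.mgrid[0:n, 0:n]
phi = np.hypot(x - n / 2, y - n / 2) - 30.0   # circle of radius 30

def level_set_step(phi, speed, dt=0.5):
    # Naive discretization of d(phi)/dt = -speed * |grad(phi)|.
    gy, gx = np.gradient(phi)
    return phi - dt * speed * np.hypot(gx, gy)

# Localized "tool": deform only inside a window, mimicking the scaling
# window that restricts the deformation to a local region.
speed = np.zeros_like(phi)
speed[48:80, 24:56] = 1.0        # positive speed inflates the surface here

for _ in range(20):
    phi = level_set_step(phi, speed)

print("pixels inside surface:", int((phi < 0).sum()))
```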

  4. Dynamical basis set

    International Nuclear Information System (INIS)

    Blanco, M.; Heller, E.J.

    1985-01-01

    A new Cartesian basis set is defined that is suitable for the representation of molecular vibration-rotation bound states. The Cartesian basis functions are superpositions of semiclassical states generated through the use of classical trajectories that conform to the intrinsic dynamics of the molecule. Although semiclassical input is employed, the method becomes ab initio through the standard matrix diagonalization variational method. Special attention is given to classical-quantum correspondences for angular momentum. In particular, it is shown that the use of semiclassical information preferentially leads to angular momentum eigenstates with magnetic quantum number |M| equal to the total angular momentum J. The present method offers a reliable technique for representing highly excited vibrational-rotational states where perturbation techniques are no longer applicable

  5. Developing Expert Tools for the LHC

    CERN Document Server

    AUTHOR|(CDS)2160780; Timkó, Helga

    2017-10-12

    This thesis describes software tools developed for the automated, precise setting-up of low-level radio frequency (LLRF) loops, which will help expert users to have better control and faster setting-up of the radio-frequency (RF) system of the Large Hadron Collider (LHC). The aim was to completely redesign the software architecture, to add new features, to improve certain algorithms, and to increase automation.

  6. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  7. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  8. Chimera Grid Tools

    Science.gov (United States)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  9. Easy QC 7 tools

    International Nuclear Information System (INIS)

    1981-04-01

    This book explains the methods of the seven QC tools, the mindset for using them, and the effects of applying them. It describes graphs; Pareto diagrams, including how to draw and use them; characteristic (cause-and-effect) diagrams; check sheets, covering the purpose and subject of checking, the goals and types of check sheets, and points on their use; histograms and their application; and stratification, scatter plots, and control charts. It also covers methods for promoting and improving QC and practical cases of using the QC tools.

  10. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  11. Easy QC 7 tools

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-04-15

    This book explains the methods of the seven QC tools, the mindset for using them, and the effects of applying them. It describes graphs; Pareto diagrams, including how to draw and use them; characteristic (cause-and-effect) diagrams; check sheets, covering the purpose and subject of checking, the goals and types of check sheets, and points on their use; histograms and their application; and stratification, scatter plots, and control charts. It also covers methods for promoting and improving QC and practical cases of using the QC tools.

  12. Qlikview Audit Tool (QLIKVIEW) -

    Data.gov (United States)

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  13. Setting research priorities by applying the combined approach matrix.

    Science.gov (United States)

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it can be applied in different settings, giving examples, describing challenges encountered in the process of setting research priorities, and providing recommendations for further work in this field. The construct and design of the CAM are explained, along with the different steps needed, including the planning and organization of a priority-setting exercise and how it can be applied in different settings. The application of the CAM is described using three examples: the first concerns setting research priorities for a global programme, the second describes application at the country level, and the third setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  14. More on neutrosophic soft rough sets and its modification

    Directory of Open Access Journals (Sweden)

    Emad Marei

    2015-12-01

    This paper aims to introduce and discuss a new mathematical tool for dealing with uncertainties, which is a combination of neutrosophic sets, soft sets and rough sets, namely the neutrosophic soft rough set model. Its modification is also introduced. Some of their properties are studied and supported with proved propositions and many counterexamples. Some rough relations are redefined as neutrosophic soft rough relations. Comparisons among the traditional rough model, the suggested neutrosophic soft rough model and its modification, using their properties and accuracy measures, are introduced. Finally, we illustrate that the classical rough set model can be viewed as a special case of the models suggested in this paper.

  15. setsApp: Set operations for Cytoscape Nodes and Edges [v1; ref status: indexed, http://f1000r.es/3ml]

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2014-07-01

    setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
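
    The standard set operations the app exposes map directly onto ordinary set algebra over node (or edge) identifiers; a trivial Python analogy, with gene names invented for the example:

```python
# Two sets of node identifiers, e.g. built from selections or imported lists.
upregulated = {"TP53", "MDM2", "CDKN1A", "BAX"}
in_pathway = {"TP53", "MDM2", "AKT1"}

print(upregulated | in_pathway)   # union
print(upregulated & in_pathway)   # intersection
print(upregulated - in_pathway)   # difference
```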

  16. HDF-EOS Dump Tools

    Science.gov (United States)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed to dump data in the HDF-EOS format, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper (metadmp): extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output; it does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (core, archive, and structural metadata). HDF-EOS Contents Dumper (heosls): displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper (asciidmp): extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human-readable; with minor editing, it can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper (bindmp): dumps HDF-EOS objects in binary format. This is useful for feeding its output into existing programs that do not understand HDF, for example custom software and COTS products. HDF-EOS User Friendly Metadata (UFM): useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display in a web browser. HDF-EOS METCHECK (METCHECK): can be invoked from either a Unix or DOS environment with a set of command line options that a user might use to direct the tool's inputs and output. METCHECK validates the inventory metadata in the .met file using the…
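
    For HDF-EOS5 granules (which are HDF5-based), a rough Python analogue of the metadmp idea can be sketched with h5py, assuming the granule stores its ODL metadata blocks under the conventional "HDFEOS INFORMATION" group (e.g. StructMetadata.0, CoreMetadata.0); the original metadmp targets HDF-EOS granules generally and works differently:

```python
# Dump ODL metadata blocks from an HDF-EOS5 (HDF5-based) granule.
# Assumption: metadata lives under the "HDFEOS INFORMATION" group.
import sys
import h5py

def dump_odl_metadata(path):
    with h5py.File(path, "r") as f:
        info = f.get("HDFEOS INFORMATION")
        if info is None:
            print("no HDFEOS INFORMATION group found", file=sys.stderr)
            return
        for name, dset in info.items():
            block = dset[()]
            if isinstance(block, bytes):
                block = block.decode("utf-8", errors="replace")
            print(f"--- {name} ---")
            print(block)   # complete ODL statements, copied verbatim

if __name__ == "__main__":
    dump_odl_metadata(sys.argv[1])
```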

  17. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization, Detailed Design, Management, Construction, and Commissioning. (orig.) [de

  18. Data Tools and Apps

    Science.gov (United States)

    Our statistics highlight trends in household and business data from multiple surveys (employment and payroll, Survey of Business Owners, work from home). Data tools and apps include American FactFinder, Census Business Builder, and Business Dynamics Statistics, a tool that shows tabulations on establishments, firms, and employment…

  19. Maailma suurim tool [The world's largest chair]

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu Näitused, the Tartu Art School (Tartu Kunstikool) and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in pavilion I of the Tartu fair centre on 9-11 March. 2,000 chairs will be exhibited, from which a TOP 12 will be selected. There are plans to erect the world's largest chair on the grounds of the fair centre. At the same time, pavilion II hosts the twin fairs 'Sisustus 2000' ('Furnishings 2000') and 'Büroo 2000' ('Office 2000').

  20. Design mentoring tool.

    Science.gov (United States)

    2011-01-01

    In 2004, an online design engineer mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers in mentoring new engineers in the INDOT design process and to improve their technical competency. This approach saves se...

  1. Tool Storage Problem Solved!

    Science.gov (United States)

    Klenke, Andrew M.; Dell, Tim W.

    2007-01-01

    Graduates of the automotive technology program at Pittsburg State University (PSU) generally enter the workforce in some type of automotive management role. As a result, the program does not require students to purchase their own tools, and it does not have room for all 280 majors to roll around a personal tool chest. Each instructor must maintain…

  2. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  3. Setting the scene

    International Nuclear Information System (INIS)

    Curran, S.

    1977-01-01

    The reasons for the special meeting on the breeder reactor are outlined with some reference to the special Scottish interest in the topic. Approximately 30% of the electrical energy generated in Scotland is nuclear and the special developments at Dounreay make policy decisions on the future of the commercial breeder reactor urgent. The participants review the major questions arising in arriving at such decisions. In effect an attempt is made to respond to the wish of the Secretary of State for Energy to have informed debate. To set the scene, the importance of energy availability to the strength of the national economy is stressed and the reasons for an increasing energy demand put forward. Examination of alternative sources of energy shows that none is definitely capable of filling the foreseen energy gap. This implies an integrated thermal/breeder reactor programme as the way to close the anticipated gap. The problems of disposal of radioactive waste and the safeguards in the handling of plutonium are outlined. Longer-term benefits, including the consumption of plutonium and naturally occurring radioactive materials, are examined. (author)

  4. Ready, set, move!

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    This year, the CERN Medical Service is launching a new public health campaign. Advertised by the catchphrase “Move! & Eat Better”, the particular aim of the campaign is to encourage people at CERN to take more regular exercise, of whatever kind.   The CERN annual relay race is scheduled on 24 May this year. The CERN Medical Service will officially launch its “Move! & Eat Better” campaign at this popular sporting event. “We shall be on hand on the day of the race to strongly advocate regular physical activity,” explains Rachid Belkheir, one of the Medical Service doctors. "We really want to pitch our campaign and answer any questions people may have. Above all we want to set an example. So we are going to walk the same circuit as the runners to underline to people that they can easily incorporate movement into their daily routine.” An underlying concern has prompted this campaign: during their first few year...

  5. A systematic and practical method for selecting systems engineering tools

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2017-01-01

    The complexity of many types of systems has grown considerably over the last decades. Using appropriate systems engineering tools therefore becomes increasingly important. Starting the tool selection process can be intimidating because organizations often only have a vague idea about what they need… …analyses of the actual needs and the available tools. Grouping needs into categories allows us to obtain a comprehensive set of requirements for the tools. The entire model-based systems engineering discipline was categorized for a modeling tool case to enable development of a tool specification… …in successful operation since 2013 at GN Hearing. We further utilized the method to select a set of tools that we used on pilot cases at GN Hearing for modeling, simulating and formally verifying embedded systems.

  6. Expert tool use

    DEFF Research Database (Denmark)

    Thorndahl, Kathrine Liedtke; Ravn, Susanne

    2017-01-01

    According to some phenomenologists, a tool can be experienced as incorporated when, as a result of habitual use or deliberate practice, someone is able to manipulate it without conscious effort. In this article, we specifically focus on the experience of expert tool use in elite sport. Based on a case study of elite rope skipping, we argue that the phenomenological concept of incorporation does not suffice to adequately describe how expert tool users feel when interacting with their tools. By analyzing a combination of insights gained from participant observation of 11 elite rope skippers and autoethnographic material from one former elite skipper, we take some initial steps toward the development of a more nuanced understanding of the concept of incorporation; one that is able to accommodate the experiences of expert tool users. In sum, our analyses indicate that the possibility for experiencing…

  7. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm's leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language…

  8. Reducing Information Overload in Large Seismic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.; CARR,DORTHE B.; AGUILAR-CHANG,JULIO

    2000-08-02

    Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research…
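
    The flavor of the dendrogram tool's clustering step can be sketched with SciPy in place of MatSeis (synthetic waveforms, not the authors' code): a correlation distance between waveforms feeds a hierarchical linkage, and cutting the tree at a distance threshold yields the event families.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)

# Two "event families": a common waveform per family, plus noise.
base_a = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)
base_b = np.sin(2 * np.pi * 9 * t) * np.exp(-5 * t)
waveforms = np.array(
    [base_a + 0.2 * rng.standard_normal(t.size) for _ in range(5)]
    + [base_b + 0.2 * rng.standard_normal(t.size) for _ in range(5)]
)

# Correlation distance (1 - Pearson r) with complete linkage, one of the
# clustering methods the dendrogram tool exposes.
Z = linkage(waveforms, method="complete", metric="correlation")
labels = fcluster(Z, t=0.5, criterion="distance")
print(labels)   # the two families separate into two clusters
```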

  9. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  10. Simulation tools for detector and instrument design

    DEFF Research Database (Denmark)

    Kanaki, Kalliopi; Kittelmann, Thomas; Cai, Xiao Xiao

    2018-01-01

    The high performance requirements at the European Spallation Source have been driving the technological advances on the neutron detector front. Now more than ever it is important to optimize the design of detectors and instruments to fully exploit the ESS source brilliance. Most of the simulation… …a powerful set of tools to tailor the detector and instrument design to the instrument application.

  11. Set discrimination of quantum states

    International Nuclear Information System (INIS)

    Zhang Shengyu; Ying Mingsheng

    2002-01-01

    We introduce a notion of set discrimination, which is an interesting extension of quantum state discrimination. A state is secretly chosen from a number of quantum states, which are partitioned into some disjoint sets. A set discrimination is required to identify which set the given state belongs to. Several essential problems are addressed in this paper, including the condition of perfect set discrimination, unambiguous set discrimination, and in the latter case, the efficiency of the discrimination. This generalizes some important results on quantum state discrimination in the literature. A combination of state and set discrimination and the efficiency are also studied
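
    For orientation, the simplest primitive that set discrimination generalizes is unambiguous discrimination of two equiprobable pure states, where measurement outcomes never err but may be inconclusive; the optimal success probability is the standard Ivanovic-Dieks-Peres bound (a textbook result quoted for context, not a formula from this paper):

```latex
P_{\mathrm{succ}}^{\mathrm{opt}} \,=\, 1 - \left|\langle \psi_1 \mid \psi_2 \rangle\right|
```

    Set discrimination relaxes the task: the measurement need only identify the subset containing the state, not the state itself.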

  12. Program Development Tools and Infrastructures

    International Nuclear Information System (INIS)

    Schulz, M.

    2012-01-01

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  13. Program Development Tools and Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, M

    2012-03-12

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  14. Setting priorities for ambient air quality objectives

    International Nuclear Information System (INIS)

    2004-10-01

    Alberta has ambient air quality objectives in place for several pollutants, toxic substances and other air quality parameters. A process is in place to determine if additional air quality objectives are required or if existing objectives should be changed. In order to identify the highest priority substances that may require an ambient air quality objective to protect ecosystems and public health, a rigorous, transparent and cost effective priority setting methodology is required. This study reviewed, analyzed and assessed successful priority setting techniques used by other jurisdictions. It proposed an approach for setting ambient air quality objective priorities that integrates the concerns of stakeholders with Alberta Environment requirements. A literature and expert review were used to examine existing priority-setting techniques used by other jurisdictions. An analysis process was developed to identify the strengths and weaknesses of various techniques and their ability to take into account the complete pathway between chemical emissions and damage to human health or the environment. The key strengths and weaknesses of each technique were identified. Based on the analysis, the most promising technique was the tool for the reduction and assessment of chemical and other environmental impacts (TRACI). Several considerations for using TRACI to help set priorities for ambient air quality objectives were also presented. 26 refs, 8 tabs., 4 appendices

  15. Automatic Generation of Validated Specific Epitope Sets

    Directory of Open Access Journals (Sweden)

    Sebastian Carrasco Pro

    2015-01-01

    Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and assists in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to elucidate a method to select validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to give users the capacity to generate customized epitope sets.

  16. Special set linear algebra and special set fuzzy linear algebra

    OpenAIRE

    Kandasamy, W. B. Vasantha; Smarandache, Florentin; Ilanthenral, K.

    2009-01-01

    In this book the authors introduce the notions of special set linear algebra and special set fuzzy linear algebra, which extend the notions of set linear algebra and set fuzzy linear algebra. These concepts are best suited to applications in multi-expert models and cryptology. This book has five chapters. In chapter one the basic concepts of set linear algebra are given in order to make this book self-contained. The notion of special set linear algebra and its fuzzy analog…

  17. CSS Preprocessing: Tools and Automation Techniques

    Directory of Open Access Journals (Sweden)

    Ricardo Queirós

    2018-01-01

    Cascading Style Sheets (CSS) is a W3C specification for a style sheet language used for describing the presentation of a document written in a markup language; more precisely, for styling Web documents. However, in the last few years the landscape of CSS development has changed dramatically with the appearance of several languages and tools aiming to help developers build clean, modular and performance-aware CSS. These new approaches give developers mechanisms to preprocess CSS rules through the use of programming constructs, known as CSS preprocessors, with the ultimate goal of bringing those missing constructs to the CSS realm and fostering structured programming of stylesheets. At the same time, a new set of tools appeared, known as postprocessors, for extension and automation purposes, covering a broad set of features ranging from identifying unused and duplicate code to applying vendor prefixes. With all these tools and techniques at hand, developers need a consistent workflow to foster modular CSS coding. This paper presents an introductory survey of CSS processors. The survey gathers information on a specific set of processors, categorizes them and compares their features with regard to a set of predefined criteria such as maturity, coverage and performance. Finally, we propose a basic set of best practices for setting up a simple and pragmatic styling code workflow.

  18. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines, from generators to motors: the motor as a power source of the machine tool, and electrical equipment for machine tools such as switches in the main circuit, automatic devices, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, including the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions according to the diagnosis and the diagnostic method with voltage and resistance measurements using a tester.

  19. Machine tool evaluation

    International Nuclear Information System (INIS)

    Lunsford, B.E.

    1976-01-01

    Continued improvement in numerical control (NC) units and in the mechanical components used in the construction of today's machine tools necessitates the use of more precise instrumentation to calibrate and determine the capabilities of these systems. It is now necessary to calibrate most tape-controlled lathes to a tool-path positioning accuracy of ±300 microinches over the full slide travel, and on some special turning and boring machines a capability of ±100 microinches must be achieved. The use of a laser interferometer to determine tool-path capabilities is described

  20. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  1. FACET CLASSIFICATIONS OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2013-12-01

    The article deals with the classification of e-learning tools based on the facet method, which suggests separating a parallel set of objects into independent classification groups; no rigid classification structure or pre-built finite groups are assumed, and classification groups are formed by combining values taken from the relevant facets. An attempt to systematize the existing classifications of e-learning tools from the standpoint of classification theory is made for the first time. Modern Ukrainian and foreign facet classifications of e-learning tools are described, and their positive and negative features compared to classifications based on a hierarchical method are analyzed. The author's original facet classification of e-learning tools is proposed.

  2. Measuring social exclusion in healthcare settings: a scoping review.

    Science.gov (United States)

    O'Donnell, Patrick; O'Donovan, Diarmuid; Elmusharaf, Khalifa

    2018-02-02

    Social exclusion is a concept that has been widely debated in recent years; a particular focus of the discussion has been its significance in relation to health. The meanings of the phrase "social exclusion", and the closely associated term "social inclusion", are contested in the literature. Both of these concepts are important in relation to health and the area of primary healthcare in particular. Thus, several tools for the measurement of social exclusion or social inclusion status in health care settings have been developed. A scoping review of the peer-reviewed and grey literature was conducted to examine tools developed since 2000 that measure social exclusion or social inclusion. We focused on those measurement tools developed for use with individual patients in healthcare settings. Efforts were made to obtain a copy of each of the original tools, and all relevant background literature. All tools retrieved were compared in tables, and the specific domains that were included in each measure were tabulated. Twenty-two measurement tools were included in the final scoping review. The majority of these had been specifically developed for the measurement of social inclusion or social exclusion, but a small number were created for the measurement of other closely aligned concepts. The majority of the tools included were constructed for engaging with patients in mental health settings. The tools varied greatly in their design, the scoring systems and the ways they were administered. The domains covered by these tools varied widely and some of the tools were quite narrow in the areas of focus. A review of the definitions of both social inclusion and social exclusion also revealed the variations among the explanations of these complex concepts. There are several definitions of both social inclusion and social exclusion in use and they differ greatly in scope. While there are many tools that have been developed for measuring these concepts in healthcare settings, these

  3. A CLASSIC FRAMEWORK OF ONLINE MARKETING TOOLS

    Directory of Open Access Journals (Sweden)

    Popa Adela Laura

    2015-07-01

    The present paper starts from the observation that there is a tendency, especially among practitioners, to largely conflate the concepts of online marketing and online advertising, treating most online marketing tools as aimed at value communication and promotion. This observation prompted us to delineate the categories of online marketing tools according to the traditional areas of marketing activity. The paper therefore presents the online marketing tools from a different perspective than the literature identified so far: the tools are grouped by the key components of marketing activity, and for each group certain software tools that support it are presented. This way of analyzing online marketing tools is new and could be useful for defining a structured vision of the field. The paper aims both to analyze concepts specific to online marketing and, especially, to delineate categories of online marketing tools based on the key areas of marketing, such as value creation, value delivery, value communication/promotion, customer relationship management and marketing research. To achieve the goal set for this paper, we considered it useful to address the issue from a dual perspective: that of the academic literature (books and studies found in scientific databases) dealing with the topic of online marketing and online marketing tools, and that of practitioners (studies posted on the Internet by specialists in the field, and the analysis of websites of companies providing online marketing services). The intention was to combine the theorists' vision with that of practitioners in tackling the field of online marketing and online marketing tools. To synthesize the information presented in this paper, we also produced a visual representation of the categories of online…

  4. Gaussian process regression for tool wear prediction

    Science.gov (United States)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

    To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurately monitoring the in-process tool wear parameters (flank wear width) in real time. KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by the GPR model. GPR also performs better than artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively in the GPR model. However, the existence of noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, so that the confidence interval is greatly compressed and smoothed, which is conducive to monitoring the tool wear accurately. Moreover, the selection of the kernel parameter in KPCA_IRBF can easily be carried out in a much larger selectable region than with the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests are conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
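
    A minimal sketch of the GPR stage with its predictive confidence interval, using scikit-learn on synthetic features (the KPCA_IRBF feature fusion step is not reproduced here; the features, targets and kernel settings are invented):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(60, 3))          # fused cutting-signal features
wear = 0.02 * X[:, 0] ** 2 + 0.1 * X[:, 1] + 0.05 * rng.standard_normal(60)

# WhiteKernel lets the model estimate the Gaussian noise level explicitly,
# which is what gives GPR its quantitative noise handling.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, wear)

X_new = rng.uniform(0, 10, size=(5, 3))
mean, std = gpr.predict(X_new, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std   # ~95% interval
print(np.c_[mean, lower, upper])
```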

  5. Studying the Complex Expression Dependences between Sets of Coexpressed Genes

    Directory of Open Access Journals (Sweden)

    Mario Huerta

    2014-01-01

    Full Text Available Organisms simplify the orchestration of gene expression by coregulating genes whose products function together in the cell. The use of clustering methods to obtain sets of coexpressed genes from expression arrays is very common; nevertheless there are no appropriate tools to study the expression networks among these sets of coexpressed genes. The aim of the developed tools is to allow studying the complex expression dependences that exist between sets of coexpressed genes. For this purpose, we start detecting the nonlinear expression relationships between pairs of genes, plus the coexpressed genes. Next, we form networks among sets of coexpressed genes that maintain nonlinear expression dependences between all of them. The expression relationship between the sets of coexpressed genes is defined by the expression relationship between the skeletons of these sets, where this skeleton represents the coexpressed genes with a well-defined nonlinear expression relationship with the skeleton of the other sets. As a result, we can study the nonlinear expression relationships between a target gene and other sets of coexpressed genes, or start the study from the skeleton of the sets, to study the complex relationships of activation and deactivation between the sets of coexpressed genes that carry out the different cellular processes present in the expression experiments.
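
    A minimal sketch of the pipeline described above, assuming Python with NumPy/SciPy and an expression matrix of genes by conditions. Spearman's rank correlation stands in for the paper's nonlinear dependence measure, hierarchical clustering for its coexpression clustering, and a median cross-set dependence for the skeleton-to-skeleton relationship.

        import numpy as np
        from scipy.stats import spearmanr
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(1)
        expr = rng.normal(size=(30, 20))   # hypothetical: 30 genes x 20 conditions

        # Rank correlation captures monotone (possibly nonlinear) pairwise dependences
        rho, _ = spearmanr(expr, axis=1)   # 30 x 30 gene-gene matrix
        dist = 1.0 - np.abs(rho)           # strong dependence -> small distance

        # Group genes into coexpressed sets
        condensed = dist[np.triu_indices(30, k=1)]
        labels = fcluster(linkage(condensed, method="average"), t=4, criterion="maxclust")

        # Link two sets when their genes' median cross-dependence is high,
        # a crude stand-in for the skeleton-based relationship in the paper
        for a in range(1, 5):
            for b in range(a + 1, 5):
                block = np.abs(rho[np.ix_(labels == a, labels == b)])
                if block.size and np.median(block) > 0.3:
                    print(f"sets {a} and {b} appear expression-related")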

  6. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  7. Tool Inventory and Replacement

    Science.gov (United States)

    Bear, W. Forrest

    1976-01-01

    Vocational agriculture teachers are encouraged to evaluate curriculum offerings and new trends in business and industry, and to develop a master tool purchase and replacement plan spanning a 3- to 5-year period. (HD)

  8. ATO Resource Tool -

    Data.gov (United States)

    Department of Transportation — Cru-X/ART is a shift management tool designed for use by operational employees in Air Traffic Facilities. Cru-X/ART is used for shift scheduling, shift sign in/out,...

  9. Water Budget Tool

    Science.gov (United States)

    If you're designing a new landscape or rethinking your current one, the WaterSense Water Budget Tool will tell you whether your design will use an appropriate amount of water for your climate.
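
    The arithmetic behind such a check can be illustrated with a generic monthly water budget; this is a hedged sketch of the general approach, not WaterSense's exact method. The only hard constant is that one inch of water over one square foot is about 0.623 gallons; the plant adjustment factor and all inputs below are assumptions.

        # Generic landscape water-budget check (illustrative; not WaterSense's method).
        def monthly_water_budget_gal(area_sqft: float, eto_inches: float,
                                     plant_factor: float = 0.7) -> float:
            """Allowable monthly irrigation, in gallons, for the given area."""
            # 0.623 converts inches of water over one square foot to gallons
            return area_sqft * eto_inches * plant_factor * 0.623

        design_use = 5_000.0   # hypothetical estimated use of the design (gal/month)
        budget = monthly_water_budget_gal(area_sqft=2_000, eto_inches=6.0)
        print("within budget" if design_use <= budget else "over budget")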

  10. Neighborhood Mapping Tool

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants in preparing data to submit with their grant applications by allowing applicants to draw the exact...

  11. Sequence History Update Tool

    Science.gov (United States)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a database, which is seamlessly formatted into a dynamic Web page via PHP. The tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, it also saves considerable time and effort. With the Sequence History Update Tool, what previously took minutes now takes less than 30 seconds, and the result is a more accurate archival record of the sequence commanding for MRO.
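
    An illustrative sketch of that workflow in Python (not the actual MRO software): scan sequence products that follow a naming convention, extract a statistic from each, and load the results into a database that a Web page could render. The file pattern, record format, and schema are all assumptions.

        import glob
        import re
        import sqlite3

        conn = sqlite3.connect("seq_history.db")
        conn.execute("CREATE TABLE IF NOT EXISTS seq_stats (seq_id TEXT, n_commands INTEGER)")

        for path in glob.glob("sequences/mro_seq_*.txt"):   # hypothetical naming convention
            seq_id = re.search(r"mro_seq_(\w+)\.txt$", path).group(1)
            with open(path) as f:
                n_commands = sum(1 for line in f if line.startswith("CMD"))  # assumed format
            conn.execute("INSERT INTO seq_stats VALUES (?, ?)", (seq_id, n_commands))

        conn.commit()
        conn.close()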

  12. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total fuel imported to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life cycle costs, while improving overall system reliability, security, and resilience. One specific goal is to investigate whether a combination of energy efficiency and high-contribution (renewable energy) power systems can meet that 50% reduction target. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is drawn from the respective tool websites, the tool developers, and author experience.

  13. Financing Alternatives Comparison Tool

    Science.gov (United States)

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It produces a comprehensive analysis that compares various financing options.
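
    The kind of comparison such a tool automates can be sketched with the standard level-payment (annuity) formula; this is a generic illustration, not FACT's actual methodology, and the rates and terms below are made up.

        # Compare hypothetical financing options by annual payment and total cost.
        def annual_payment(principal: float, rate: float, years: int) -> float:
            """Level payment that amortizes `principal` at `rate` over `years`."""
            return principal * rate / (1 - (1 + rate) ** -years)

        options = {"state revolving fund loan": (0.02, 20),   # hypothetical terms
                   "municipal bond": (0.04, 30)}
        for name, (rate, years) in options.items():
            pay = annual_payment(5_000_000, rate, years)
            print(f"{name}: ${pay:,.0f}/yr, ${pay * years:,.0f} total")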

  14. Personal Wellness Tools

    Science.gov (United States)

    With this tool, you can track key health trends related to the following: overall mood, mood disorder ...

  15. Cash Reconciliation Tool

    Data.gov (United States)

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create an...

  16. Chemical Data Access Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  17. Tools and their uses

    CERN Document Server

    1973-01-01

    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  18. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  19. Learning Design Tools

    NARCIS (Netherlands)

    Griffiths, David; Blat, Josep; Garcia, Rocío; Vogten, Hubert; Kwong, KL

    2005-01-01

    Griffiths, D., Blat, J., Garcia, R., Vogten, H. & Kwong, KL. (2005). Learning Design Tools. In: Koper, R. & Tattersall, C., Learning Design: A Handbook on Modelling and Delivering Networked Education and Training (pp. 109-136). Berlin-Heidelberg: Springer Verlag.

  20. Clean Energy Finance Tool

    Science.gov (United States)

    State and local governments interested in developing a financing program can use this Excel tool to support energy efficiency and clean energy improvements for large numbers of buildings within their jurisdiction.